


System could be useful for mission-critical applications, such as combat robotics

Professor Peter Bentley of University College London and his colleague Christos Sakellariou aren't impressed with most everyday computers, which aren't very fault tolerant and can only multitask by rapidly switching their cores between the sequential instruction streams of running programs.

As he explains in an interview with New Scientist: "Even when it feels like your computer is running all your software at the same time, it is just pretending to do that, flicking its attention very quickly between each program. Nature isn't like that. Its processes are distributed, decentralised and probabilistic. And they are fault tolerant, able to heal themselves. A computer should be able to do that."

So the pair set out to build new hardware and a new operating system capable of handling tasks differently from most current machines, which, even when nominally "parallel", still deal with instructions sequentially.

The new machine pairs each set of instructions with the data it acts on.  These instruction-data pairs are then handed to multiple "systems", chosen at random, to produce results.  Each system holds its own redundant copy of the instructions, so if one copy is corrupted, other systems can finish the work.  And each system has its own memory and storage, so a memory or storage error in one system cannot crash the machine as a whole.

Comments Prof. Bentley, "The pool of systems interact in parallel, and randomly, and the result of a computation simply emerges from those interactions."
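
To make that execution model concrete, here is a minimal Python sketch of the idea as described above. The class and function names, and the toy squaring task, are invented for illustration; this is an assumption-laden approximation, not the researchers' actual implementation.

    import random

    class System:
        # One independent "system": it carries its own redundant copy of
        # the instruction-data pairs and its own private memory, so a
        # fault in one system cannot corrupt the others.
        def __init__(self, pairs):
            self.pairs = list(pairs)   # redundant local copy
            self.memory = {}           # private memory/storage

        def step(self):
            # Execute one randomly chosen instruction-data pair.
            op, data = random.choice(self.pairs)
            key, value = op(data)
            self.memory[key] = value
            return key, value

    def run(pool, steps):
        # No fixed schedule: at each step one system is picked at random,
        # and the overall result simply emerges from the accumulated,
        # redundant interactions.
        results = {}
        for _ in range(steps):
            system = random.choice(pool)
            try:
                key, value = system.step()
                results[key] = value
            except Exception:
                # A corrupted system is skipped; the redundant copies held
                # by the other systems can still finish the work.
                pass
        return results

    # Toy task: square the numbers 0..4. Every system holds the same
    # pairs, so any surviving subset can complete the computation.
    pairs = [(lambda d: (d, d * d), n) for n in range(5)]
    pool = [System(pairs) for _ in range(3)]
    print(run(pool, 50))   # e.g. {0: 0, 1: 1, 2: 4, 3: 9, 4: 16}

Because every system carries the full instruction set, removing any one System from the pool changes nothing about the final result, which is the fault tolerance the researchers describe.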

The results will be presented at an April conference in Singapore. 

The team is currently working on coding the machine so that it can reprogram its own instructions in response to changes in its environment.  That self-learning, combined with the redundant, pseudorandom nature of the system, would make it quite a bit more similar to a human brain than a traditional computer.
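
Building on the sketch above, a self-repair step might look something like the following. The adapt function and its replace-the-failing-pair rule are purely hypothetical, a guess at one possible mutation scheme rather than anything the team has described.

    def adapt(system, failed_key):
        # Hypothetical repair rule (an assumption, not the team's method):
        # replace any pair whose data keeps causing failures with a copy
        # of a randomly chosen healthy pair, letting redundancy absorb
        # the damage.
        healthy = [p for p in system.pairs if p[1] != failed_key]
        if healthy:
            system.pairs = [p if p[1] != failed_key else random.choice(healthy)
                            for p in system.pairs]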

Potential applications for such a system include military robotics, swarm robotics, and mission-critical servers.  For example, if an unmanned aerial vehicle sustained damage or was hacked, it might be able to reprogram itself and work around the resulting errors thanks to that redundancy, allowing it to fly home.

The computer is somewhat similar to so-called "probabilistic" chip designs, which are being researched at other universities.

Source: New Scientist



Comments



Not rocket science
By Beenthere on 2/16/2013 2:11:27 AM, Rating: 2
Current-day computers are a poor excuse for what they could be, primarily due to apathy. The proof is in the totally defective Windoze O/S's that have been forced on consumers thru illegal means. How could you ever expect a computer, which is logic based, to be reliable when it's forced to run a defective O/S that has millions of documented defects? Making the hardware more reliable is also possible, if hardware makers actually cared and did proper validation.

In reality, PC sales, like all electronics, are about making money, not about delivering reliable computers. The only reason current computers are unreliable is because consumers will buy this crap. Why waste time and money making a proper computer when you can sell crap for windfall profits? Sure, better microcode can improve performance and it should be the basis for all computers, but don't expect to see it any time soon for commercial use as it's less profitable than selling crapware.

This ain't rocket science and has been known for 30+ years.




"And boy have we patented it!" -- Steve Jobs, Macworld 2007













