Suppose you want to use your CPU to calculate your savings, which total $13,000.81 spread across several accounts. You could precisely add every digit of every balance and get your exact total, but at a significant computing cost. Alternatively, you could use weighted probabilistic calculations that are faster, but less accurate. While obtaining $50,000.81 would be a very undesirable result, obtaining $13,000.57 would be "close enough" in most cases. Applying greater weight to the tens digit, the hundreds digit, and so on would thus yield a good calculation -- such is the nature of probabilistic hardware computing.
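As a rough illustration of the idea -- a toy software model, not Professor Palem's actual hardware design -- the following Python sketch adds two amounts exactly and then injects random error only into the low-order digits, mimicking a chip that spends its energy budget protecting the most significant digits:

```python
import random

def probabilistic_add(a, b, protected=4, flip_prob=0.2, seed=None):
    """Toy model of digit-weighted probabilistic addition.

    The `protected` most significant digits of the sum stay exact;
    each lower-order digit may be replaced by a random digit with
    probability `flip_prob`. All names here are illustrative.
    """
    rng = random.Random(seed)
    exact = a + b
    digits = list(str(exact))
    # Perturb only digits beyond the protected high-order positions.
    for i in range(protected, len(digits)):
        if rng.random() < flip_prob:
            digits[i] = rng.choice("0123456789")
    return int("".join(digits))
```

For example, adding $13,000.00 and $0.81 (in cents) with the top three digits protected can never be off by more than a few tens of dollars, no matter how often the low digits flip -- the error is bounded by the place value of the first unprotected digit.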
With the limits of traditional computing being pushed to the brink, Moore's Law may soon expire. CPU manufacturers are preparing to launch 32 nm circuits late this year or early next year, and 22 nm parts are also in the works. Past about 10 nm, however, traditional light-etching techniques begin to fail. Some say this calls for ditching the traditional CPU entirely and adopting an entirely new design, such as optical computing or quantum computing. These options, however, would be expensive and risky. The alternative is to repurpose the wheel -- build silicon computers that do the job better with the same number of transistors.
The idea of probabilistic computing was born out of just such a mindset and has floated around for a while. It has been largely pioneered and developed by Rice University Professor Krishna Palem. On Sunday, Professor Palem announced the results from his first chip, and they're nothing short of groundbreaking.
His probabilistic CPU chip ran seven times as fast as traditional circuits and consumed only 1/30th of the energy. The results match or even exceed those predicted by his mathematical models and computer simulations. He states, "The results were far greater than we expected. At first, I almost couldn’t believe them. I spent several sleepless nights verifying the results."
Professor Palem's chips could revolutionize fields such as computer-generated graphics, content streaming, and other applications that would accept a tradeoff of precision for increases in computing speed and decreases in power. Many experts in these fields who had previously been skeptical of probabilistic computing have been convinced by Professor Palem's results.
Al Barr, a computer scientist at the California Institute of Technology, is among the new believers. He acknowledges former doubts, stating, "Initially there was definitely a lot of skepticism."
However, Barr and his colleagues are now planning to test new graphics software using Professor Palem's design or other upcoming probabilistic chips. Such designs might allow next generation cell phones and laptops to run for days more on a charge, while running significantly faster. While some artifacting might occur, it would only be a few missed pixels -- the overall image would remain. The human brain's imaging abilities can fill in most of this missing information, says Professor Palem. As he puts it, "In effect, we are putting a little more burden on the CPU in our heads and a little less burden on the CPU in our pockets."
Intel and other CPU makers have expressed interest in probabilistic designs, for lack of a clear die shrink solution in the long term. Professor Palem's results have them very excited. Shekhar Borkar, director of Intel’s Microprocessor Technology Lab lauds the work, stating, "This logic will prove extremely important, because basic physics dictates that future transistor-based logic will need probabilistic methods."
quote: So... What good is it then? "Here you go, this CPU is generally close enough to the right answer in its calculations. Use it for something!"
quote: Can you hear the difference between 0x11011001 and 0x11011010 on a cheap cell phone speaker? probably not.
quote: So now, what can you use a device for that you KNOW will give wrong answers? games? nope,
quote: Even many scientific calculations would benefit from a huge increase in speed for the tradeoff of some precision. Stop being so alarmist and narrow-minded.
quote: NEVER once had I ever had either the incredible vision or the hand eye coordination to aim at a SINGLE PIXEL on the screen
quote: this CPU would inherently bring an unknown quantity of error into all computed calculations
quote: It is always better to do something slow and accurate, than fast and imprecise in the science world
quote: Well, how can you know the amount of error the CPU will bring? Will it not be different with every calculation,
quote: You forget that the error of the CPU will compound on itself. If it is off by a fraction of a degree on one calculation, takes that result and calculates from it again, it'll have an increased error on the next calculation
quote: Add in a processor that is increasing variance with every calculation, and it doesn't matter how much faster it is
quote: I'm afraid I will just have to disagree with you completely on this matter
quote: You forget that the error of the CPU will compound on itself. If it is off by a fraction of a degree on one calculation, takes that result and calculates from it again, it'll have an increased error on the next calculation based on the error from the first result being reused.
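The compounding-error worry raised in the quotes above can be checked with a toy simulation (a hypothetical software sketch, not a model of any real probabilistic chip). If the injected error is unbiased, the errors partly cancel: the drift behaves like a random walk and grows roughly with the square root of the step count, not linearly.

```python
import random

def iterate_with_noise(x0, steps, rel_err=1e-3, seed=0):
    """Apply the toy update f(x) = 1.01 * x repeatedly, once exactly and
    once with a small unbiased random relative error injected at every
    step, and return both trajectories' final values for comparison."""
    rng = random.Random(seed)
    exact = noisy = x0
    for _ in range(steps):
        exact *= 1.01
        noisy *= 1.01 * (1 + rng.uniform(-rel_err, rel_err))
    return exact, noisy
```

Running this for a thousand steps with a 0.1% per-step error typically leaves the noisy result within a few percent of the exact one. The quote's concern is real for *biased* errors, which do accumulate linearly; whether a given workload can tolerate the drift is exactly the precision-for-speed tradeoff the article describes.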
quote: Ah, I wouldn't go that far. Most of our models don't suck because we oversimplify them
quote: You're implying that we wouldn't have the realtime processing capability for the missile, (as if we can't already control them?)
quote: games? nope, Scientific calculations? HA HA no
quote: 1 or 2 pixels not noticeable? You ever have a monitor that has 1 or 2 pixels dead on it? Yes, you do notice it
quote: but antialiasing makes things more precise.
quote: My statement that computers are very exact machines stands. Given the same input they ALWAYS give the same output,
quote: Yes, they only take a floating point accurately out to about 20 places... But come on, you're saying that basically "Well, they aren't exact anyways, so let's make them even less exact!"
quote: Ok, you got me, that could be useful in the case of higher precision floating points. However, it's going to require a lot of changes. We are still dragging our feet with 32bit programs
quote: So now, what can you use a device for that you KNOW will give wrong answers? ...Scientific calculations? HA HA no.
quote: There hasn't been a scientific calculation done yet that yielded precisely the correct answer. Ever. There is always a certain degree of error.
quote: Professor Palem's chips could revolutionize fields such as computer-generated graphics, content streaming
quote: While some artifacting might occur, it would only be a few missed pixels -- the overall image would remain.
quote: Did you read the article? These chips aren't going to naturally incorporate AA/AF. They instead make missing pixels.
quote: quite the contrary; banks keep their numbers in BCD number representation to avoid the difference when converting from decimal to binary.
quote: i seriously doubt this computing hardware is the way forward, i mean look at us - mankind has been using computers not solely for the purpose of being very fast, its also the infallible accuracy it provides.
quote: Not only did the probabilistic processors produce nothing but five pages consisting largely of the letter S, the lead male began by bashing the keyboard with a stone, and the probabilistic processors continued by urinating and defecating on it.
quote: Please do not be alarmed," it said, "by anything you see or hear around you. You are bound to feel some initial ill effects as you have been rescued from certain death at an improbability level of two to the power two hundred and seventy-six thousand to one against – possibly much higher. We are now cruising at a level of two to the power of twenty-five thousand to one against and falling, and we will be restoring normality just as soon as we are sure of what is normal anyway.