Suppose you want your CPU to total your savings of $13,000.81, spread across several accounts. You could add every digit of every account precisely and get your exact balance, but at a significant computing cost. Alternatively, you could use weighted probabilistic calculations that are faster but less accurate. Obtaining $50,000.81 would be a very undesirable result, but obtaining $13,000.57 would be "close enough" in most cases. Applying greater weight -- and greater accuracy -- to the tens digit, the hundreds digit, and so on thus yields a good calculation. Such is the nature of probabilistic hardware computing.
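The digit-weighting idea can be sketched in software (a toy illustration only, not Professor Palem's actual hardware design): allow the low-order bits of a sum to flip with some probability while keeping the high-order bits exact, so the error stays bounded to a few cents. The function name and parameters here are hypothetical.

```python
import random

def noisy_add(a, b, flip_prob=0.2, noisy_bits=4):
    """Toy model of probabilistic addition: each of the lowest
    `noisy_bits` bits of the result may flip with probability
    `flip_prob`, while the more significant bits are always exact,
    mirroring the idea of weighting the important digits."""
    result = a + b
    for bit in range(noisy_bits):
        if random.random() < flip_prob:
            result ^= (1 << bit)  # flip one low-order bit
    return result

random.seed(0)
exact = 650000 + 650081          # $13,000.81 in cents
approx = noisy_add(650000, 650081)
# The high-order bits are untouched, so the error is bounded by
# 2**noisy_bits - 1 = 15 cents.
assert abs(approx - exact) < 2 ** 4
assert (approx >> 4) == (exact >> 4)
```

In real probabilistic hardware the savings come from running the low-order logic at lower voltage rather than from software bit flips, but the accuracy tradeoff is the same in spirit.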
With traditional computing being pushed to its limits, Moore's Law may soon expire. CPU manufacturers are preparing to launch 32 nm circuits late this year or early next year, and 22 nm parts are also in the works. Past about 10 nm, however, traditional light-etching techniques begin to fail. Some say this calls for ditching the traditional CPU entirely and adopting a fundamentally new design, such as optical or quantum computing. These options, however, would be expensive and risky. The alternative is to rework the wheel -- to build silicon computers that do the job better with the same number of transistors.
The idea of probabilistic computing was born of such a mindset and has floated around for a while. It has been largely pioneered and developed by Rice University Professor Krishna Palem. On Sunday, Professor Palem announced the results of his first chip, and they are nothing short of groundbreaking.
His probabilistic CPU chip ran seven times as fast as traditional circuits and consumed only 1/30th of the energy. The results match or even exceed those predicted by his mathematical models and computer simulations. He states, "The results were far greater than we expected. At first, I almost couldn't believe them. I spent several sleepless nights verifying the results."
Professor Palem's chips could revolutionize fields such as computer-generated graphics and content streaming, along with other applications that would accept a tradeoff: less precision in exchange for greater computing speed and lower power consumption. Many experts in these fields who had previously been skeptical of probabilistic computing have been convinced by Professor Palem's results.
Al Barr, a computer scientist at the California Institute of Technology, is among the new believers. He acknowledges former doubts, stating, "Initially there was definitely a lot of skepticism."
However, Barr and his colleagues are now planning to test new graphics software using Professor Palem's design or other upcoming probabilistic chips. Such designs might allow next-generation cell phones and laptops to run for days longer on a charge while running significantly faster. Some artifacting might occur, but it would amount to only a few missed pixels -- the overall image would remain intact. The human brain's image-processing abilities can fill in most of the missing information, says Professor Palem. As he puts it, "In effect, we are putting a little more burden on the CPU in our heads and a little less burden on the CPU in our pockets."
Intel and other CPU makers have expressed interest in probabilistic designs, given the lack of a clear long-term die-shrink solution. Professor Palem's results have them very excited. Shekhar Borkar, director of Intel's Microprocessor Technology Lab, lauds the work, stating, "This logic will prove extremely important, because basic physics dictates that future transistor-based logic will need probabilistic methods."