Gordon Moore's prediction of doubling transistor counts every 2 years revolutionized the computer industry and his company, Intel.  (Source: New York Times)

An NVIDIA VP is declaring Moore's Law dead and GPUs the only hope for the industry.  (Source: TechCrunch)
In NVIDIA's eyes, the parallelism of the GPU is the only future for computing

NVIDIA has struggled this time around in the GPU war.  Its first DirectX 11 products arrived a full seven months after AMD's.  While its new units are at last trickling onto the market and are very powerful, they are also hot, loud power hogs.  However, NVIDIA is staking much on the prediction that the computer industry will ditch traditional architectures and move towards parallel designs, a shift for which it sees its CUDA GPU computing platform as an ideal solution.

Intel and NVIDIA have long traded jabs, and Intel's recent failed GPU bid, Larrabee, has done little to thaw the ice.  In a recent op-ed entitled "Life After Moore's Law", published in Forbes, NVIDIA VP Bill Dally attacks the very foundation of Intel's business -- Moore's Law -- declaring it dead.

Moore's Law stemmed from a paper [PDF] published by Gordon Moore 45 years ago this month.  Moore, co-founder of Intel, predicted in the paper that the number of transistors per unit area on an integrated circuit would double every two years (a period later revised to 18 months).  That prediction was later extended to say that computing power would roughly double every 18 months, and it is this form that became known as Moore's Law.
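
To put the arithmetic in perspective, here is a minimal sketch of how much difference the doubling period makes when compounded over decades. The 2,300-transistor, 1971 starting point (Intel's 4004) is an assumption chosen for illustration; neither Moore's paper nor Dally's op-ed cites these figures.

    # Illustrative only: project transistor counts under a fixed doubling period.
    # Starting point (2,300 transistors in 1971) is an assumed example, not a
    # figure from the article.
    def projected_transistors(start_count, start_year, target_year, doubling_years):
        """Transistor count implied by doubling every `doubling_years` years."""
        periods = (target_year - start_year) / doubling_years
        return start_count * 2 ** periods

    print(f"{projected_transistors(2300, 1971, 2010, 2.0):,.0f}")  # ~1.7 billion
    print(f"{projected_transistors(2300, 1971, 2010, 1.5):,.0f}")  # ~154 billion

Compounding is the whole story: shaving six months off the doubling period changes the 39-year projection by roughly two orders of magnitude.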

Now with die shrinks becoming more problematic, NVIDIA is convinced the end is nigh for Moore's Law (and Intel).  Writes Dally:

Moore's paper also contained another prediction that has received far less attention over the years. He projected that the amount of energy consumed by each unit of computing would decrease as the number of transistors increased. This enabled computing performance to scale up while the electrical power consumed remained constant. This power scaling, in addition to transistor scaling, is needed to scale CPU performance.
But in a development that's been largely overlooked, this power scaling has ended. And as a result, the CPU scaling predicted by Moore's Law is now dead. CPU performance no longer doubles every 18 months. And that poses a grave threat to the many industries that rely on the historic growth in computing performance.
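
The power scaling Dally describes is what engineers usually call Dennard scaling: under classic constant-field scaling, shrinking a transistor's dimensions by a factor k also lets voltage and capacitance drop by roughly k, so chips could pack in more, faster transistors without raising power density. The sketch below is a back-of-the-envelope restatement of that textbook relationship, using normalized, illustrative values rather than anything from the op-ed.

    # Back-of-the-envelope Dennard (constant-field) scaling for one process step.
    # All quantities are normalized to the previous generation; values are illustrative.
    k = 1.4  # assumed linear shrink factor (~0.7x feature size per generation)

    density = k ** 2                 # ~2x more transistors in the same area
    capacitance = 1 / k              # smaller devices, less capacitance each
    voltage = 1 / k                  # supply voltage scales down with the device
    frequency = k                    # gates switch faster

    # Dynamic power per device ~ C * V^2 * f; power density = per-device power * density
    power_density = (capacitance * voltage ** 2 * frequency) * density
    print(round(power_density, 2))   # ~1.0: constant power density, rising performance

    # If voltage can no longer scale (voltage = 1), the same arithmetic gives a
    # power density of ~k^2 per generation -- the breakdown Dally is pointing to.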

Dally says that the only near-term hope for the computer industry now that Moore's Law is "over" is parallel computing -- splitting workloads up among a variety of processors.  However, he derides multi-core efforts by AMD and Intel, stating, "Building a parallel computer by connecting two to 12 conventional CPUs optimized for serial performance, an approach often called multi-core, will not work. This approach is analogous to trying to build an airplane by putting wings on a train. Conventional serial CPUs are simply too heavy (consume too much energy per instruction) to fly on parallel programs and to continue historic scaling of performance."
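
As a loose illustration of what "splitting workloads up among a variety of processors" looks like in code, here is a minimal CPU-side sketch that farms a data-parallel task out to a pool of worker processes. It only demonstrates the programming idea; it is not NVIDIA's CUDA model, and a GPU would run many thousands of such work items concurrently rather than a handful.

    # Minimal data-parallel sketch: the same independent operation applied to
    # many elements, split across a pool of worker processes.
    from multiprocessing import Pool

    def work_item(x):
        # Each element is independent of all the others, which is exactly the
        # property that massively parallel hardware exploits.
        return x * x

    if __name__ == "__main__":
        data = range(1_000_000)
        with Pool() as pool:                       # one worker per CPU core by default
            results = pool.map(work_item, data, chunksize=10_000)
        print(results[:5])  # [0, 1, 4, 9, 16]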

He concludes, "Let's enable the future of computing to fly--not rumble along on trains with wings."

In other words, he hopes you will buy NVIDIA GPUs and join the "Moore's Law is dead" party.



Comments

Performance Prophecy
By lotharamious on 5/3/2010 11:08:23 AM , Rating: 0
I'm not sure about serial performance doubling every 18 months to 2 years, but I know that transistor densities are still climbing with each new manufacturing process. Intel is currently at 25nm with NAND? I think the real problems begin below 13nm, IMHO.

The power situation in mobile devices will keep getting better as we scale transistors down to use less power. Honestly, I'm not sure how the desktop fits in anymore. It's obviously the workhorse, but I'm convinced that more work needs to be done on the computer science end of computing in order to extract more "performance", or at least our perception of performance.

I believe that Microsoft has the right idea now. The only way to improve performance in the near term is to cache EVERYTHING to memory, which will eventually include a search index. It's all about the memory hierarchy. It's about the only way we're going to continue to see performance improvements as time goes on and serial processing performance finds its asymptote. Nvidia's right: smashing more and more discrete serial cores together isn't going to solve the computation problem.

This pretty much means it's time to find a new computing paradigm. If you think about it, software is years behind hardware in terms of functionality. Why does everyone say that software gets slower even though hardware gets faster? The software paradigm must be at fault. Forget SSDs. They do help performance in an incredible way, but I'm talking about something greater. I'm amazed any piece of software still has to be "configured" for it to work properly anyway.

The wave of the future will be just like in science fiction... human-computer interaction should be like interacting with another person, with voice and touch. And we're on the way there; just ask Apple or Microsoft. But the fact that computers still use keyboards, and especially mice, as input devices is beyond asinine. It's not a logical way to interact with any being. The sooner people start to think of their computer as a "being" and less as a "device", the sooner we should really start to see computers doing amazing things.

The problem, of course, is that we still don't know enough about how software can and should work, which is why performance is even an issue anymore. Hint: buy software stock. The time of lucrative hardware is over. Apple has shown us that it is all about the software, and if you think what we have today is impressive, just wait 20 years. Applications will be incredible in functionality and should be extremely intuitive.

And though this prophecy sounds a little out there and scary, it will be a smooth and incredibly natural transition into the future; the marketing people will guarantee that. Remember how people were scared shitless about the idea of the government putting chips in everyone's heads... hate to say it, but they didn't even have to. People did it to themselves... it's called the "cell phone".

Like horsepower in a car, computing horsepower will become less relevant (and it already is to some extent on the desktop) as software begins to respond more naturally to user input.




"If you look at the last five years, if you look at what major innovations have occurred in computing technology, every single one of them came from AMD. Not a single innovation came from Intel." -- AMD CEO Hector Ruiz in 2007














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki