

Gordon Moore's prediction of doubling transistor counts every 2 years revolutionized the computer industry and his company, Intel.  (Source: New York Times)

An NVIDIA VP is declaring Moore's Law dead and GPUs the only hope for the industry.  (Source: TechCrunch)
In NVIDIA's eye the parallelism of the GPU is the only future for computing

NVIDIA has struggled this time around in the GPU war.  Its first DirectX 11 products were delivered a full seven months after AMD's.  While its new units are at last trickling onto the market and are very powerful, they are also hot, loud, and power-hungry.  However, NVIDIA is staking much on the prediction that the computer industry will ditch traditional serial architectures and move toward parallel designs, a shift for which it sees its CUDA GPU-computing platform as the ideal solution.

Intel and NVIDIA have long traded jabs, and Intel's recently canceled GPU effort, Larrabee, did little to thaw the ice.  In a recent op-ed entitled "Life After Moore's Law", published in Forbes, NVIDIA VP Bill Dally attacks the very foundation of Intel's business -- Moore's Law -- declaring it dead.

Moore's Law stemmed from a paper [PDF] published by Gordon Moore 45 years ago this month.  Moore, co-founder of Intel, predicted in the paper that the number of transistors per area on a circuit would double every year, a rate he later revised to every two years.  This prediction was later extended to the claim that computing power would roughly double every 18 months, which became popularly known as Moore's Law.
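At its core, the doubling prediction is simple exponential growth. A back-of-the-envelope sketch (the baseline numbers here are illustrative, not from Moore's paper):

```python
# Moore's Law as exponential growth: the count doubles once per
# doubling period, so after `years` it has doubled years/period times.
def projected_transistors(base_count, years, doubling_period=2):
    """Projected transistor count after `years` of steady doubling."""
    return base_count * 2 ** (years / doubling_period)

# A hypothetical chip with 64 components, projected 10 years out
# at a two-year doubling period: five doublings, a 32x increase.
print(projected_transistors(64, 10))  # 2048.0
```

The same function with an 18-month period shows why the popular performance formulation compounds so much faster over a decade.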

Now with die shrinks becoming more problematic, NVIDIA is convinced the end is nigh for Moore's Law (and Intel).  Writes Dally:

Moore's paper also contained another prediction that has received far less attention over the years. He projected that the amount of energy consumed by each unit of computing would decrease as the number of transistors increased. This enabled computing performance to scale up while the electrical power consumed remained constant. This power scaling, in addition to transistor scaling, is needed to scale CPU performance.
But in a development that's been largely overlooked, this power scaling has ended. And as a result, the CPU scaling predicted by Moore's Law is now dead. CPU performance no longer doubles every 18 months. And that poses a grave threat to the many industries that rely on the historic growth in computing performance.
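The power scaling Dally refers to is commonly known as Dennard scaling. A rough arithmetic sketch (my own illustration, not from the op-ed) of why it historically kept chip power flat: dynamic power per transistor goes roughly as capacitance times voltage squared times frequency, and ideal scaling shrinks the first two while raising the third.

```python
# Dennard scaling sketch: shrink linear dimensions by a factor s.
# Per-transistor capacitance and voltage scale by 1/s, clock
# frequency scales by s, and s^2 more transistors fit per area.
def power_density_ratio(s):
    cap, volt, freq = 1 / s, 1 / s, s       # classic scaling factors
    per_transistor = cap * volt**2 * freq   # ~ 1 / s^2
    transistors_per_area = s**2
    return per_transistor * transistors_per_area  # the terms cancel

print(power_density_ratio(1.4))  # ~1.0: power density stays constant
```

The breakdown Dally describes comes from voltage no longer scaling with feature size (leakage sets a floor), so the 1/s^2 per-transistor saving disappears while the s^2 density gain remains.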

Dally says that the only near-term hope for the computer industry now that Moore's Law is "over" is parallel computing -- splitting workloads up among a variety of processors.  However, he derides multi-core efforts by AMD and Intel, stating, "Building a parallel computer by connecting two to 12 conventional CPUs optimized for serial performance, an approach often called multi-core, will not work. This approach is analogous to trying to build an airplane by putting wings on a train. Conventional serial CPUs are simply too heavy (consume too much energy per instruction) to fly on parallel programs and to continue historic scaling of performance."
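One classic way to quantify the limit Dally is gesturing at (though he does not cite it) is Amdahl's Law: if only a fraction p of a workload parallelizes, the serial remainder caps the speedup no matter how many cores you add, which is why a handful of big serial-optimized cores cannot restore historic scaling on its own.

```python
# Amdahl's Law: speedup on n processors when a fraction p of the
# work is parallelizable and the rest (1 - p) stays serial.
def amdahl_speedup(p, n):
    return 1 / ((1 - p) + p / n)

# With 95% parallel work, 2 cores, 12 cores, and a GPU-scale core
# count all sit well below the 1/(1-p) = 20x ceiling.
for n in (2, 12, 512):
    print(n, round(amdahl_speedup(0.95, n), 1))
```

Note this cuts both ways: it bounds GPUs too, which is the gap in Dally's argument that commenters below pick up on.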

He concludes, "Let's enable the future of computing to fly--not rumble along on trains with wings."

In other words, he hopes you will buy NVIDIA GPUs and join the "Moore's Law is dead" party.



Interesting....
By Talon75 on 5/3/2010 10:46:16 AM , Rating: 4
That is a rather bold statement. While it is quite possible that he has a good point, going after Intel and AMD on this takes some brass ones...

RE: Interesting....
By spread on 5/3/2010 10:50:00 AM , Rating: 5
Don't worry. Nvidia's going to open up another can of whoop ass.

RE: Interesting....
By kattanna on 5/3/2010 11:07:31 AM , Rating: 5
but doesnt opening said can of whoop ass on yourself defeat the point?

RE: Interesting....
By gamerk2 on 5/3/2010 11:27:02 AM , Rating: 5
To be fair, the conclusion is probably correct. CPUs are serial-process oriented, and are not designed to handle parallel workloads.

What NVIDIA failed to mention is that for heavily serial tasks, GPUs also fall far short of the mark, as most of the available computing resources end up going to waste. [Hence rasterization is done on GPUs: each pixel's computations are fully independent from the rest, so a massively parallel structure makes far more sense].

I see a general trend toward more specialized chips in the near future; we're already seeing the trend toward a separate physics processor (which will be massively parallel once multiple-object interactions become the norm, but remains VERY process heavy, making GPUs not the best option for computation...).
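The per-pixel independence the commenter describes is what makes rasterization "embarrassingly parallel": shading is just an independent map over pixels. A toy sketch (hypothetical shader, with a CPU process pool standing in for a GPU's thousands of threads):

```python
# Each pixel's color depends only on its own coordinates, so the
# whole framebuffer is an order-preserving map with no inter-pixel
# dependencies -- exactly the shape of work GPUs are built for.
from multiprocessing import Pool

WIDTH, HEIGHT = 64, 64

def shade(pixel):
    """Toy per-pixel shader: a simple two-axis color gradient."""
    x, y = pixel
    return (x * 255 // (WIDTH - 1), y * 255 // (HEIGHT - 1), 128)

if __name__ == "__main__":
    pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    with Pool() as pool:
        framebuffer = pool.map(shade, pixels)  # pixels computed in parallel
    print(len(framebuffer))  # 4096
```

A serial task with a dependency between steps (each iteration needing the previous result) could not be split this way, which is the commenter's point about GPUs falling short on serial work.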

RE: Interesting....
By Aenslead on 5/3/10, Rating: -1
RE: Interesting....
By adiposity on 5/3/2010 12:28:34 PM , Rating: 2
Didn't it used to be nVidia? I guess that's just their logo.

RE: Interesting....
By jonmcc33 on 5/3/2010 12:33:03 PM , Rating: 2
It was always nVIDIA. Just look at the dies on their GPU. Lowercase "n" and the rest is capitalized.

RE: Interesting....
By adiposity on 5/3/2010 3:41:27 PM , Rating: 3
I wouldn't say "always" since it's now apparently NVIDIA (just check their website, copyrights, etc.)

RE: Interesting....
By oab on 5/3/2010 12:28:49 PM , Rating: 3
It's nVidia you insensitive clod!

RE: Interesting....
By oab on 5/3/2010 12:29:41 PM , Rating: 2
Yes, I know it is no longer nVidia, because they changed it. Just like NEXT/NeXt/etc.

RE: Interesting....
By deeznuts on 5/3/2010 7:14:35 PM , Rating: 2
You can always tell an nvidian posting by the way they write NVIDIA.

Are nvidians related to the ballchinians?

RE: Interesting....
By zmatt on 5/3/2010 8:04:40 PM , Rating: 3
I prefer the term Nvidiot.

RE: Interesting....
By MrPickins on 5/3/2010 1:37:30 PM , Rating: 2
I see a general trend toward more specialized chips in the near future; we're already seeing the trend toward a separate physics processor (which will be massively parallel once multiple-object interactions become the norm, but remains VERY process heavy, making GPUs not the best option for computation...).

It makes me wonder if we'll see more chips like the Cell in the future.

RE: Interesting....
By bbomb on 5/3/2010 4:36:06 PM , Rating: 2
LOL That one deserves a six lmao.

RE: Interesting....
By talonvor on 5/5/2010 9:39:20 PM , Rating: 2
You know, 10 years from now, when the first quantum CPUs hit the market, it won't matter anymore. A quantum CPU the size of a dime would outperform any mainframe on the planet. Problem solved!

"If you look at the last five years, if you look at what major innovations have occurred in computing technology, every single one of them came from AMD. Not a single innovation came from Intel." -- AMD CEO Hector Ruiz in 2007

Copyright 2016 DailyTech LLC.