Gordon Moore's prediction of doubling transistor counts every 2 years revolutionized the computer industry and his company, Intel.  (Source: New York Times)

An NVIDIA VP is declaring Moore's Law dead and GPUs the only hope for the industry.  (Source: TechCrunch)
In NVIDIA's eyes, the parallelism of the GPU is the only future for computing

NVIDIA has struggled this time around in the GPU war.  Its first DirectX 11 products were delivered a full seven months after AMD's.  While its new units are at last trickling onto the market and are very powerful, they're also hot, loud, and power-hungry.  However, NVIDIA is staking much on the prediction that the computer industry will ditch traditional architectures and move toward parallel designs, a movement for which it sees its CUDA GPU computing platform as an ideal solution.

Intel and NVIDIA have long traded jabs, and Intel's recent failed GPU bid, Larrabee, has done little to thaw the ice.  In a recent op-ed entitled "Life After Moore's Law", published in Forbes, NVIDIA VP Bill Dally attacks the very foundation of Intel's business -- Moore's Law -- declaring it dead.

Moore's Law stemmed from a paper [PDF] published by Gordon Moore 45 years ago this month.  Moore, co-founder of Intel, predicted in the paper that the number of transistors per area on a circuit would double every 2 years (a figure later revised to 18 months).  This prediction was later extended to say that computing power would roughly double every 18 months, a prediction that became known as Moore's Law.
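To put the arithmetic of that extension in concrete terms, here is a minimal, purely illustrative sketch (the baseline and time spans are hypothetical, not taken from Moore's paper) of how an 18-month doubling period compounds:

    # Illustrative sketch of the "doubling every 18 months" arithmetic.
    # The baseline and time spans are hypothetical, chosen only to show the compounding.
    def projected_performance(base, years, doubling_period_years=1.5):
        """Relative performance after `years`, assuming a doubling every `doubling_period_years`."""
        return base * 2 ** (years / doubling_period_years)

    if __name__ == "__main__":
        base = 1.0  # normalized starting performance
        for years in (1.5, 3.0, 10.0, 45.0):
            print(f"After {years:4.1f} years: {projected_performance(base, years):,.0f}x")

Ten years at that rate works out to roughly a hundredfold gain; it is this historic scaling that Dally argues has now stalled.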

Now with die shrinks becoming more problematic, NVIDIA is convinced the end is nigh for Moore's Law (and Intel).  Writes Dally:

Moore's paper also contained another prediction that has received far less attention over the years. He projected that the amount of energy consumed by each unit of computing would decrease as the number of transistors increased. This enabled computing performance to scale up while the electrical power consumed remained constant. This power scaling, in addition to transistor scaling, is needed to scale CPU performance.
But in a development that's been largely overlooked, this power scaling has ended. And as a result, the CPU scaling predicted by Moore's Law is now dead. CPU performance no longer doubles every 18 months. And that poses a grave threat to the many industries that rely on the historic growth in computing performance.

Dally says that the only near-term hope for the computer industry now that Moore's Law is "over" is parallel computing -- splitting workloads up among a variety of processors.  However, he derides multi-core efforts by AMD and Intel, stating, "Building a parallel computer by connecting two to 12 conventional CPUs optimized for serial performance, an approach often called multi-core, will not work. This approach is analogous to trying to build an airplane by putting wings on a train. Conventional serial CPUs are simply too heavy (consume too much energy per instruction) to fly on parallel programs and to continue historic scaling of performance."

He concludes, "Let's enable the future of computing to fly--not rumble along on trains with wings."

In other words, he hopes you will buy NVIDIA GPUs and join the "Moore's Law is dead" party.
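For readers unfamiliar with the approach Dally is pushing, "splitting workloads up among a variety of processors" boils down to dividing one big job into independent pieces that run at the same time.  The sketch below is a minimal, hypothetical illustration of that idea using ordinary CPU processes in Python; it is not CUDA and not anything from Dally's op-ed, just the general data-parallel pattern:

    # Minimal data-parallel sketch: split one large workload across several worker processes.
    # The workload (a sum of squares) and the chunking scheme are hypothetical examples.
    from multiprocessing import Pool

    def partial_sum(chunk):
        # Each worker handles its own slice of the data independently.
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        data = list(range(1_000_000))
        workers = 4
        step = len(data) // workers  # divides evenly here; real code would handle any remainder
        chunks = [data[i * step:(i + 1) * step] for i in range(workers)]

        with Pool(workers) as pool:
            total = sum(pool.map(partial_sum, chunks))  # combine the per-worker results
        print(total)

The speedup comes from the pieces being independent.  Dally's argument is that GPUs, with many small, power-efficient cores, are built for exactly this pattern, while a handful of large cores optimized for serial performance are not.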



Comments

RE: Awfully strong words...
By zmatt on 5/3/2010 2:28:16 PM , Rating: 1
+2 billion

Nvidia has no room to be criticizing anyone on architectures. Their GPUs are big and hot, and their programming environment is thrown together. And given the serious differences in how GPUs are designed and how they work, they could never replace CPUs. CPUs may be less efficient at floating point calculations, but they are far more versatile; in other words, they're ideal for the job of central processing unit because they can do everything. GPUs have a strict memory hierarchy and a very 1 dimensional (see what I did there) skill set. Some things run very, very fast on a GPU, but most don't.

Not to mention that graphics companies are marketing driven and not engineering driven; their claimed performance is never indicative of real world numbers, and they will run you in circles to get real answers. That's something the HPC world is not very receptive to.


RE: Awfully strong words...
By Inkjammer on 5/3/2010 3:15:47 PM , Rating: 2
Honestly, I think that Nvidia's 200 series was fantastic, and it held up against ATI well. Nvidia needed to refine the 200 series and eke out a smaller design and more performance. They should have done that and focused on getting Fermi RIGHT, not just getting it out the door. Most graphics card sales are at the $200 level. If Nvidia could have refined the low end with cooler, faster 200 series cards, things would have been better all around.

DX11 is not important (yet) and Nvidia should have pushed the performance levels of the GTX 260 as mainstream (screw the 250). The price was right, the performance was right, and the GTX 260 could have/should have been the new 8800 GT while ATI did the same. Get it down to a $150 level, cooler, faster. Promote it as the baseline for gaming performance.

I see that as the failure of both ATI and Nvidia, personally. No standard performance expectations, and both companies keep pushing out crappy derivative cards that don't meet a set minimum performance standard, which in turn leads to the "omfg, gaming computers cost $2,500!" stereotype that hurts PC gaming so damn much through the perception that people have to keep up. It's always a race for the performance crown... while the baseline suffers. And in this instance, they suffered losses at the top and the bottom.


RE: Awfully strong words...
By zmatt on 5/3/2010 4:49:05 PM , Rating: 2
It would be impossible for them to shrink GT200 or Fermi. TSMC is having enough trouble as it is with current gen fab processes, and I doubt they can start cutting weight off the die to slim it down. They went with a large monolithic GPU and now they have to live with it. I think ATI saw the writing on the wall a long time ago and began to move towards a better way to make GPUs; compared to Nvidia's, their chips are smaller and cooler for similar performance and lower prices. Not to say that ATI's chips are perfect, just better.

Last gen Nvidia was hurt but not beaten; this time around they have been schooled.


RE: Awfully strong words...
By Inkjammer on 5/3/2010 3:17:22 PM , Rating: 3
And for the record, I'm not pro ATI. I love Nvidia, but they eff'd up so damn badly this round.


"So if you want to save the planet, feel free to drive your Hummer. Just avoid the drive thru line at McDonalds." -- Michael Asher














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki