Gordon Moore's prediction of doubling transistor counts every 2 years revolutionized the computer industry and his company, Intel.  (Source: New York Times)

An NVIDIA VP is declaring Moore's Law dead and GPUs the only hope for the industry.  (Source: TechCrunch)
In NVIDIA's eyes, the parallelism of the GPU is the only future for computing

NVIDIA has struggled this time around in the GPU war.  Its first DirectX 11 products were delivered a full seven months after AMD's.  And while its new units are at last trickling onto the market and are very powerful, they're also hot, loud, and power-hungry.  Nonetheless, NVIDIA is staking much on the prediction that the computer industry will ditch traditional architectures and move toward parallel designs -- a movement for which it sees its CUDA GPU-computing platform as an ideal solution.

Intel and NVIDIA have long traded jabs, and Intel's recent failed GPU bid, Larrabee, has done little to thaw the ice.  In a recent op-ed entitled "Life After Moore's Law," published in Forbes, NVIDIA VP Bill Dally attacks the very foundation of Intel's business -- Moore's Law -- declaring it dead.

Moore's Law stemmed from a paper [PDF] published by Gordon Moore 45 years ago this month.  Moore, co-founder of Intel, predicted in the paper that the number of transistors per unit area on an integrated circuit would double every two years (a figure later revised to 18 months).  That prediction was later extended to the broader claim that computing power itself would roughly double every 18 months, and it is this version that became known as Moore's Law.
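
To put rough numbers on the claim (an illustration added here, not taken from Moore's paper or the op-ed): if transistor counts double every T months, then after t months the count has grown to

N(t) = N_0 \cdot 2^{t/T}

With T = 18 months, a decade of scaling (t = 120) implies a factor of 2^{120/18} ≈ 100; with the original two-year figure, T = 24, it is 2^{5} = 32.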

Now with die shrinks becoming more problematic, NVIDIA is convinced the end is nigh for Moore's Law (and Intel).  Writes Dally:

Moore's paper also contained another prediction that has received far less attention over the years. He projected that the amount of energy consumed by each unit of computing would decrease as the number of transistors increased. This enabled computing performance to scale up while the electrical power consumed remained constant. This power scaling, in addition to transistor scaling, is needed to scale CPU performance.
But in a development that's been largely overlooked, this power scaling has ended. And as a result, the CPU scaling predicted by Moore's Law is now dead. CPU performance no longer doubles every 18 months. And that poses a grave threat to the many industries that rely on the historic growth in computing performance.
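
The power scaling Dally describes is usually attributed to Robert Dennard's 1974 scaling rules rather than to Moore's paper alone; a compressed version of the reasoning, added here for context and not part of the op-ed, goes like this.  A chip's dynamic power is roughly

P \approx C \, V^2 \, f

(capacitance times supply voltage squared times clock frequency).  Under classic Dennard scaling, shrinking every linear dimension by a factor k reduces capacitance and voltage enough that power density stays constant even as transistor count and frequency rise.  Once supply voltage stopped scaling down (roughly the mid-2000s, as leakage currents grew), further frequency gains came at proportionally higher power -- the break Dally is pointing to.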

Dally says that the only near-term hope for the computer industry now that Moore's Law is "over" is parallel computing -- splitting workloads up among a variety of processors.  However, he derides multi-core efforts by AMD and Intel, stating, "Building a parallel computer by connecting two to 12 conventional CPUs optimized for serial performance, an approach often called multi-core, will not work. This approach is analogous to trying to build an airplane by putting wings on a train. Conventional serial CPUs are simply too heavy (consume too much energy per instruction) to fly on parallel programs and to continue historic scaling of performance."
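
To make the idea concrete for readers who haven't seen GPU code, here is a minimal sketch of the kind of data parallelism Dally is advocating (an illustrative example written for this article, not code from NVIDIA or the op-ed).  Where a conventional CPU would walk a loop over a million elements one iteration at a time, a CUDA kernel assigns one lightweight thread per element and launches thousands of them at once:

// Minimal CUDA sketch of data parallelism (illustrative example only).
// A CPU would compute y[i] = a*x[i] + y[i] in a serial loop;
// the GPU instead launches one lightweight thread per element.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // global thread index
    if (i < n)
        y[i] = a * x[i] + y[i];                      // each thread handles one element
}

int main() {
    const int n = 1 << 20;                           // ~1 million elements
    const size_t bytes = n * sizeof(float);

    // Host-side data
    float *h_x = new float[n];
    float *h_y = new float[n];
    for (int i = 0; i < n; ++i) { h_x[i] = 1.0f; h_y[i] = 2.0f; }

    // Device-side copies
    float *d_x, *d_y;
    cudaMalloc((void **)&d_x, bytes);
    cudaMalloc((void **)&d_y, bytes);
    cudaMemcpy(d_x, h_x, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_y, h_y, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover every element
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 2.0f, d_x, d_y);
    cudaMemcpy(h_y, d_y, bytes, cudaMemcpyDeviceToHost);  // blocks until the kernel finishes

    printf("y[0] = %f\n", h_y[0]);                   // expect 4.0 (2*1 + 2)

    cudaFree(d_x); cudaFree(d_y);
    delete[] h_x; delete[] h_y;
    return 0;
}

The argument Dally makes in the quote above is that each of those GPU threads is far simpler -- and therefore spends far less energy per instruction -- than a heavyweight CPU core optimized for serial speed.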

He concludes, "Let's enable the future of computing to fly--not rumble along on trains with wings."

In other words, he hopes you will buy NVIDIA GPUs and join the "Moore's Law is dead" party.



Comments



By aftlizard on 5/3/2010 12:16:29 PM , Rating: 4
I don't see a shift where mobile computing will exceed PCs in importance for enterprise and production, and definitely not for gaming. The shift will be more about workflow: you can work on documents and slides on your mobile device and switch on the fly to your PC for completion. There will be more mobile devices for certain, and that sector will continue to grow, but I doubt it will pass the importance of PCs.

I have many mobile devices, and nothing, not even my laptop, can really replace the comfort of using my home PC with its large monitor and processing capability.


By Micronite on 5/3/2010 1:44:28 PM , Rating: 2
I definitely agree...

When you boil it all down, the major draw of mobile devices is the ability to be on the Web.
Maybe I'm wrong, but I don't see many people working on documents with their iPhone or BlackBerry; I do see people all the time using their iPhones to browse the web.

On a side note, this makes Apple's stubbornness over Flash even more puzzling. Honestly, I might consider an iPhone if I were able to do everything on the web that I can do at my desk.


By NanoTube1 on 5/3/2010 5:20:29 PM , Rating: 2
I think you are missing the point.

Tablets have the potential to replace PCs for the vast majority of people. The iPad, for example, with its relative simplicity, is the advent of the mobile-PC-as-an-appliance, something no one could achieve before (for many reasons).

Most people use their PCs for the web, Skype, office work and some simple gaming - all of which are practical on a tablet device. If the majority of these ordinary users switch to iPad-like devices, the PC will return to being what it was designed for in the first place - a workstation for relatively complex tasks.

I never believed in the death of the PC and I still don't, but what we are witnessing here is the beginning of a major shift - and NVIDIA sees it as its opportunity to become a leader in the CPU/GPU market.


"This is from the DailyTech.com. It's a science website." -- Rush Limbaugh














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki