
Intel says CUDA will be nothing but a footnote in computer history

Intel and NVIDIA compete in many different ways. The most notable is chipset manufacturing, but the two companies also compete in the integrated graphics market, where Intel's chips lead.

NVIDIA started competing with Intel in the data processing arena with its CUDA programming model. Pat Gelsinger, co-general manager of Intel's Digital Enterprise Group, told Custom PC that CUDA would be nothing more than an interesting footnote in the annals of computing history.

According to Gelsinger, programmers simply don’t have enough time to learn how to program for new architectures like CUDA. Gelsinger told Custom PC, “The problem that we’ve seen over and over and over again in the computing industry is that there’s a cool new idea, and it promises a 10x or 20x performance improvements, but you’ve just got to go through this little orifice called a new programming model. Those orifices have always been insurmountable as long as the general purpose computing models evolve into the future.”
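
For readers unfamiliar with what that "new programming model" looks like in practice, here is a minimal, hypothetical sketch (not from the article; the kernel and variable names are invented for illustration). Instead of a plain serial loop, the CUDA programmer manages a separate device memory space, copies data explicitly, and launches a kernel across thousands of threads:

// Hedged sketch of the CUDA programming model: explicit device memory
// management plus a kernel launched over many parallel threads.
#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>

__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per element
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host-side data.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // The "new model": a second memory space and explicit copies,
    // instead of a plain serial loop over one array.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);   // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

The CPU equivalent is a three-line for loop; that gap in familiarity is exactly the "orifice" Gelsinger is describing.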

The Cell architecture, backed by Sony, Toshiba, and IBM, illustrates the point, according to Gelsinger: it promised huge performance gains over conventional architectures, but it still isn't widely supported by developers.

Intel's Larrabee graphics chip will be based entirely on Intel Architecture x86 cores, says Gelsinger, so that developers can program the graphics processor without having to learn a new language. Larrabee will also have full support for APIs such as DirectX and OpenGL.

NVIDIA's CUDA architecture is what makes it possible to run complex physics calculations on the GPU, enabling PhysX to execute on the GPU rather than the CPU.
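
As a rough, hypothetical illustration of what "physics on the GPU" means at the code level (this is not PhysX source; the function and parameter names are invented), a data-parallel physics step maps one GPU thread to one particle:

// Hypothetical sketch of a GPU physics step: explicit Euler integration
// under gravity, one thread per particle. Illustrative only, not PhysX code.
#include <cuda_runtime.h>   // float3 / make_float3 when built with nvcc

__global__ void integrateParticles(float3 *pos, float3 *vel, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;                                  // one thread per particle

    const float3 g = make_float3(0.0f, -9.81f, 0.0f);    // gravity (m/s^2)

    // v += g*dt, then p += v*dt
    vel[i].x += g.x * dt;       vel[i].y += g.y * dt;       vel[i].z += g.z * dt;
    pos[i].x += vel[i].x * dt;  pos[i].y += vel[i].y * dt;  pos[i].z += vel[i].z * dt;
}

// Launched from the host much like the vecAdd example above, e.g.
// integrateParticles<<<blocks, threads>>>(dPos, dVel, n, 0.016f);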



Comments



CUDA a joke ???
By kilkennycat on 7/2/2008 4:45:04 PM , Rating: 2
quote:
According to Gelsinger, programmers simply don’t have enough time to learn how to program for new architectures like CUDA. Gelsinger told Custom PC, “The problem that we’ve seen over and over and over again in the computing industry is that there’s a cool new idea, and it promises a 10x or 20x performance improvements, but you’ve just got to go through this little orifice called a new programming model. Those orifices have always been insurmountable as long as the general purpose computing models evolve into the future.”


Er, Mr. Gelsinger, have you by any chance asked those in the oil industry, in the weather-forecasting business, in technical academic research, in high-tech engineering, and in professional image processing whether CUDA is a joke? These are customers with very deep pockets and time-critical computation needs that are not at all well served by traditional CPU farms. nVidia and ATi bring enormous parallel-computation power to exactly the types of applications that IBM and Toshiba are publicly touting as ideal for multiprocessor farms based on the Cell processor. Even in the semi-pro domain, Adobe is going full-bore on integrating GPGPU horsepower into its upcoming CS4 release. Is Intel afraid of being left out in the cold, with too little, too late? Larrabee as currently proposed looks as if it may fall into the crack of serving neither central-processing needs nor graphics/parallel processing well. The only really bright spot for Larrabee might be the laptop business, finally solving Intel's IGP problems while bringing decent compute performance.




RE: CUDA a joke ???
By Mitch101 on 7/2/2008 6:06:40 PM , Rating: 2
Why is it that no one thinks Intel can make a graphics chip?

Intel never really tried to make a true 3D chip before. Their bread and butter was always corporate desktops, which didn't need 3D; they only added enough graphics to make the OS work well. Their current chips have just enough to run Vista, but they were never designed to be real graphics chips. Somehow I don't believe they were designed with Crysis in mind. That's why there are PCI-E slots on the mobo.

Somehow I think Intel will figure out how to draw a polygon and do it really fast.

Until you spot a PCI-E graphics card from Intel, I would suggest not doing what NVIDIA does and running off at the mouth.

Somehow I think the Intel bunny men engineers will have the last laugh when the time is right.

BTW, Intel's bottom line seems to be doing incredibly well without a 3D graphics card to save it from bankruptcy. A GPU doesn't run very well without a mobo and CPU, but a CPU runs quite well without a GPU. GPUs are still dependent on a CPU; until they can run without one, the GPU will need to play catch-up.

Did everyone catch the call about NVIDIA lowering their revenue forecast?


RE: CUDA a joke ???
By SavagePotato on 7/3/2008 10:55:15 AM , Rating: 1
Most likely because Intel currently makes very lousy IGP solutions, and did indeed enter the discrete graphics arena once already. It was a dismal failure.

The Intel i740.
http://en.wikipedia.org/wiki/Intel740

We have one of those sitting on a shelf collecting dust somewhere in the box.


RE: CUDA a joke ???
By CBone on 7/2/2008 6:54:13 PM , Rating: 2
He's right. It is a pain to have to learn a new programming model on the strength of touted huge performance gains, only to get a fraction of them in the end. The industries you mentioned would rather not have to use CUDA or CTM or any other new language. Do you know how much work it would take to troubleshoot and rewrite all of their code for compatibility, or to build new software from scratch?

quote:
Er, Mr. Geisinger, have you by any chance asked those in the oil industry and in the weather forecasting business, in technical academic research and in high-tech engineering industries and in the professional image-processing industries whether CUDA is a joke??


That sounds good, but it's a lot of work, new hardware, and research just to use these GPUs. If Larrabee can bring the speed while letting a company's current programming staff use what they already know, Intel will win, provided it can come anywhere close to its performance estimates.


"If you look at the last five years, if you look at what major innovations have occurred in computing technology, every single one of them came from AMD. Not a single innovation came from Intel." -- AMD CEO Hector Ruiz in 2007

Related Articles
GeForce 8 To Get Software PhysX Engine
February 15, 2008, 10:33 AM












