


Intel says CUDA will be nothing but a footnote in computer history

Intel and NVIDIA compete on several fronts. The most notable is chipset manufacturing; the two companies also compete in integrated graphics, a market Intel’s integrated graphics chips currently lead.

NVIDIA started competing with Intel in the data processing arena with the CUDA programming language. Intel’s Pat Gelsinger, co-general manager of Intel’s Digital Enterprise Group, told Custom PC that NVIDIA’s CUDA programming model would be nothing more than an interesting footnote in the annals of computing history.
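For readers unfamiliar with what Gelsinger is dismissing, the sketch below shows roughly what CUDA code looks like: a kernel that runs once per GPU thread, plus host code that copies data to the card and launches a grid of threads. It is a minimal illustrative example, not taken from the article, and the function and variable names are hypothetical.

    // Minimal CUDA sketch (illustrative only). Each GPU thread adds one pair
    // of elements; the host allocates device memory and launches the grid.
    #include <cuda_runtime.h>

    __global__ void vectorAdd(const float *a, const float *b, float *c, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // global thread index
        if (i < n)
            c[i] = a[i] + b[i];
    }

    void addOnGpu(const float *a, const float *b, float *c, int n)
    {
        float *dA, *dB, *dC;
        size_t bytes = n * sizeof(float);
        cudaMalloc((void **)&dA, bytes);
        cudaMalloc((void **)&dB, bytes);
        cudaMalloc((void **)&dC, bytes);
        cudaMemcpy(dA, a, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dB, b, bytes, cudaMemcpyHostToDevice);

        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        vectorAdd<<<blocks, threads>>>(dA, dB, dC, n);   // CUDA's kernel-launch syntax

        cudaMemcpy(c, dC, bytes, cudaMemcpyDeviceToHost);
        cudaFree(dA);
        cudaFree(dB);
        cudaFree(dC);
    }

The launch syntax and the explicit host/device memory management are exactly the kind of "new programming model" Gelsinger argues developers will not adopt.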

According to Gelsinger, programmers simply don’t have enough time to learn how to program for new architectures like CUDA. Gelsinger told Custom PC, “The problem that we’ve seen over and over and over again in the computing industry is that there’s a cool new idea, and it promises a 10x or 20x performance improvement, but you’ve just got to go through this little orifice called a new programming model. Those orifices have always been insurmountable as long as the general purpose computing models evolve into the future.”

The Cell architecture illustrates the point, according to Gelsinger. Cell, developed by Sony together with Toshiba and IBM, promised huge performance gains over conventional architectures, yet it still isn’t widely supported by developers.

Intel’s Larrabee graphics chip will be based entirely on Intel Architecture x86 cores, says Gelsinger, so that developers can program the graphics processor without having to learn a new language. Larrabee will have full support for APIs such as DirectX and OpenGL.

NVIDIA’s CUDA architecture is what makes it possible to run complex physics calculations on the GPU, enabling PhysX to run on the GPU rather than the CPU.
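The article does not describe the PhysX API itself, but the kind of work involved is easy to sketch. Below is a hypothetical, simplified particle-integration kernel (not PhysX code) showing why physics maps well to CUDA: thousands of independent particles can each be updated by their own GPU thread.

    // Hypothetical sketch, not the actual PhysX API: a simple explicit-Euler
    // particle update. Each thread advances one particle, so thousands of
    // particles are processed in parallel.
    __global__ void integrateParticles(float3 *pos, float3 *vel, int n, float dt)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n)
            return;

        vel[i].y -= 9.81f * dt;        // apply gravity to the velocity
        pos[i].x += vel[i].x * dt;     // advance position by velocity
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }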






Intel Is Scared
By AggressorPrime on 7/2/2008 1:52:55 PM , Rating: 5
For Gaming
What else can we expect Intel to say, that a competitor is making a product that will steal the industry? Take it from the game programmers' mouths, not Intel's.
What has Intel ever done for games? The Extreme Edition? Take a $500 CPU, add $1,000 to the price tag, add 200MHz, and make it easier to overclock (a feature all CPUs shared in the past). Spending that extra $1,000 on GPUs gets you a lot more for your money. And even Intel's fastest CPUs are still a bottleneck: no CPU can fully feed three GeForce GTX 280s at stock speeds. So instead of waiting for a CPU powerful enough to do physics, Ageia was born, and when its cards failed to sell in volume, nVidia bought the company. Now every gamer with a DX10 card (nVidia or AMD) can have much faster physics than any CPU can deliver. Larrabee may be faster than the CPU, but it can't compete with the GT200 giant. Intel's best bet is to replace the standard CPUs in gaming machines with Larrabee CPUs.

For Videos
Intel has failed again here. For decoding, you can buy a $50 GPU that plays back 1080p Blu-ray without lag; you would have to spend $200 on a CPU, either a fast dual core or a quad core, to get the same effect. And then there is encoding. Has Intel even seen the huge performance gains the GT200 delivers when encoding HD video?

Other
Intel is saying that CUDA, and GPGPU in general, is difficult to use, so no one will use it. Yet in addition to the things mentioned above, we are starting to see PDFs rendered by the GPU rather than the CPU, a general application! The makers of Photoshop have said they will make it use the GPU because of its advantages. GPUs also power wide-scale distributed computing projects like Folding@Home. All of these applications would run much slower on the CPU alone.

Parallel Age
We are entering the age of parallelization. It started with Intel's Hyper-Threading, continued with AMD's dual cores, and became mainstream when GPUs started processing general applications. Developers who want the best performance look first to nVidia's and AMD's GPUs and turn to the CPU only for what the GPU can't do. With the GT200 the gap shrank again, thanks to nVidia's introduction of double-precision floating-point operations. The only thing really missing is RAM, yet you can already get 4GB per GPU on CUDA cards. The best use of the CPU today is managing the traffic of high-powered GPUs.
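The double-precision point deserves a concrete note: GT200 is NVIDIA's first GPU generation to support double precision in CUDA (compute capability 1.3, selected with nvcc's -arch=sm_13 flag). The sketch below, with hypothetical names, shows a double-precision dot-product kernel of the sort that earlier GeForce hardware could not run; it assumes a block size of 256 threads.

    // Sketch (hypothetical names): double-precision dot product with a
    // block-level reduction. Requires compute capability 1.3 (GT200), e.g.
    //   nvcc -arch=sm_13 dot.cu
    __global__ void dotProduct(const double *a, const double *b,
                               double *partial, int n)
    {
        __shared__ double cache[256];                 // assumes 256 threads per block
        int tid = threadIdx.x;
        int i = blockIdx.x * blockDim.x + threadIdx.x;

        double sum = 0.0;
        for (; i < n; i += blockDim.x * gridDim.x)    // grid-stride loop
            sum += a[i] * b[i];
        cache[tid] = sum;
        __syncthreads();

        // tree reduction within the block
        for (int s = blockDim.x / 2; s > 0; s >>= 1) {
            if (tid < s)
                cache[tid] += cache[tid + s];
            __syncthreads();
        }
        if (tid == 0)
            partial[blockIdx.x] = cache[0];           // one partial sum per block
    }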

Roadrunner
http://en.wikipedia.org/wiki/IBM_Roadrunner
(Read "Hybrid design")
And may I remind Intel that Cell didn't drop off. Not only does it power the PS3, it made its way into the most powerful supercomputer on the planet (setting aside wide-scale distributed networks like Folding@Home, whose GPUs arguably make them more powerful still). Of course, Intel probably didn't get the memo about Roadrunner's construction, since it uses AMD CPUs, not Intel's.

Conclusion
The CPU is still important, but CUDA's power should be embraced, not discouraged. Intel's Larrabee should not be seen as a replacement for CUDA; I truly doubt it can come close to the performance we have seen from nVidia's GPUs, and given its launch date of late 2009 or early 2010, there is at least one more generation of GPUs coming, most likely with twice the performance of the last, putting about 2 teraFLOPS on a single chip. Larrabee should instead be a replacement for the CPU, like what we see in gaming consoles. It is still x86, so it should work well with general applications, and it would be better able to assist the GPUs by removing the CPU bottleneck.




RE: Intel Is Scared
By Clauzii on 7/2/2008 3:44:57 PM , Rating: 1
I don't see why this got down to -1???


RE: Intel Is Scared
By AggressorPrime on 7/2/2008 6:12:02 PM , Rating: 3
Lol, looks like some Intel employees view this site.


RE: Intel Is Scared
By CBone on 7/2/2008 6:56:52 PM , Rating: 5
It got rated down because it reads like Nvidia PR comments.


"So, I think the same thing of the music industry. They can't say that they're losing money, you know what I'm saying. They just probably don't have the same surplus that they had." -- Wu-Tang Clan founder RZA













