

84 comment(s) - last by dwalton.. on Jul 16 at 4:09 PM

Intel says CUDA will be nothing but a footnote in computer history

Intel and NVIDIA compete in many different ways. The most notable place we see competition between the two companies is in chipset manufacturing. Intel and NVIDIA also compete in the integrated graphics market where Intel’s integrated graphics chips lead the market.

NVIDIA started competing with Intel in the data processing arena with its CUDA programming model. Intel's Pat Gelsinger, co-general manager of Intel's Digital Enterprise Group, told Custom PC that NVIDIA's CUDA programming model would be nothing more than an interesting footnote in the annals of computing history.

According to Gelsinger, programmers simply don’t have enough time to learn how to program for new architectures like CUDA. Gelsinger told Custom PC, “The problem that we’ve seen over and over and over again in the computing industry is that there’s a cool new idea, and it promises a 10x or 20x performance improvements, but you’ve just got to go through this little orifice called a new programming model. Those orifices have always been insurmountable as long as the general purpose computing models evolve into the future.”

The Sony Cell architecture illustrates the point, according to Gelsinger. Cell promised huge performance gains compared to conventional architectures, but it still isn't widely supported by developers.

Intel's Larrabee graphics chip will be based entirely on Intel Architecture x86 cores, says Gelsinger. The reason for this is so that developers can program the graphics processor without having to learn a new language. Larrabee will have full support for APIs like DirectX and OpenGL.

NVIDIA's CUDA architecture is what makes it possible to process complex physics calculations on the GPU, enabling PhysX to run on the GPU rather than the CPU.



Comments


This article is over a month old; voting and posting comments are disabled

RE: pwnd
By omnicronx on 7/2/2008 4:59:33 PM , Rating: 2
You seem to have missed the biggest point of my post: you can't compare the Cell architecture, which was designed for mainstream consumer use, with a GPGPU-only architecture such as CUDA. CUDA will only be used in environments where you want to squeeze out every extra bit of performance, and to tell you the truth, I really don't see programmers having a problem. And just so you know, CUDA is incredibly similar to a stripped-down version of C, so it's not going to be night and day here either.

I also think you overestimate the power of Intel. A line of GPUs will require new fabs, as I don't see them just taking over current CPU fabs, especially when die sizes are totally different in the GPU world. It's not like Intel can use all of its resources and just disregard its CPU line. Any way you look at it, Intel is going to be playing catch-up, and personally I don't think a GPU line that scales across three separate platforms (GPU, mobile GPU, GPGPU) is the answer. I would only expect AMD and NVIDIA to turn on the afterburners to distance themselves from an already distant Intel.


RE: pwnd
By Mitch101 on 7/2/2008 5:23:54 PM , Rating: 2
Sure you can; it all comes down to IPC, but you have to consider what kind of instructions it will be processing.

Intel's won't require a new fab or anything special; there is no magic in making a chip. Somehow I think Intel's engineers are better at chip fabbing than NVIDIA's.

To underestimate Intel would be the kiss of death. I would believe you overestimate NVIDIA.

If AMD had any kind of afterburners, they would have used them on the CPU. These miracle afterburners are just imaginary. NVIDIA has no afterburners either; otherwise they wouldn't be worried about ATI's R770 (is it?).

However, Intel does have magic afterburners, probably 32nm ones soon, where NVIDIA doesn't own any and must settle for 55nm ones.

Even if Intel doesn't take the crown, they only need to reach the mainstream level, and then they can kill NVIDIA on price.


RE: pwnd
By Mitch101 on 7/2/2008 5:29:53 PM , Rating: 2
Nvidia expects lower second-quarter revenue, gross margin

http://www.marketwatch.com/News/Story/Story.aspx?g...

Even a god king can bleed.


"If you mod me down, I will become more insightful than you can possibly imagine." -- Slashdot

Related Articles
GeForce 8 To Get Software PhysX Engine
February 15, 2008, 10:33 AM

Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki