

Print 38 comment(s) - last by R0B0Ninja.. on Aug 8 at 7:42 AM

NVIDIA says its GPUs are in fact programmable in C language

Yesterday, DailyTech ran a story about details on Intel's upcoming Larrabee architecture for the graphics market. One of Intel's most important talking points when it plays up the benefits of Larrabee over NVIDIA's GPUs is the fact that NVIDIA's GPUs require developers to learn a new programming language called CUDA.

Intel says that with its Larrabee architecture, developers can simply program in C or C++ just as they would for any other x86 processor. According to Intel, the ability to program Larrabee in C or C++ makes it much easier for developers to port applications from other platforms to the Larrabee architecture.

After DailyTech ran the story, NVIDIA wanted to address what it considers to be misinformation when it comes to CUDA. NVIDIA says:

CUDA is a C-language compiler that is based on the PathScale C compiler. This open source compiler was originally developed for the x86 architecture. The NVIDIA computing architecture was specifically designed to support the C language - like any other processor architecture. Competitive comments that the GPU is only partially programmable are incorrect - all the processors in the NVIDIA GPU are programmable in the C language.

NVIDIA's approach to parallel computing has already proven to scale from 8 to 240 GPU cores. Also, NVIDIA is just about to release a multi-core CPU version of the CUDA compiler. This allows the developer to write an application once and run across multiple platforms. Larrabee's development environment is proprietary to Intel and, at least disclosed in marketing materials to date, is different than a multi-core CPU software environment.

Andrew Humber from NVIDIA distilled things a bit further saying, "CUDA is just our brand name for the C-compiler. They aren't two different things."

Humber also pointed out that at NVIDIA's financial analyst day in April it showed an astrophysics simulation running on integrated graphics with an eight-core GPU, a GeForce 8 series GPU with 128 cores and a quad-core CPU. NVIDIA says that the demonstration used exactly the same binary program across the range of GPUs and the exact same source code for the CPU and GPU.



Comments



You sure about that?
By marsbound2024 on 8/5/2008 1:38:05 PM , Rating: 5
quote:
and a quad-core CPU with four cores...


Pretty revolutionary, that.




RE: You sure about that?
By Sulphademus on 8/5/2008 1:42:18 PM , Rating: 5
Department of Redundancy Department.


RE: You sure about that?
By Brandon Hill (blog) on 8/5/2008 1:42:29 PM , Rating: 5
Not as revolutionary as a dozen of Krispy Kreme doughnuts which come 12 to a box.


RE: You sure about that?
By Oregonian2 on 8/5/2008 6:04:01 PM , Rating: 2
144 doughnuts are a lot of doughnuts!

:-)

Although I know what you meant to say...

:-)


RE: You sure about that?
By killerroach on 8/5/2008 10:16:27 PM , Rating: 5
quote:
144 doughnuts are a lot of doughnuts!

:-)

Although I know what you meant to say...

:-)


That's just [a] gross. :)


RE: You sure about that?
By JonnyDough on 8/7/2008 12:02:29 AM , Rating: 2
Not to a cop! Nothing against cops, unless they're here on behalf of the RIAA. Then I might not be too happy with them.


RE: You sure about that?
By tmouse on 8/6/2008 3:49:32 PM , Rating: 2
That's why I always prefer a baker's dozen. ;)


RE: You sure about that?
By Amiga500 on 8/5/2008 1:49:45 PM , Rating: 2
Could have been saved with an emergency switch to:

and a quad-core CPU with four threads

But alas, it was too late - the sentence was confined to the grammatical underworld where half of my posts belong.



Related Articles
Intel Talks Details on Larrabee
August 4, 2008, 12:46 PM












