



Larrabee wafer shown at IDF
Killed by performance of ATI's Radeons

Intel Corporation may make some of the best CPUs out there, but it has had limited success making GPUs. It is the largest supplier of computer graphics hardware thanks to the integrated graphics in its chipsets, but performance has always been a weak point. Intel introduced the i740 discrete GPU in 1998 to help popularize the AGP interface, but disappointing performance meant it disappeared from shelves after only a short time on the market.

A decade later, the company had hoped to make a discrete GPU comeback with Larrabee, a 45nm, 32-core GPU that would use x86 instructions. However, the program has been repeatedly delayed even as advanced new GPU designs from ATI and NVIDIA have come to market.

Intel has now decided to cancel the consumer GPU version of Larrabee, which was supposed to come out next year. It was expected to deliver two teraFLOPS of performance, but ATI broke that barrier earlier this year. A teraFLOPS is one trillion floating-point operations per second, and is a common indicator of CPU and GPU throughput.
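As a rough worked example, peak single-precision throughput is usually quoted as shader cores × clock speed × floating-point operations per clock. The short sketch below plugs in the Radeon HD 5870's published figures (1,600 stream processors at 850 MHz, two operations per clock via multiply-add) to arrive at the 2.72 teraFLOPS mentioned below; it is an illustration of the arithmetic, not a benchmark.

```cpp
#include <cstdio>

// Rough peak-FLOPS estimate: shader cores x clock x FLOPs issued per core per clock.
// The numbers below are the Radeon HD 5870's published specs, used purely as an example.
int main() {
    const double cores         = 1600;    // stream processors
    const double clock_hz      = 850e6;   // 850 MHz core clock
    const double flops_per_clk = 2;       // one multiply-add counts as 2 floating-point ops
    const double peak_flops    = cores * clock_hz * flops_per_clk;
    std::printf("Theoretical peak: %.2f teraFLOPS\n", peak_flops / 1e12);  // ~2.72
    return 0;
}
```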

AMD's graphics division launched the 40nm Radeon HD 5870, with 2.72 teraFLOPS of peak performance, just before Intel showed off a prototype of Larrabee at the Intel Developer Forum in September. The recent launch of the dual-GPU Radeon HD 5970, at nearly 5 teraFLOPS, was the final nail in the coffin. Intel decided that Larrabee simply wouldn't be able to compete on price or performance.

"Larrabee silicon and software development are behind where we hoped to be at this point in the project," stated Intel in a email to DailyTech.

"As a result, our first Larrabee product will not be launched as a standalone discrete graphics product, but rather be used as a software development platform for internal and external use".

"While we are disappointed that the product is not yet where we expected, we remain committed to delivering world-class many-core graphics products to our customers. Additional plans for discrete graphics products will be discussed some time in 2010," the statement concluded.

The initial software development platform will be launched next year. The company had previously stated that Larrabee was to be just the first of several GPUs.

Larrabee was supposed to combine the raw parallel throughput of a GPU with the general programming ability of a CPU. Intel often highlighted rendering features that are difficult to achieve on traditional GPUs, such as real-time ray tracing and order-independent transparency, which Larrabee would enable through its tile-based rendering approach.

Born out of Intel's many-core Terascale initiative, Larrabee's hardware is based on the Pentium P54C core, augmented with vector-processing units to enhance the performance of graphics and video workloads. Each core features a short, in-order instruction pipeline and supports four execution threads via Hyper-Threading, with each thread having its own register set. The short pipeline allows fast access to the L1 cache, while all cores share access to a partitioned L2 cache, with cache coherency across cores maintaining data consistency. Communication between the Larrabee cores is handled by a 1024-bit bidirectional ring bus.
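Intel never published its renderer in full, but the rough idea of how a tile-based software pipeline maps onto many general-purpose cores can be sketched in ordinary C++. Everything in the sketch below is an assumption for illustration (the tile size, the placeholder shade_tile routine, the dynamic work queue), not Larrabee-specific detail.

```cpp
#include <algorithm>
#include <atomic>
#include <cstdio>
#include <thread>
#include <utility>
#include <vector>

// Illustrative tile-based software renderer skeleton: the screen is split into
// small tiles and a pool of worker threads pulls tiles from a shared counter,
// so each tile's pixels stay resident in one core's cache while it is shaded.
constexpr int kWidth = 1920, kHeight = 1080, kTile = 64;  // assumed tile size

void shade_tile(std::vector<float>& framebuffer, int tx, int ty) {
    // Placeholder "shading". A real renderer would first bin primitives to this
    // tile, then rasterize and blend them; having all of a tile's fragments on
    // hand at once is what makes order-independent transparency tractable.
    for (int y = ty; y < std::min(ty + kTile, kHeight); ++y)
        for (int x = tx; x < std::min(tx + kTile, kWidth); ++x)
            framebuffer[static_cast<size_t>(y) * kWidth + x] = float(x ^ y) / kWidth;
}

int main() {
    std::vector<float> framebuffer(static_cast<size_t>(kWidth) * kHeight, 0.0f);

    // Build the list of tiles covering the framebuffer.
    std::vector<std::pair<int, int>> tiles;
    for (int ty = 0; ty < kHeight; ty += kTile)
        for (int tx = 0; tx < kWidth; tx += kTile)
            tiles.emplace_back(tx, ty);

    // Worker threads grab tiles dynamically, which balances load when some
    // tiles carry far more geometry than others.
    std::atomic<size_t> next{0};
    auto worker = [&] {
        for (size_t i = next++; i < tiles.size(); i = next++)
            shade_tile(framebuffer, tiles[i].first, tiles[i].second);
    };

    unsigned threads = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < threads; ++t) pool.emplace_back(worker);
    for (auto& t : pool) t.join();

    std::printf("Shaded %zu tiles with %u threads\n", tiles.size(), threads);
    return 0;
}
```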

Intel recognized the importance of software and drivers to Larrabee's success, leading to the creation of the Intel Visual Computing Institute at Saarland University in Saarbrücken, Germany. The lab conducts basic and applied research in visual and parallel computing.

The Larrabee roadmap showed a future 48-core version built on a 32nm process, and hinted at more powerful versions built using Intel's 22nm SoC process in 2012. While we may or may not eventually see products based on Larrabee, the cancellation of the consumer Larrabee GPU means Intel can devote more resources to improving the 32nm integrated graphics that will be found in next year's Sandy Bridge chips. Nevertheless, the whole affair is an embarrassment for Intel and a major retreat in its ongoing battle with AMD and NVIDIA.



Comments



Oh Intel...
By StevoLincolnite on 12/6/2009 10:25:31 PM , Rating: 2
Would have been nice for you to re-enter the graphics chip market, but first you need to realize that the competition does not sit still in the GPU industry. ATI and NVIDIA build multiple generations of GPUs in tandem, so there isn't a 2-3 year delay between GPU releases like we had in the 1990's. (3dfx was a big culprit for that, and look where that got them!)

I personally had high hopes but low expectations, having seen people get burnt multiple times in the past by the Intel "Decelerators," with their poor drivers, lacking feature sets, and erratically poor performance.

Intel, instead of trying to develop and support/build the GPU "your way" - Try taking a note from your competitors and have multiple teams, and a top-notch driver development crew, so you are always ready for the competition. - And actually build a decent IGP!




RE: Oh Intel...
By Nik00117 on 12/7/2009 3:25:16 AM , Rating: 5
I don't think Intel can do it, but ATI and NVIDIA are good at it though. :) One releases a GPU one week, the other releases a better one the week after, and so forth. I then watch great GPUs drop down to $60 a pop.


RE: Oh Intel...
By quiksilvr on 12/7/2009 3:08:23 PM , Rating: 2
The only thing I could see happening (which won't happen anytime soon) is Intel buying nVidia. If you can't beat 'em, buy 'em out.


RE: Oh Intel...
By Anoxanmore on 12/7/2009 9:15:39 AM , Rating: 2
ATI: HELLO! MY NAME IS ATI! YOU ATTEMPTED TO KILL MY GPU MARKET! PREPARE TO DIE!
[ATI corners Intel, knocks his GPU aside, and slashes his motherboard, giving him a scar just like ATI's]
ATI: Offer me money.
Intel: Yes!
ATI: Power, too, promise me that.
[He slashes his other side of the motherboard]
Intel: All that I have and more. Please...
ATI: Offer me anything I ask for.
Intel: Anything you want...
[Intel knocks ATI's GPU aside and lunges. But ATI traps his arm and aims his GPU at Intel's stomach]
ATI: I want my market share back, you son of a bitch!
[He runs Intel through and shoves him back against the table. Intel falls to the floor, dead]

: - )


RE: Oh Intel...
By morgan12x on 12/7/2009 10:45:00 AM , Rating: 2
The PC bride?


RE: Oh Intel...
By Anoxanmore on 12/7/2009 12:20:56 PM , Rating: 2
Aww... people didn't find it funny. T.T : - (

I like the PC Bride, good revamped title.


RE: Oh Intel...
By jonmcc33 on 12/7/2009 9:21:36 AM , Rating: 2
I think that Intel should just partner up with AMD/ATI for dedicated use on its chipsets. That would be awesome given the Intel vs. NVIDIA debacle.


RE: Oh Intel...
By elgueroloco on 12/8/2009 5:18:18 PM , Rating: 2
Yes, Intel should partner up with ATI, the GPU arm of its chief CPU competitor, giving AMD more R&D revenue to make competing products. I'm sure Intel would love to return to the days when AMD made better chips for half the price.


CUDA is not a new language...
By jconan on 12/6/2009 7:28:44 PM , Rating: 3
CUDA is basically C with extensions, and it is also available for C++ and Fortran. http://www.nvidia.com/object/cuda_education.html
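As a small illustration of that point, here is a minimal SAXPY kernel sketch (not taken from NVIDIA's materials): the arithmetic is plain C, while __global__, the built-in thread indices, and the <<<...>>> launch syntax are the CUDA additions.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Plain C arithmetic wrapped in CUDA's extensions: __global__ marks a GPU entry
// point, and blockIdx/blockDim/threadIdx replace the loop counter.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // unified memory keeps the example short
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);  // <<<grid, block>>> launch
    cudaDeviceSynchronize();

    std::printf("y[0] = %.1f\n", y[0]);  // expect 5.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```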




RE: CUDA is not a new language...
By Samus on 12/7/2009 6:40:39 AM , Rating: 3
So yes, C++ applications can be recompiled in CUDA.

Will they perform well? No. Guaranteed. None of them.

Code needs to be optimized for the architecture, not the language. It would be a complete joke to have current GPU architecture run even basic multithreaded programs. Intel recognized this and came up with a rather revolutionary idea of having programs that perform well on GPU architecture compiled to run x86 instructions.

That said, I'd bet the issues they are running into are software and developer support related.


Shame
By corduroygt on 12/7/2009 9:05:36 AM , Rating: 4
GPUs today use too much power and generate too much heat; I was hoping Intel and its legendary manufacturing processes would change that...




RE: Shame
By Regs on 12/17/2009 12:40:24 PM , Rating: 2
Have you seen the size of these suckers? People commented above about how Intel would not be able to keep up with ATI or NVIDIA. However, all ATI and NVIDIA have been doing is making the damn things bigger to keep up in the fight for the performance crown. Not too hard, since 3D rendering is almost infinitely parallel in nature compared to typical x86 workloads.


Link to Anandtech's Prime
By AnnihilatorX on 12/7/2009 6:54:07 AM , Rating: 5
Since this is relevant, might as well link it.

http://www.anandtech.com/cpuchipsets/showdoc.aspx?...




Faux Pas?
By JumpingJack on 12/6/2009 7:37:02 PM , Rating: 2
quote:
Larrabee was supposed to combine the raw throughput of a CPU with the parallel programming ability of a GPU


You mean, combine the raw throughput of a GPU with the general programming ability of a CPU.

GPUs are throughput monsters, but have limitations in programmability. CPUs are general featured and are supported by a large software base, and are generally more programmable.




Intel vs. nVidia
By dagamer34 on 12/6/2009 7:42:37 PM , Rating: 2
Wasn't it more about the fact that, because Intel wanted to sue NVIDIA out of the chipset business, it no longer had access to any of NVIDIA's GPU IP (which is a lot, I hear)?








Late Much?
By ImSpartacus on 12/6/09, Rating: -1
RE: Late Much?
By sprockkets on 12/7/2009 6:10:07 PM , Rating: 2
Don't criticize them, the dailytech lovers will mod you down for the drivel they get from Jason Mick.


Uh, I don't get it.
By sprockkets on 12/6/09, Rating: -1
RE: Uh, I don't get it.
By PorreKaj on 12/7/2009 2:24:49 AM , Rating: 4
I don't read anandtech...

'nuff said


RE: Uh, I don't get it.
By sprockkets on 12/7/09, Rating: 0
"We basically took a look at this situation and said, this is bullshit." -- Newegg Chief Legal Officer Lee Cheng's take on patent troll Soverain













