
Intel looks to get into the graphics market in 2009, and in a big way

Next month heralds the 2008 Spring Intel Developer Forum in Shanghai, China.  Pre-show briefings opened with a quick mention of the status of Larrabee, Intel's upcoming graphics core.

Larrabee differs significantly from AMD's Radeon and NVIDIA's GeForce processors.  For starters, Larrabee is based on the x86 instruction set found in Intel's CPUs.  Intel vice president Steve Smith emphasized that Larrabee is not just a GPU, but a multi-core die capable of any stream processing task.

Smith would not detail exactly how many cores reside on Larrabee, though early schematics from 2006 detail designs with 16 cores.  Each in-order core is capable of operating in excess of 2 GHz.

Larrabee can apparently scale to several thousand cores, sharing much of the same research as Intel's Tera-scale project.  In addition to the x86 approach, the company said it will soon announce another SSE-like extension set, dubbed Advanced Vector Extensions. These extensions will likely be what separates Larrabee's x86 instruction set from the x86 instructions featured on Core 2 Duo and Phenom.  Smith said Larrabee will support OpenGL, DirectX and ray-tracing instructions.

However, to the disappointment of many, Larrabee will not find a home on 45nm Nehalem processors, scheduled for an early 2009 launch. Smith said Larrabee samples will be ready in Q4 2008, with shipments in 2009, though the initial launch appears to be only for discrete computing.

A fully integrated CPU design will likely not come until the next architecture redesign after Nehalem, codenamed Sandy Bridge (previously Gesher).

An Intel engineer tells DailyTech, "You have to walk before you can run." Walking, at Intel, means discretely connecting a much simpler GPU to the processor first.

Intel’s hybrid CPU and GPU chips are set to be released in two flavors, both of which will be based on the Nehalem CPU architecture. The first version, dubbed Havendale, will be a desktop chip, while the second version, dubbed Auburndale, will be a notebook chip.

Auburndale and Havendale will have two Nehalem cores paired with a graphics subsystem. The twin cores will share 4MB of L2 cache and feature an integrated dual-channel memory controller that supports memory configurations up to DDR3-1333.

The graphics subsystem will be initially derived from Intel’s G45 integrated graphics. This indicates that neither Auburndale nor Havendale will be for heavy graphics processing, but will be more of an integrated graphics replacement.

According to Intel roadmaps, the new processors are expected to enter the market in the first half of 2009. This beats out the expected time of arrival of AMD’s Fusion processors, which are planned to debut in the second half of 2009.

In the meantime, Smith promises the discrete Larrabee offerings will be competitive with Radeon and GeForce products when they finally launch.

Comments

By KristopherKubicki on 3/17/2008 5:18:46 PM , Rating: 5
I think the hybrid CPUs for Intel's low end make the most sense for a Nehalem chip. With current Intel chipsets, the GPU is integrated into the NB because it's a low-cost solution. With Nehalem, there's no reason for the NB anymore, so what will Intel offer to low-end customers? Why sell two chips when one does just fine?

Especially for the mobile market, there's going to be a big push for this, even though the GPU is going to fall down when positioned against a discrete mobile GPU.

By Orbs on 3/17/2008 5:21:11 PM , Rating: 5
I agree completely that it makes sense to integrate to the CPU. I also agree that the integrated market isn't really competing against the discrete market.

To my knowledge however, all Intel IGPs currently offer at least full DX9 support. This one won't. Why would someone 'upgrade' to a Nehalem chip if it means fewer features (like losing Aero on Vista, or not being able to play a game they could play before)?

The end-user doesn't care if the graphics are on a Northbridge or the CPU.

By KristopherKubicki on 3/17/2008 5:22:40 PM , Rating: 3
It will support everything G45 does already -- and that includes DX9 and DX10 ... but not all of the instructions like you'd get on a GeForce or Radeon right now

By Orbs on 3/17/2008 5:25:05 PM , Rating: 2
So when the article says:

In fact, both graphics cores leave out support for key features of DirectX 9 and DirectX 10.

is it only referring to optional features?

By KristopherKubicki on 3/17/2008 5:32:40 PM , Rating: 2
The roadmap doesn't really say. I'd assume it's the optional features of DX9, but probably most of DX10.

By ET on 3/17/2008 5:59:14 PM , Rating: 5
Makes no sense. DX10 has few optional features, you can't just do "most of DX10". And if you're doing DX10 you should be able to do all of DX9 including the optional features.

By KristopherKubicki on 3/18/2008 9:28:35 AM , Rating: 2
I removed that statement from the article. Some of my notes are inconsistent on that matter.

By Samus on 3/17/2008 5:57:42 PM , Rating: 2
Time for nVidia to get into the CPU biz.

By imperator3733 on 3/17/2008 9:02:22 PM , Rating: 2
That's what I've been thinking for a while now.

By MrDiSante on 3/17/2008 10:55:29 PM , Rating: 3
Er... Unless they're planning to outsource to TSMC (which will be problematic when they're going against giants like Intel, who have a ridiculous advantage in terms of manufacturing process), they might be in over their heads. They have no experience whatsoever with building fabs or anything like that.

By rudy on 3/18/2008 12:51:27 AM , Rating: 2
Get in bed with IBM

By JKflipflop98 on 3/18/2008 2:47:34 AM , Rating: 2
It's working for AMD! ;)

By 16nm on 3/18/2008 2:52:40 PM , Rating: 2
It's working for AMD! ;)

That remains to be seen. But let's hope 45nm will be on par with or better than Intel's product.

By JKflipflop98 on 3/21/2008 11:40:45 AM , Rating: 2
Actually, I was pointing out the irony that AMD is already in bed with IBM and they still suck balls.

By amx on 3/18/2008 3:55:02 AM , Rating: 2
It is not as easy as it seems.
It's pretty much impossible for anyone else to start making x86-based CPUs. Licensing x86 from Intel/AMD would run into hundreds of millions, if not billions.

Why would Intel give x86 license to Nvidia?

By Calin on 3/18/2008 7:44:30 AM , Rating: 2
Also, a non-x86 architecture probably won't have any more success than Itanium did (Transmeta's VLIW code-morphing technology also saw little success).

By sabrewulf on 3/19/2008 9:46:55 AM , Rating: 2
I guess you haven't heard that nVidia (Nvidia, nvidia, however they capitalize it now) is trying to buy Via... Via already has an x86 license and a line of CPUs. This doesn't necessarily mean anything of course, I'm just pointing it out as interesting.

Copyright 2016 DailyTech LLC.