



Working silicon of Larrabee, Intel's upcoming 2010 discrete GPU, was shown running a ray-traced scene from id Software's Enemy Territory: Quake Wars. The GPU will be the only desktop GPU based on an x86 architecture. Intel brags of its efficiency, saying the water effects (reflections, transparency, etc.) were accomplished with only 10 lines of C++ code.  (Source: CNET/YouTube)
Intel prepares to jump into the discrete graphics market

In the 1990s a plethora of graphics chip makers rose and fell.  Eventually only two were left standing -- NVIDIA and ATI.  ATI would later be acquired by AMD.  Though AMD was locked in a fierce battle with Intel in the multicore era, it also managed to score some points in the last-generation graphics war with its 4000 series (RV770) cards.

Before this happened, a third player, Intel, had attacked the market and quietly built up a dominant market share in low-end graphics.  Its integrated GPUs popped up in netbooks, low-end laptops, and even a number of low-end desktops.  Now the graphics market seems poised to flip yet again.  NVIDIA is on the verge of seizing much of Intel's low-end and mobile market share, thanks to its Ion platform and new mobile GPUs.  And AMD (ATI) looks poised to attack in the latest round of the discrete graphics wars with aggressively priced DirectX 11 GPUs.  That leaves Intel, which is preparing to take on NVIDIA and AMD with a brand new discrete graphics offering called Larrabee.

First revealed at SIGGRAPH in August of 2008, Larrabee is now reality.  Intel has just shown off working models of Larrabee GPUs at its Intel Developer Forum in San Francisco.

Built on a multicore die shrink of Intel's Pentium (P54C) architecture, the powerful new graphics chip rendered a ray-traced scene from the id Software game Enemy Territory: Quake Wars with ease.  Ray tracing is an advanced technique that has long been touted as an eventual replacement for rasterization in video games.  Currently it is used for the high-quality 3D animation found in many films.
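To give a rough sense of what such a demo computes, here is a minimal, self-contained C++ sketch of the technique: cast a ray per pixel, intersect it with the scene, and recursively follow mirror bounces off a flat "water" plane. Everything here (the toy scene, the shading constants, the single-bounce water) is an illustrative assumption; it is not Intel's demo code, which reportedly handled the real water effects in about 10 lines of C++.

```cpp
#include <cmath>
#include <cstdio>

// Illustrative toy ray tracer: one flat-shaded sphere over a reflective
// "water" plane. All scene values below are made-up assumptions.
struct Vec {
    double x, y, z;
    Vec operator+(Vec o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec operator-(Vec o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec operator*(double s) const { return {x * s, y * s, z * s}; }
    double dot(Vec o) const { return x * o.x + y * o.y + z * o.z; }
    Vec norm() const { double l = std::sqrt(dot(*this)); return *this * (1.0 / l); }
};

// Mirror direction d about surface normal n: d - 2(d.n)n.
Vec reflect(Vec d, Vec n) { return d - n * (2.0 * d.dot(n)); }

// Ray/sphere intersection: solve |o + t*d - c|^2 = r^2 for the nearest t > 0.
bool hitSphere(Vec o, Vec d, Vec c, double r, double& t) {
    Vec oc = o - c;
    double b = oc.dot(d);
    double disc = b * b - (oc.dot(oc) - r * r);
    if (disc < 0) return false;
    t = -b - std::sqrt(disc);
    return t > 1e-4;
}

// Trace one ray: grey sphere, mirror bounce off the water plane y = 0, else sky.
double trace(Vec o, Vec d, int depth) {
    double t;
    if (hitSphere(o, d, {0, 1, -4}, 1.0, t)) return 0.8;  // sphere: flat shade
    if (depth > 0 && d.y < 0) {                           // looking down at water
        double tp = -o.y / d.y;                           // ray/plane intersection
        Vec p = o + d * tp;
        return 0.3 + 0.6 * trace(p, reflect(d, {0, 1, 0}), depth - 1);
    }
    return 0.2;                                           // sky
}

int main() {
    // Emit a tiny greyscale PGM image so the sketch runs end to end.
    const int W = 64, H = 48;
    std::printf("P2\n%d %d\n255\n", W, H);
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            Vec d = Vec{(x - W / 2.0) / W, (H / 2.0 - y) / H + 0.2, -1.0}.norm();
            std::printf("%d ", (int)(255 * trace({0, 1.5, 0}, d, 2)));
        }
        std::printf("\n");
    }
}
```

Compiled and redirected to a .pgm file, this produces a tiny greyscale render. The point is that reflections fall out of a few lines of recursion, whereas a rasterizer needs special-case tricks (render-to-texture, environment maps) to fake the same effect.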

The demo also gave a peek at Gulftown, a six-core Core i9 processor.  Built on the upcoming Westmere architecture (a Nehalem die shrink), Gulftown is the "extreme edition" counterpart of Clarkdale (desktop) and Arrandale (laptop).

Larrabee, however, received the most interest.  The working chip was said to be on par with NVIDIA's GTX 285.  With AMD's latest offerings trouncing the GTX 285 in benchmarks, the real questions will be the power envelope, how many Larrabee chips Intel can squeeze onto a graphics board, what kind of CrossFire/SLI-esque scheme it can devise, and, most importantly, the price.

A direct comparison between NVIDIA's GTX 285 and Larrabee is somewhat misleading, though, because Larrabee is unique in several ways.  First, Larrabee supports x86 instructions.  Second, it uses tile-based rendering and performs tasks like z-buffering, clipping, and blending in software, where its competitors use dedicated hardware (Microsoft's Xbox 360 uses tile-based rendering as well).  Third, all of its cores have cache coherency.  In theory, these features make Larrabee easier to program games for than NVIDIA's and AMD's discrete offerings.
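To make the tile-based, software-pipeline idea concrete, below is a hedged C++ sketch of the binning pass such a renderer typically uses: triangles are sorted into fixed-size screen tiles so that each core can later rasterize, z-test, and blend one tile entirely out of its coherent cache. The tile size, data structures, and two-pass split are generic assumptions about tile-based renderers, not Intel's actual pipeline.

```cpp
#include <vector>
#include <algorithm>
#include <cstdio>

// Assumed illustrative parameters, not Larrabee's real configuration.
constexpr int TILE = 64;                              // 64x64-pixel screen tiles
constexpr int SCREEN_W = 1280, SCREEN_H = 720;
constexpr int TILES_X = (SCREEN_W + TILE - 1) / TILE;
constexpr int TILES_Y = (SCREEN_H + TILE - 1) / TILE;

struct Tri { float x[3], y[3]; };                     // screen-space triangle

// Pass 1: bin each triangle into every tile its bounding box overlaps.
// Each tile ends up with a compact work list that a single core can
// later rasterize, z-test, and blend without leaving its local cache.
std::vector<std::vector<int>> binTriangles(const std::vector<Tri>& tris) {
    std::vector<std::vector<int>> bins(TILES_X * TILES_Y);
    for (int i = 0; i < (int)tris.size(); ++i) {
        const Tri& t = tris[i];
        float minX = std::min({t.x[0], t.x[1], t.x[2]});
        float maxX = std::max({t.x[0], t.x[1], t.x[2]});
        float minY = std::min({t.y[0], t.y[1], t.y[2]});
        float maxY = std::max({t.y[0], t.y[1], t.y[2]});
        int tx0 = std::max(0, (int)minX / TILE);
        int tx1 = std::min(TILES_X - 1, (int)maxX / TILE);
        int ty0 = std::max(0, (int)minY / TILE);
        int ty1 = std::min(TILES_Y - 1, (int)maxY / TILE);
        for (int ty = ty0; ty <= ty1; ++ty)
            for (int tx = tx0; tx <= tx1; ++tx)
                bins[ty * TILES_X + tx].push_back(i);
    }
    return bins;
}

int main() {
    std::vector<Tri> tris = {{{10, 200, 100}, {10, 30, 180}},
                             {{600, 700, 650}, {300, 320, 400}}};
    auto bins = binTriangles(tris);
    // Pass 2 (per tile, one core each) would rasterize bins[t] into an
    // on-chip tile buffer, doing z-buffering and blending in software.
    for (int t = 0; t < (int)bins.size(); ++t)
        if (!bins[t].empty())
            std::printf("tile %d: %zu triangle(s)\n", t, bins[t].size());
}
```

This also illustrates the programmability argument: because binning, z-buffering, and blending are ordinary x86 code on Larrabee, a developer could in principle replace any of these stages, which is far harder when they are baked into fixed-function hardware.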

The GPU also still has some time to grow and be fine-tuned; it's not expected until the first half of 2010.  Intel is just now lining up board partners for the new card, so it should be interesting to see which companies jump on the bandwagon.

At IDF Intel also hinted at the rest of its upcoming graphics plans.  It says it's preparing to take a "big jump ahead" in integrated graphics with a product to be released next year, which will compete with the NVIDIA Ion.  Intel is also preparing a system-on-a-chip package pairing Larrabee with an Intel x86 processor, which it is using to target the mobile and handheld devices market.

It's unclear how low Intel can push the embedded Larrabee's power envelope or when the SoC will arrive.  Intel did make it clear, though, that it is targeting the chip at "non-PC" applications.



Comments



Hmm...
By StevoLincolnite on 9/24/2009 9:59:20 AM , Rating: 5
quote:
In the 1990s a plethora of graphics chip makers rose and fell. Eventually only two were left standing -- NVIDIA and ATI.


We have seen Intel compete with ATI and nVidia in the discrete graphics card market before with the Intel i740 AGP graphics card; however, it had poor driver support, much worse than their current drivers, which says something! It also had generally average performance, despite using the then-new AGP bus with its large bandwidth advantage over PCI.

We have lost a lot of graphics chip manufacturers, though some are still around. Matrox changed its focus away from the consumer market after the pretty dismal release of the Matrox Parhelia, which got pretty beaten down by the GeForce 4 Ti and Radeon 9xxx series.

S3, like Intel, went on to provide a series of IGPs (in its case for VIA), and also makes discrete GPUs under the "Chrome" banner.

SiS still has pretty dismal IGPs as well. And PowerVR/STMicroelectronics pretty much fell off the map after their Kyro cards, which used a tile-based renderer. That technology is also licensed by Intel for its IGPs, and PowerVR does compete well in the ultra-portable/low-power markets.

And the venerable 3dfx was gobbled up by nVidia; the ex-3dfx team ended up bringing us the pretty dismal GeForce FX, and the acquisition gave nVidia the rights to the acronym "SLI" as a method of combining two graphics cards to help render a scene. (ATI had this kind of technology back then as well, with the Rage Fury Maxx combining two GPUs on the same card.)

I officially feel old, as I've seen the rise and fall of many graphics chip companies and products, and seen games advance over the years. Intel is just adding another chapter; whether it's successful or not remains to be seen, but one thing is for sure: they have a lot of hard work ahead with their drivers if the i740/GMA series is anything to go by.

Back in the day (think pre-TnL) there were dozens of GPU manufacturers, all competing for a piece of the exploding pie. Most tried for the performance crown (3dfx), others tried for the best low-cost solutions (S3), some went for maximum API compatibility and image quality (nVidia and ATI), and some even focused on the best 2D image quality and 2D performance (Matrox). Yet all that remained after the dust settled were ATI and nVidia. Back then there was a massive amount of competition, and if you went with one manufacturer you might have been locked out of certain game releases. For instance, if you bought ATI or nVidia, you were locked out of 3dfx GLIDE-only games, and the reverse was true too, as 3dfx didn't support Direct3D on the level most other companies did.

How far we have come... Back then a graphics card release almost felt unexpected, as it seemed like every week there was a new GPU released; now we have a pretty solid tick-tock strategy employed by ATI and nVidia, where you can almost guess the month a card will be released well in advance.




RE: Hmm...
By MrRuckus on 9/24/2009 7:35:10 PM , Rating: 2
quote:
Back in the day (think pre-TnL) there were dozens of GPU manufacturers, all competing for a piece of the exploding pie. Most tried for the performance crown (3dfx), others tried for the best low-cost solutions (S3), some went for maximum API compatibility and image quality (nVidia and ATI), and some even focused on the best 2D image quality and 2D performance (Matrox). Yet all that remained after the dust settled were ATI and nVidia. Back then there was a massive amount of competition, and if you went with one manufacturer you might have been locked out of certain game releases. For instance, if you bought ATI or nVidia, you were locked out of 3dfx GLIDE-only games, and the reverse was true too, as 3dfx didn't support Direct3D on the level most other companies did.


Back in those days, Nvidia was seriously junk. Think back to the Diamond Viper V330 with the Riva 128 chipset. While it was raved about as being very fast, that was because Nvidia cut corners on image quality. Ever try playing a game with that chipset? Wheels were blocks. I bought that card, played five minutes of Interstate '76, and took it back. The TNT was the first true chip to come from Nvidia in my eyes (I picked up a TNT-based Diamond Viper V550). Before that they were garbage, IMO.

You also forgot Rendition. They were a contender in the lower end of the market. I bought a Diamond Stealth S220, which got me Monster3D-like performance from a single 2D/3D card with the V2100 chipset. I believe they also had their own API that went nowhere fast; Redline, I believe it was called?


"A politician stumbles over himself... Then they pick it out. They edit it. He runs the clip, and then he makes a funny face, and the whole audience has a Pavlovian response." -- Joe Scarborough on John Stewart over Jim Cramer

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki