Before this happened a third player, Intel, had attacked the
market and quietly built up a dominant market share in low-end
graphics. Its integrated GPUs popped up in netbooks, low-end laptops,
and even a number of low-end desktops. Now the graphics market
seems poised to flip yet again. NVIDIA is on the verge of
seizing much of Intel's low-end and mobile market share, thanks to its
Ion platform and new mobile GPUs. And AMD (ATI) looks poised to
attack in the latest round of the discrete graphics wars with
competitively priced DirectX 11 GPUs. That leaves Intel, which is
preparing to take on NVIDIA and AMD with a brand new discrete
graphics offering called Larrabee.
First revealed at SIGGRAPH in August of 2008, Larrabee is now reality. Intel
has just shown off working models of Larrabee GPUs at its Intel
Developer Forum in San Francisco.
Built on a multicore die shrink of Intel's Pentium (P54C) architecture, the powerful new graphics chip was able to render a ray-traced scene
from the id Software game Enemy Territory: Quake Wars
with ease. Ray tracing is an advanced technique that has long
been touted as an eventual replacement for rasterization in video
games. Currently it is used for the high-quality 3D animation
found in many films.
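To illustrate the technique, here is a minimal, hypothetical sketch (my own toy code, not Intel's demo) of the core operation in ray tracing: testing whether a ray fired from the camera hits a sphere in the scene. Where rasterization projects triangles onto the screen, ray tracing traces each ray independently, which is why it maps naturally onto many small x86 cores.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t,
    # assuming direction is a unit vector (so the quadratic's a = 1).
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # nearest intersection distance
    return t if t > 0 else None
```

A real ray tracer repeats this test (and shading, shadow, and reflection rays) for millions of rays per frame, which is what makes it so computationally demanding.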
The demo also gave a peek at Gulftown,
a six-core Core i9 processor. Built on the upcoming Westmere
architecture (a Nehalem die shrink), Gulftown is the "extreme edition"
counterpart of Clarkdale (desktop) and Arrandale (mobile).
Larrabee, however, received the most
interest. The working chip was said to be on par with NVIDIA's
GTX 285. With AMD's latest offerings trouncing the GTX 285 in
benchmarks, the real question will be the power envelope, how many
Larrabees Intel can squeeze on a graphics board, what kind of
Crossfire/SLI-esque scheme it can devise, and, most importantly, the price.
A direct comparison between NVIDIA's GTX 285 and
Larrabee is somewhat misleading, though, because Larrabee
is unique in several ways. First, Larrabee supports x86
instructions. Second, it uses tile-based rendering, accomplishing
tasks like z-buffering, clipping, and blending in software that its
competitors do in dedicated hardware (Microsoft's Xbox
360 works this way too). Third, all of its cores have cache
coherency. All of these features stack up to (in theory) make
Larrabee easier to program games for than NVIDIA's and AMD's offerings.
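The tile-based, software approach described above can be sketched roughly as follows; this is an illustrative toy (the names and structure are my own, not Larrabee's actual pipeline). Each screen tile keeps its own small depth buffer, and the z-test that NVIDIA and AMD perform in fixed-function hardware becomes an ordinary comparison in software:

```python
def render_tile(fragments, tile_w, tile_h):
    # fragments: (x, y, depth, color) tuples falling within one screen tile.
    # Keeping the tile small lets its buffers fit in a core's cache.
    zbuf = [[float("inf")] * tile_w for _ in range(tile_h)]
    color = [[None] * tile_w for _ in range(tile_h)]
    for x, y, depth, col in fragments:
        if depth < zbuf[y][x]:  # z-test in software: keep the nearest fragment
            zbuf[y][x] = depth
            color[y][x] = col
    return color
```

Because the whole loop is plain code running on x86 cores, a developer could, in principle, swap in a different blending or visibility scheme, which is the flexibility argument made for Larrabee.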
The GPU also still has some time to grow
and be fine tuned. It's not expected until the first half of 2010.
Intel is just now lining up board partners for the new card, so it
should be interesting to see which companies jump on the bandwagon.
At IDF, Intel also hinted at the rest of its
upcoming graphics plans. It says that it's preparing to take a
"big jump ahead" in integrated graphics with an upcoming
product to be released next year, which will compete with the NVIDIA
Ion. Intel is also preparing a system-on-a-chip package with
Larrabee and an Intel x86 processor, which it will use to
target the mobile and handheld device market.
It's unclear, though, how low Intel can
push the embedded Larrabee's power envelope or when the SoC
will arrive. Intel did make clear, however, that it is targeting
the chip at "non-PC" applications.
quote: Back in the day (think pre-T&L) there were dozens of GPU manufacturers, all competing for a piece of the exploding pie. Most tried for the performance crown (3dfx), others for the best low-cost solutions (S3), some for maximum API compatibility and image quality (NVIDIA and ATI), and some even focused on the best 2D image quality and 2D performance (Matrox). Yet all that remained after the dust settled was ATI and NVIDIA. Back then, though, there was a massive amount of competition, and going with one manufacturer might have locked you out of certain game releases. For instance, if you bought ATI or NVIDIA, you were locked out of 3dfx GLIDE-only games, or the reverse was true, as 3dfx didn't support Direct3D on the level most other companies did.