[Photo: Larry Seiler and Stephen Junkins speak at an Intel Larrabee briefing (Source: News.com)]
Intel will begin sampling Larrabee in 2008 with products on market in 2009 or 2010

Today there are three main players producing graphics hardware -- Intel, AMD, and NVIDIA. As the market stands, only AMD and NVIDIA manufacture discrete graphics cards; Intel sticks exclusively to the integrated graphics found on the vast majority of low- and mid-range notebook and desktop computers.

Intel is looking to change that and will bring its own discrete products to market at some point. Intel's discrete graphics cards will use the Larrabee architecture, and according to eWeek, cards based on Larrabee won't be available until 2009 or 2010. eWeek does say that Intel will begin sampling Larrabee in 2008.

Intel has begun talking about the Larrabee architecture and, naturally, it feels that Larrabee is the best architecture out there. What makes Intel so enthused is that the Larrabee core is based on the Pentium CPU and uses x86 cores. The use of x86 cores means that programmers and game developers can use the familiar programming languages -- like C and C++ -- that have been in use for years, rather than having to learn a new programming model like NVIDIA's CUDA.
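
To make that contrast concrete, here is a minimal sketch -- purely illustrative, and not Intel's actual Larrabee toolchain or API -- of the kind of plain C++ the article means: an ordinary function that any x86 compiler already understands, rather than a kernel written in a vendor-specific programming model.

    // Illustrative only -- plain C++, no Larrabee-specific API assumed.
    #include <cstdint>
    #include <vector>

    // Darken every pixel of an 8-bit grayscale image by 50%.
    void darken(std::vector<std::uint8_t>& pixels) {
        for (std::uint8_t& p : pixels) {
            p = static_cast<std::uint8_t>(p / 2);
        }
    }

The pitch is that code like this could, in principle, be recompiled for Larrabee's x86 cores without being rewritten in a new language.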

Intel says that Larrabee is a many-core processor, and eWeek reports that it will likely contain ten or more individual x86 processor cores inside the silicon package. Discrete graphics cards using the Larrabee architecture will initially be aimed at the gaming market, which means Intel is directly targeting AMD and NVIDIA.

Intel says Larrabee will support both the DirectX and OpenGL APIs, and it is encouraging developers to design new, graphics-intensive applications for the architecture. Larrabee is also meant to usher in a new era of parallel computing, with developers able to write applications for it in C and C++.

Intel has combined the programmability of a CPU with the parallel throughput of a GPU. Intel says that Larrabee will also contain vector-processing units to enhance the performance of graphics and video applications. The x86 cores feature short instruction pipelines, and each core supports four execution threads, each with its own register set. The short pipeline also allows faster access to the L1 cache in each core.
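
As a rough sketch of what vector-friendly C++ looks like, the loop below processes data in 16-element chunks -- 16 being the single-precision vector width attributed to Larrabee in the comments further down, which is an assumption here rather than a confirmed specification. A vectorizing compiler could map each inner loop onto a single vector instruction.

    #include <cstddef>

    // SAXPY (y = a*x + y), structured in 16-float chunks to match the
    // assumed vector width; a vectorizing compiler can fuse each inner
    // loop into one 16-wide vector operation.
    void saxpy(float a, const float* x, float* y, std::size_t n) {
        std::size_t i = 0;
        for (; i + 16 <= n; i += 16) {
            for (std::size_t lane = 0; lane < 16; ++lane) {
                y[i + lane] = a * x[i + lane] + y[i + lane];
            }
        }
        for (; i < n; ++i) {
            y[i] = a * x[i] + y[i];  // scalar tail when n is not a multiple of 16
        }
    }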

Intel says that all of the cores on Larrabee will share access to a large L2 cache, with a partition of the cache assigned to each core. This arrangement lets Larrabee keep an efficient in-order pipeline while still gaining some of the benefits of an out-of-order processor for parallel applications. Communication between the Larrabee cores is handled by what Intel calls a bidirectional ring network.

Larry Seiler from Intel says, "What the graphics and general data parallel application market needs is an architecture that provides the full programming abilities of a CPU, the full capabilities of a CPU together with the parallelism that is inherent in graphics processors. Larrabee provides [that] and it's a practical solution to the limitations of current graphics processors."

According to News.com, one Intel slide shows the performance of the Larrabee architecture scaling linearly, with four cores offering twice the performance of two. News.com also reports that core counts for Larrabee will range from 8 to 48; the exact core count is unknown at this time.




Countdown...
By zsdersw on 8/4/2008 12:58:47 PM , Rating: 1
.. until the usual parade of responses begins. They will include comments very similar to:

"Intel sucks at graphics.. why would Larrabee be any different?"

"Not gonna be competitive with ATI/Nvidia"

"Drivers are gonna suck"

They're coming.. just you watch.




RE: Countdown...
By ajvitaly on 8/4/2008 1:03:34 PM , Rating: 1
Those comments are coming. But you forgot one other important comment that I will personally be saying from now till release:

it'll be too complicated a platform to squeeze performance out of, a la the PS3. It'll probably be a great product that developers will loathe.


RE: Countdown...
By masher2 (blog) on 8/4/2008 1:17:28 PM , Rating: 2
The Cell (PS3) was SIMD. Larrabee will be MIMD...much more like traditional programming (and the Xbox 360, for that matter).


RE: Countdown...
By FITCamaro on 8/4/2008 1:27:58 PM , Rating: 2
Not only that, but it's not like anything special has to be done to program for it. You can use DX or OGL for graphics, or write any other kind of code (graphics or not) in basic C/C++, same as you can for your system processor. You don't really have to learn anything new for this thing, other than perhaps multi-threaded coding techniques.
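
For what it's worth, the "multi-threaded coding techniques" mentioned above are just standard C++ practice. A minimal sketch -- illustrative only, written with std::thread rather than any Larrabee-specific API:

    #include <cstddef>
    #include <thread>
    #include <vector>

    // Per-thread work: scale data[begin, end) by k.
    void scale_range(float* data, std::size_t begin, std::size_t end, float k) {
        for (std::size_t i = begin; i < end; ++i) data[i] *= k;
    }

    // Split the array across nthreads ordinary threads (nthreads >= 1).
    void scale_parallel(std::vector<float>& data, float k, unsigned nthreads) {
        std::vector<std::thread> pool;
        const std::size_t chunk = data.size() / nthreads;
        for (unsigned t = 0; t < nthreads; ++t) {
            const std::size_t begin = t * chunk;
            const std::size_t end =
                (t + 1 == nthreads) ? data.size() : begin + chunk;
            pool.emplace_back(scale_range, data.data(), begin, end, k);
        }
        for (std::thread& th : pool) th.join();  // wait for all workers
    }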


RE: Countdown...
By Some1ne on 8/4/2008 6:05:17 PM , Rating: 5
quote:
"Intel sucks at graphics.. why would Larrabee be any different?"


They currently do. Why would it? How is speculating that Larrabee will be different any better than speculating that it won't be different? At least the people speculating that it won't be different have historical precedent on their side.

quote:
"Not gonna be competitive with ATI/Nvidia"


If you bother to run the numbers, there's justification for that one as well. Each core in the Larrabee architecture can sustain a maximum of 16 single-precision FLOPs per clock. The clock rate is expected to be about 2 GHz, and each card is expected to support in the neighborhood of 32 cores. That gives a maximum theoretical throughput of about 1 TFLOPS. Both AMD and NVIDIA can already hit that mark with their current high-end cards, and by the 2009/2010 timeframe they will probably have at least doubled it. So there's more reason to think that Larrabee won't be competitive than there is to think that it will be.
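
Spelling out the arithmetic behind that estimate (the core count, per-core throughput, and clock are the comment's assumptions, not confirmed specifications):

    #include <cstdio>

    int main() {
        const double cores = 32.0;           // assumed core count
        const double flops_per_clock = 16.0; // single-precision, per core
        const double clock_hz = 2.0e9;       // assumed 2 GHz clock
        const double peak = cores * flops_per_clock * clock_hz;
        std::printf("Theoretical peak: %.3f TFLOPS\n", peak / 1.0e12);
        // Prints: Theoretical peak: 1.024 TFLOPS
        return 0;
    }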

quote:
"Drivers are gonna suck"


They very well might, as they usually do when most new products are released. Do you have any evidence to suggest that they won't suck?

In other words, by implying that such comments are invalid or false, you've done nothing different from the people who assert that they are true. Speculation cuts both ways.


RE: Countdown...
By zsdersw on 8/4/2008 9:15:17 PM , Rating: 2
quote:
They currently do. Why would it? How is speculating that Larrabee will be different any better than speculating that it won't be different? At least the people speculating that it won't be different have historical precedent on their side.


Historical precedent is useful until it isn't. I'm not speculating on Larrabee, I'm criticizing the typical speculation that is likely to occur.

quote:
So there's more reason to think that it won't be competitive than there is to think that it will be.


"Competitive" is less a measure of whose got the most TFLOPS than it is who has the most TFLOPS [i]at the least cost[/i].

quote:
Do you have any evidence to suggest that they won't suck?


The people who currently write Intel's IGP drivers are not the people working on Larrabee's drivers.


"I'm an Internet expert too. It's all right to wire the industrial zone only, but there are many problems if other regions of the North are wired." -- North Korean Supreme Commander Kim Jong-il

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki