

Larry Seiler and Stephen Junkins Speak at Intel Larrabee Brief  (Source: News.com)
Intel will begin sampling Larrabee in 2008 with products on the market in 2009 or 2010

Today there are three main players producing graphics hardware -- Intel, AMD, and NVIDIA. As the market stands right now, only AMD and NVIDIA manufacture discrete graphics cards, with Intel sticking exclusively to the integrated graphics found on the vast majority of low- and mid-range notebook and desktop computers.

Intel is looking to change that and will bring its own discrete products to market. Intel's discrete graphics cards will use the Larrabee architecture and, according to eWeek, won't be available until 2009 or 2010, though eWeek does say that Intel will begin sampling Larrabee in 2008.

Intel has begun talking about the Larrabee architecture and, naturally, it feels that Larrabee is the best architecture out there. What makes Intel so enthusiastic is that the Larrabee core is based on the Pentium CPU and uses x86 cores. The use of x86 cores means that programmers and game developers can use familiar programming languages -- like C and C++ -- that have been in use for years, rather than having to learn a new programming model like NVIDIA's CUDA.
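To make that point concrete, below is a minimal sketch (not Intel's code, just an illustration) of the kind of ordinary C++ a developer might write for a many-core x86 chip: standard-library threads splitting a data-parallel loop across cores. The shade() workload and the core count are assumptions made up for the example; nothing in it is Larrabee-specific.

    // Minimal sketch: a data-parallel loop written with plain C++ threads.
    // The per-element shade() function and num_cores are illustrative
    // assumptions; the point is only that this is ordinary C++, not a
    // vendor-specific language.
    #include <algorithm>
    #include <cstddef>
    #include <thread>
    #include <vector>

    static float shade(float x) { return x * 0.5f + 1.0f; }  // stand-in workload

    void parallel_shade(std::vector<float>& data, unsigned num_cores) {
        std::vector<std::thread> workers;
        const std::size_t chunk = (data.size() + num_cores - 1) / num_cores;
        for (unsigned c = 0; c < num_cores; ++c) {
            workers.emplace_back([&data, c, chunk] {
                const std::size_t begin = c * chunk;
                const std::size_t end = std::min(begin + chunk, data.size());
                for (std::size_t i = begin; i < end; ++i)
                    data[i] = shade(data[i]);   // each core works on one slice
            });
        }
        for (auto& t : workers) t.join();       // wait for all cores to finish
    }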

Intel says that Larrabee is a many-core processor, and eWeek reports that it will likely contain ten or more individual x86 processor cores inside the silicon package. Discrete graphics cards using the Larrabee architecture will initially be aimed at the gaming market, which means Intel is directly targeting AMD and NVIDIA with Larrabee.

Intel says Larrabee will support both the DirectX and OpenGL APIs, and it is encouraging developers to design new, graphics-intensive applications for the architecture. Larrabee is also meant to usher in a new era of parallel computing, with developers able to write applications for it using the C and C++ programming languages.

Intel has combined the programmability of a CPU with the parallel throughput of a GPU. Intel says that Larrabee will also contain vector-processing units to enhance the performance of graphics and video applications. The x86 cores feature short instruction pipelines, and each core can run four execution threads, each with its own register set to help hide memory latency. The short instruction pipeline also allows fast access to each core's L1 cache.
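As a rough illustration of what a vector-processing unit buys you, the short snippet below uses today's 4-wide SSE intrinsics as a stand-in; the article does not specify Larrabee's actual vector width or instruction set, so the only point being demonstrated is that a single vector instruction operates on several values at once.

    // Sketch of vector processing using 4-wide SSE intrinsics as a stand-in
    // (Larrabee's real vector ISA and width are not given in the article).
    #include <xmmintrin.h>   // SSE: __m128, _mm_loadu_ps, _mm_mul_ps, ...

    // Scales n floats by a factor, 4 at a time. For brevity, n is assumed
    // to be a multiple of 4.
    void scale(float* data, int n, float factor) {
        const __m128 f = _mm_set1_ps(factor);
        for (int i = 0; i < n; i += 4) {
            __m128 v = _mm_loadu_ps(data + i);   // load 4 floats
            v = _mm_mul_ps(v, f);                // 4 multiplies in 1 instruction
            _mm_storeu_ps(data + i, v);          // store 4 floats
        }
    }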

Intel says that all of the cores on Larrabee will share access to a large L2 cache, with a partition allocated to each core. This arrangement lets Larrabee maintain an efficient in-order pipeline while still gaining some of the benefits of an out-of-order processor for parallel applications. Communication between the Larrabee cores is handled by what Intel calls a bidirectional ring network.

Larry Seiler from Intel says, "What the graphics and general data parallel application market needs is an architecture that provides the full programming abilities of a CPU, the full capabilities of a CPU together with the parallelism that is inherent in graphics processors. Larrabee provides [that] and it's a practical solution to the limitations of current graphics processors."

According to News.com, one Intel slide shows that Larrabee's performance scales linearly, with four cores offering twice the performance of two. News.com also reports that core counts for Larrabee will range from 8 to 48 -- the exact core count for shipping parts is unknown at this time.
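For a back-of-the-envelope feel of what linear scaling means -- and how quickly it bends away from linear if any part of the work stays serial -- the toy calculation below compares ideal linear speedup against Amdahl's law. The 1% serial fraction is an assumption chosen purely for illustration, not a figure from Intel or News.com.

    // Toy comparison of ideal linear scaling vs. Amdahl's law.
    // The 1% serial fraction is an assumed figure for illustration only.
    #include <cstdio>

    double amdahl_speedup(int cores, double serial_fraction) {
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores);
    }

    int main() {
        for (int cores : {2, 4, 8, 16, 32, 48}) {
            std::printf("%2d cores: ideal %2dx, with 1%% serial work %.1fx\n",
                        cores, cores, amdahl_speedup(cores, 0.01));
        }
        return 0;
    }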



Comments



By vignyan on 8/6/2008 1:34:28 AM, Rating: 2
At first, when I started reading your post, I thought you just liked being pessimistic... but I soon realized you were simply negative! :(

While what you said about Intel's past failures seems true, I want to bring up these points...

Itanium 64: What did you expect? That Intel is a supreme power that can see the future and tell that this will be a success? All they did was try to optimize a processor specifically for server purposes. Of course it's not going to be x86, and hence it's an underdog in a market where Sun and IBM were quite the kings. But it still kicked a** against the existing servers in terms of performance (still not up to Sun and IBM, but that's also expected to go Intel's way!). So hold on to your comments about the Itanium market that you hardly know! :P

Timna, as you mention, was a failure for multiple reasons. The ones you mentioned are only the negative side of it. The bigger reason is that it tied the processor (on a ~4-year design cycle) to a specific memory technology, so you would have to change your processor just to keep up with memory progression. Take AMD for example: how long after Intel supported DDR2 did AMD come up with a DDR2-compatible processor? Wait... I think more than 12 months. That's a $40bn hole in Intel's pocket if they had gone with Timna and developed its successors. Again, it's not possible to correctly predict DRAM technology 3-4 years in advance!

The "infamous" Pentium 4: Lets get one thing very very clear... P4 was a humongous success.. It shipped over 500mn units during its course. And the thing you speak of P3 at 1.8G beating p4 at 3.2G, well its only couple of benchmarks... Most of them, P4 was much better as compared to p3.. Well Intel did admit that it was wrong to go behind the clock. FYI, p4 did run at 10G... well with liquid Nitro cooled.. check on youtube for these OC videos... As a OC fan, you should have seen this! (i guess!)

The 64 bits: Do you really think Intel is such a huge company that it can control all the software developers in the WORLD??? My god... 64-bit applications are not new to server platforms, nor is a 64-bit processor -- Sun has had one for a loooong time... Anyway, it was a marketing gimmick played by AMD, and most people fell for it. If my memory isn't rusted, they introduced their 64-bit processor way back in 2003, and they launched it with no support from any of the software vendors. Back then, too, AMD said it would be a future investment. Five years later, 64-bit operating systems are only now coming into the limelight for homes and small offices. So basically AMD tricked everyone... And FYI, a 64-bit OS requires a minimum of 4GB of RAM to perform well. Another FYI, Intel still rocks in most of the 64-bit benchmarks... gosh, a lot more to type... but please do your research. I know I might sound like I'm thrashing you and you'll want to curse me and all... but you really have to let go and see the fair side of Intel too...

And about the GFLOP comparison of Larrabee, someone already corrected you. And about the 80-core chip, even that was a very different architecture (yes, Intel has lots of money! :))... It was a concept to prove that vector processing is possible and to test different communication topologies between cores -- things future processors will benefit from.

And as for your closing statement, well, that's innovation for you... there is a very famous saying: "Smart people don't do special things. They do things specially."... The 32-bit processor was invented 20 years ago, but no other company could come up with a processor like the C2D!! Doesn't that strike you?? :O ... Another FYI, Intel EM64T has been around since AMD64 launched... so almost all the mainstream processors from both companies were capable of handling 64-bit applications.

And don't be so hasty in concluding that Intel just "put together some 32/48 CPUs" and called it a GPU... A lot still depends on the compiler, and on software fixed-function models... This is a good idea (not for you, probably), because Intel does not have to launch two different platforms like AMD/ATI or NVIDIA do -- one for graphics and another for high-performance computing... This will tend to be a one-stop solution for all supercomputing... check out some of the advantages before you shoot it down!

Peace man! >:D


"Death Is Very Likely The Single Best Invention Of Life" -- Steve Jobs

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki