


CPU and GPU all in one to deliver the best performance-per-watt-per-dollar

AMD today, during its analyst day conference call, unveiled more details of its next-generation Fusion CPU-GPU hybrid. Fusion was first mentioned shortly after AMD completed its acquisition of ATI Technologies a few months ago. AMD is expected to debut its first Fusion processor in the late 2008 to early 2009 timeframe.

AMD claims: “Fusion-based processors will be designed to provide step-function increases in performance-per-watt-per-dollar over today’s CPU-only architectures, and provide the best customer experience in a world increasingly reliant upon 3D graphics, digital media and high performance computing.”
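
Read literally, that metric divides delivered performance by both power draw and price, so an integrated part can come out ahead even with lower raw throughput. A minimal sketch of the arithmetic, using entirely hypothetical numbers (none of these figures come from AMD):

/* Hypothetical illustration of "performance-per-watt-per-dollar".
 * All numbers below are made up; they only show how the metric
 * rewards gains on any of the three axes. */
#include <stdio.h>

static double perf_per_watt_per_dollar(double perf, double watts, double dollars)
{
    return perf / (watts * dollars);
}

int main(void)
{
    /* Assumed figures: a discrete CPU + GPU pair vs. a single hybrid part. */
    double discrete = perf_per_watt_per_dollar(100.0, 150.0, 400.0);
    double hybrid   = perf_per_watt_per_dollar(90.0, 100.0, 250.0);

    printf("discrete: %.6f\n", discrete);  /* ~0.001667 */
    printf("hybrid:   %.6f\n", hybrid);    /* ~0.003600 */
    return 0;
}

On these assumed inputs the hybrid part scores higher despite 10% lower raw performance, which is the shape of the claim AMD is making.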

According to early diagrams of AMD’s Fusion architecture, the CPU and GPU appear to be separate cores on a single die. The CPU will have access to its own cache, while the GPU will have access to its own buffers. A crossbar and an integrated memory controller join the two, with everything connected via HyperTransport links. From there, the Fusion processor will have direct access to system memory, which appears to be shared between the CPU and GPU; the graphics core does not appear to get a dedicated frame buffer.
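
One practical implication of that layout, if the diagrams are accurate, is that CPU and GPU work can touch the same system RAM without an explicit copy across an external bus. The sketch below is purely illustrative: gpu_shade is a stand-in function, not any real GPU API, and it models only the shared-allocation idea.

/* Sketch of the shared-memory idea described above: with no dedicated
 * frame buffer, CPU and "GPU" work operate on one allocation in system
 * RAM instead of uploading data across a bus. Illustrative only. */
#include <stdlib.h>
#include <stdio.h>

#define PIXELS (640 * 480)

static void cpu_generate_scene(unsigned *fb, size_t n)
{
    for (size_t i = 0; i < n; i++)
        fb[i] = (unsigned)i;          /* CPU writes directly to system RAM */
}

static void gpu_shade(unsigned *fb, size_t n)   /* hypothetical stand-in */
{
    for (size_t i = 0; i < n; i++)
        fb[i] |= 0xFF000000u;         /* "GPU" reads/writes the same memory */
}

int main(void)
{
    unsigned *shared = malloc(PIXELS * sizeof *shared);
    if (!shared) return 1;

    cpu_generate_scene(shared, PIXELS);  /* no upload step: one address space */
    gpu_shade(shared, PIXELS);

    printf("first pixel: 0x%08X\n", shared[0]);
    free(shared);
    return 0;
}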

While Fusion is a hybrid CPU-GPU architecture, AMD will continue to produce discrete graphics solutions. AMD still believes there is a need for discrete graphics cards for high-end users and physics processing.

Also mentioned during the conference call was AMD’s new branding scheme for ATI products. Under the new scheme, chipsets for Intel processors and graphics cards will continue under the ATI brand name, while ATI-designed chipsets for AMD platforms will be branded under AMD, as previously reported.



Comments



By crazydrummer4562 on 11/17/2006 10:08:11 PM , Rating: 2
It would actually subtract a lot of performance, because both cards would need to have the same pixel pipeline count, so the X1950 would be reduced to the same performance as the X300, rendering that completely pointless and a monumental waste of money.


By Trisped on 11/21/2006 3:42:10 PM , Rating: 2
No, it would depend on the cards’ CrossFire compatibility and the rendering mode employed by the software in question.

Remember, the first CrossFire cards were not clocked the same as their companion cards, so ATI launched with a way of dividing the work between the two cards based on how much each could handle (the general idea is sketched at the end of this comment).

Still, I think the differences between the X1300, X1600, and X1900 cards are enough to make them incompatible in CrossFire. As a result, if they are sticking an X300-class core on the CPU, you will probably have to get a matching X300-class card if you are going to run CrossFire. I doubt anyone would do that, though, as these are meant to replace integrated graphics. I am sure that for the extra cost added by a Fusion chip plus the cost of an X300-series video card, you could buy an add-in card that was more powerful and more useful.
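
The work-division Trisped recalls can be sketched in the abstract: split each frame between two unequal cards in proportion to their throughput. The relative-speed figures below are hypothetical, and this shows only the general idea, not ATI’s actual CrossFire scheme.

/* Sketch of proportional load balancing between two unequal cards:
 * each card renders a band of scanlines sized to its relative speed.
 * The speed ratios are assumed, not measured. */
#include <stdio.h>

int main(void)
{
    const int frame_height = 1024;

    /* Assumed relative fill rates for the two cards. */
    double fast_card = 4.0;   /* e.g. an X1950-class part */
    double slow_card = 1.0;   /* e.g. an X300-class part  */

    int fast_rows = (int)(frame_height * fast_card / (fast_card + slow_card));
    int slow_rows = frame_height - fast_rows;

    printf("fast card renders rows 0-%d (%d rows)\n", fast_rows - 1, fast_rows);
    printf("slow card renders rows %d-%d (%d rows)\n",
           fast_rows, frame_height - 1, slow_rows);
    return 0;
}

With a 4:1 split, the fast card takes 819 of 1024 rows and the slow card 205, so neither sits idle waiting for the other; reducing both cards to the slower card’s capability, as the parent comment describes, is the alternative when no such balancing is available.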


"If they're going to pirate somebody, we want it to be us rather than somebody else." -- Microsoft Business Group President Jeff Raikes

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki