
CPU and GPU all in one to deliver the best performance-per-watt-per-dollar

AMD today, during its analyst day conference call, unveiled more details of its next-generation Fusion CPU-GPU hybrid. Fusion was first mentioned shortly after AMD’s acquisition of ATI Technologies was completed a few months ago. AMD is expected to debut its first Fusion processor in the late 2008 to early 2009 timeframe.

AMD claims: “Fusion-based processors will be designed to provide step-function increases in performance-per-watt-per-dollar over today’s CPU-only architectures, and provide the best customer experience in a world increasingly reliant upon 3D graphics, digital media and high performance computing.”

The GPU and CPU appear to be separate cores on a single die, according to early diagrams of AMD’s Fusion architecture. The CPU cores will have their own cache, while the GPU will have its own buffers. A crossbar and integrated memory controller join the CPU and GPU, with everything connected via HyperTransport links. From there, the Fusion processor will have direct access to system memory, which appears to be shared between the CPU and GPU. It doesn’t appear the graphics functionality will have its own frame buffer.

While Fusion is a hybrid CPU and GPU architecture, AMD will continue to produce discrete graphics solutions. AMD still believes there’s a need for discrete graphics cards for high-end users and physics processing.

Also mentioned during the conference call was AMD’s new branding scheme for ATI products. Under the new scheme, chipsets for Intel processors and graphics cards will continue under the ATI brand name. ATI-designed chipsets for AMD platforms will be branded under AMD, as previously reported.

Comments

for vista
By shamgar03 on 11/17/2006 10:29:43 AM , Rating: 1
I hope that these processors will allow Vista to run with nice desktop effects without adverse effects on game performance. Even though it's not technically rendering graphics while you're in a full-screen game, there is probably still some 3D rendering overhead associated with the extra effects. This could mean no more need for chipset-based video solutions. First the memory controller, now this. Eventually we won't even need motherboards....

RE: for vista
By Lazarus Dark on 11/17/2006 11:27:30 AM , Rating: 2
Exactly! C'mon people, this is for Vista! This is great for non-gamers: Vista can run with full spiffy graphics for cheaper. It will also help in making HTPCs and other small form factor and portable solutions like UMPCs and laptops smaller, cheaper, cooler, and less power-hungry.

But gamers: why do you need to turn it off? Just buy a second monitor for your discrete graphics and use the Fusion for the desktop on the other monitor. Everyone's different I suppose, but one monitor never seems like enough to me. (Personally I prefer 3 or 4 monitors, maybe moving to 5 or 6 in a couple years; but I'm probably an extreme case.)
