
CPU and GPU all in one to deliver the best performance-per-watt-per-dollar

AMD today, during its analyst day conference call, unveiled more details of its next-generation Fusion CPU and GPU hybrid. Mentions of Fusion first appeared shortly after AMD’s acquisition of ATI Technologies was completed a few months ago. AMD is expected to debut its first Fusion processor in the late-2008 to early-2009 timeframe.

AMD claims: “Fusion-based processors will be designed to provide step-function increases in performance-per-watt-per-dollar over today’s CPU-only architectures, and provide the best customer experience in a world increasingly reliant upon 3D graphics, digital media and high performance computing.”

Early diagrams of AMD’s Fusion architecture show the GPU and CPU as separate cores on a single die. The CPU retains its own cache, while the GPU has its own buffers. Joining the two are a crossbar and an integrated memory controller, with HyperTransport links connecting everything. From there, the Fusion processor will have direct access to system memory, which appears to be shared between the CPU and GPU; the graphics portion does not appear to get a dedicated frame buffer of its own.
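To make the shared-memory point concrete, here is a minimal conceptual sketch in plain C. The gpu_copy_to_device() and gpu_submit() functions are no-op stand-ins invented for illustration, not any real AMD or driver API; the sketch only contrasts the staging copy a discrete card with its own frame buffer needs against a design where the CPU and GPU address the same system memory.

#include <stdlib.h>
#include <string.h>

#define N (1u << 20)                      /* one million floats */

/* Stand-in for a DMA transfer over the bus into a discrete card's local RAM. */
static void gpu_copy_to_device(float *dev, const float *host, size_t bytes)
{
    memcpy(dev, host, bytes);             /* placeholder for the bus transfer */
}

/* Stand-in for launching GPU work on whatever buffer it is handed. */
static void gpu_submit(float *buf)
{
    (void)buf;                            /* no real work in this sketch */
}

int main(void)
{
    float *host = calloc(N, sizeof *host);

    /* Discrete-card path: stage the data in the card's own frame buffer first. */
    float *dev = malloc(N * sizeof *dev); /* stands in for on-card memory */
    gpu_copy_to_device(dev, host, N * sizeof *host);
    gpu_submit(dev);

    /* Fusion-style path: with system memory shared through the integrated
       memory controller, the GPU can be pointed at the same allocation and
       the staging copy above disappears. */
    gpu_submit(host);

    free(dev);
    free(host);
    return 0;
}

The only point of the sketch is that the second path skips the bus transfer entirely; how Fusion’s drivers and memory controller will actually manage this has not been disclosed.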

While Fusion is a hybrid CPU and GPU architecture, AMD will continue to produce discrete graphics solutions. AMD still believes there’s a need for discrete graphics cards for high end users and physics processing.

Also mentioned during the conference call was AMD’s new branding scheme for ATI products. Under the new scheme, chipsets for Intel processors and graphics cards will continue to carry the ATI brand name, while ATI-designed chipsets for AMD platforms will be branded as AMD, as previously reported.



Comments



By Connoisseur on 11/17/2006 10:34:15 AM , Rating: 2
Because they're essentially combining the CPU and GPU, it'll no longer be possible to buy one and upgrade the other later. Granted, from what the article says, there will still be discrete high-end solutions, but essentially it looks like they're turning the PC into a console.




By dagamer34 on 11/17/2006 11:11:19 AM , Rating: 2
"Fusion" is meant for laptops, not desktops. Notice in the article that they said that they would still be making discreet graphics cards. The primary advantage of a "Fusion" CPU would be lower power consumption and increased performance, both of which don't carry over as well when talking about a desktop instead of a desktop.


By othercents on 11/17/2006 11:44:13 AM , Rating: 5
Actually "fusion" could be both laptop and desktop, but not for high end systems. About 80% of the desktop market is low end system with integrated graphics. AMD is trying to build those systems for cheaper and have a lower power utilization. These systems would be good for Home Entertainment boxes that could be left on all the time to run as DVRs and such.

Other


By Targon on 11/17/2006 2:20:36 PM , Rating: 3
Even for high-end systems, with Crossfire we could see even a low-end graphics processor (as built into the Fusion) boost the graphics performance of a system.

In theory, Crossfire should allow a Radeon X1950 and X300 to work together. Obviously the X300 won't add all that much compared to the X1950, but it should help a little. Picture a system with two high end video cards with some extra graphics processing provided by the on-die GPU.


By Pirks on 11/17/2006 3:06:39 PM , Rating: 3
i second that - as long as the fusion on-die gpu is generic enough, has no fixed pipeline, is DX10 only and can be employed for vector/parallel calculation code like G80 today - this is some unduckingbelievable speedup for the cpus. physics simulation on integrated gpu shaders? that's just the beginning. now we have two/four generic cpu cores - but then with 45nm we'll have some superduper version of Cell with more than just 7 puny SPEs - that's gonna be some serious stuff - Folding@home and RC5 will suddenly have nothing to do after a couple of years - 'coz the speedup in vector fp code will be _orders_ of magnitude.

right, so this will be a specialized-cores scheme, not a bunch of generic cores like now, but that's actually good - see how nicely stuff works today when we have specialized gpus and generic cpus working together? the cell idea is ripe and ready, sony is already outclassed on graphics by G80, and with fusion sony's cell will be nothing at all - 7 puny SPEs versus an ATI integrated DX10 massively parallel shader engine? ha ha ha ha [demonic laughter here]
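For readers wondering what kind of code the comment above has in mind, the loop below is a minimal example of data-parallel vector work (the classic SAXPY kernel, written here in generic C rather than anything AMD- or G80-specific). Every iteration is independent of the others, which is exactly what lets a wide shader array process the elements concurrently instead of one at a time on a CPU core.

#include <stddef.h>

/* y[i] = a * x[i] + y[i] -- the classic SAXPY kernel */
void saxpy(size_t n, float a, const float *x, float *y)
{
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];           /* no dependence between iterations */
}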


By crazydrummer4562 on 11/17/2006 10:08:11 PM , Rating: 2
It would actually subtract a lot of performance, because both cards need to have the same pixel pipeline count, so the X1950 would be reduced to the same performance as the X300...rendering that completely pointless and a monumental waste of money.


By Trisped on 11/21/2006 3:42:10 PM , Rating: 2
No, it would depend on the cards' Crossfire compatibility and the rendering mode employed by the software in question.

Remember, the first Crossfire cards were not clocked the same as their companion cards, so ATI launched with a way of dividing the work up between the two cards based on how much could be done by each.

Still, I think the differences between the X1300, X1600, and X1900 cards are enough to make them incompatible in Crossfire. As a result, if they are sticking an X300-class core on the CPU, then you will probably have to get a matching X300-class card if you are going to run Crossfire. I doubt anyone would do that though, as these are meant to replace integrated graphics. I am sure that for the extra cost added by a Fusion chip plus the cost of a 300-series video card, you could buy an add-in card that was more powerful, and more useful.
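As a rough illustration of the load-split idea Trisped describes, the sketch below divides one frame's scanlines between two mismatched GPUs in proportion to their relative throughput. The throughput figures are invented for the example and do not describe ATI's actual Crossfire balancing.

#include <stdio.h>

int main(void)
{
    const int frame_height = 1080;   /* scanlines in one frame */
    const double fast_card = 8.0;    /* assumed relative throughput, e.g. an X1950 */
    const double slow_card = 1.0;    /* assumed relative throughput, e.g. an on-die GPU */

    /* Split the frame in proportion to throughput instead of a fixed 50/50. */
    int fast_lines = (int)(frame_height * fast_card / (fast_card + slow_card));
    int slow_lines = frame_height - fast_lines;

    printf("fast card: %d lines, slow card: %d lines\n", fast_lines, slow_lines);
    return 0;
}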


By OrSin on 11/17/2006 11:45:25 AM , Rating: 2
Actually, it's meant for both. It will be used more in business-class desktops than enthusiast-class motherboards. But my guess is AMD will only make one chipset for both business and games. Then you can decide if you want a 4-core CPU or a 2-core CPU/GPU. Really, by then physics might be big, and even gamers might choose the 2-core CPU/GPU over a 4-core system. AMD will already have DDR3 in systems by 2008, so maybe the system memory will be fast enough for the GPU (not gaming speed, but physics or desktop 3D).


By Pirks on 11/17/2006 3:10:15 PM , Rating: 2
no, they won't make the graphics core a generic x86 one - that would kill the idea. forget about switching between 2/4 cores - it doesn't matter anyway - just run specialized physics or whatever code on the integrated gpu, like it's done on G80, and get your 10x/100x speedup there. with a discrete graphics card you can do this too - run AI on half of the gpu shaders and physics on the other half while the cpu sits there doing nothing :)


By Kuroyama on 11/17/2006 11:57:19 AM , Rating: 4
quote:
it'll no longer be possible to buy one and upgrade the other later.


Why do you say that? I can buy a motherboard with integrated graphics and still upgrade with a discrete graphics card later. I would imagine that this'll replace integrated graphics chips, but you'll probably still be able to add in a card later.


By MDme on 11/17/2006 4:44:08 PM , Rating: 2
in the event they change graphics slots in the future (e.g. AGP -> PCIe), it will also give you the ability to, let's say...

upgrade your system by changing the mobo only (and keep the CPU) while you wait for a new video card for the new graphics slot. this is a situation a lot of people stuck with high-end AGP graphics are in because of the aggregate cost of a CPU/video card/mobo/memory upgrade.


"A lot of people pay zero for the cellphone ... That's what it's worth." -- Apple Chief Operating Officer Timothy Cook

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki