AMD Reveals More "Fusion" Details
Anh Tuan Huynh
November 17, 2006 10:01 AM
CPU and GPU all in one to deliver the best performance-per-watt-per-dollar
AMD today, during its analyst day conference call, unveiled more details of "Fusion," its next-generation CPU and GPU hybrid. Early mentions of Fusion first appeared shortly after AMD's acquisition of ATI Technologies was completed a few months ago. AMD is expected to debut its first Fusion processor in the late 2008 to early 2009 timeframe.
AMD claims: "Fusion-based processors will be designed to provide step-function increases in performance-per-watt-per-dollar over today's CPU-only architectures, and provide the best customer experience in a world increasingly reliant upon 3D graphics, digital media and high performance computing."
The GPU and CPU appear to be separate cores on a single die, according to early diagrams of AMD's Fusion architecture. CPU functionality will have access to its own cache, while GPU functionality will have access to its own buffers. Joining the CPU and GPU together are a crossbar and an integrated memory controller, with everything connected via HyperTransport links. From there, the Fusion processor will have direct access to system memory, which appears to be shared between the CPU and GPU. It doesn't appear the graphics functionality will have its own frame buffer.
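To make that shared-memory layout concrete, here is a minimal conceptual sketch in C. This is not any real AMD or graphics API; the function names and data are invented for illustration, and it only models the difference between a discrete card's dedicated frame buffer (which requires an explicit copy over the bus) and the shared system memory described in the Fusion diagrams (where no copy is needed):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Conceptual sketch only: no real AMD or GPU API is used here. */

/* Discrete-card model: the GPU owns a dedicated frame buffer, so the
 * CPU must copy data across the bus before the GPU can render it. */
void discrete_model(const unsigned char *pixels, size_t n) {
    unsigned char *frame_buffer = malloc(n); /* stands in for dedicated VRAM */
    if (!frame_buffer) return;
    memcpy(frame_buffer, pixels, n);         /* explicit copy over the bus */
    /* ... GPU would render from frame_buffer here ... */
    free(frame_buffer);
}

/* Shared-memory model, as described for Fusion: the CPU and GPU sit
 * behind the same integrated memory controller, so the GPU can read
 * the very buffer the CPU wrote. No dedicated frame buffer, no copy. */
void shared_model(const unsigned char *pixels, size_t n) {
    (void)n;
    /* ... GPU would render directly from 'pixels' in system memory ... */
    (void)pixels;
}

int main(void) {
    unsigned char scene[64] = {0};  /* pretend this is pixel/vertex data */
    discrete_model(scene, sizeof scene);
    shared_model(scene, sizeof scene);
    puts("sketch ran");
    return 0;
}

The tradeoff the sketch captures is the one the article implies: skipping the copy and the dedicated memory saves power and cost, at the price of sharing bandwidth with the CPU.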
While Fusion is a hybrid CPU and GPU architecture, AMD will continue to produce discrete graphics solutions. AMD still believes there's a need for discrete graphics cards for high-end users and physics processing.
Also mentioned during the conference call was AMD's new branding scheme for ATI products. Under the new scheme, chipsets for Intel processors and graphics cards will continue on with the ATI brand name, while ATI-designed chipsets for AMD platforms will be branded under AMD, as previously reported.
11/17/2006 1:14:41 PM
"The fusion project will not work at all."
- All-knowing, are we?
"First of all the motherboards will have to change drasticly."
- Why? They have this thing called "integrated graphics," where the graphics core is on the motherboard and shares system memory. All that's different is that you're placing the graphics core in the CPU rather than in its own socket.
"Second that doesent mean faster performance or even less power requirments! Why? - beacose people can buy Intel CPU which lets say consumes same power as AMD fusion CPU but may buy graphic card from Nvidia that requires less power than the fusion GPU!"
- So you're attempting to claim that a graphics core on a CPU that shares system memory and a lot of its processing functions with a typical CPU core (or several - by the time this comes out, both AMD and Intel will be shipping plenty of quad-cores) is going to somehow have HIGHER power consumption than a graphics card that has no shared functionality with the CPU and its own memory to power? If you look at the power consumption of an integrated graphics chip, you'll find they all consume far less power than any same-generation graphics card.
"Thirdly that means upgrading will be too expensive as you practically need to pay for better Fusion cpu/gpu, meaning paying for 2 products, while the conventional way will let you upgrade what you need, either the GPU or CPU not forcing you to pay both!"
- Once again, this is not targeted at gamers or the high-end graphics market. This is targeted at the other 95% of computer users out there, where AMD has the least exposure and market share. The Fusion CPU is slated to provide all the functionality needed for things like web browsing, Aero Glass, and probably video decode. If you want to play Quake 4 on your 30-inch LCD, you can still get a discrete solution, and the Fusion core can function as something else: a pre-processor for discrete graphics, physics calculations, or other parallel-processing tasks like folding.
"Last but not least AMD will either have to offer 20 different Fusion products or make them really cost effective which will mean little money for AMD. the first method will be more suitable for AMD but it doesn't guarantee them anything. For example i may want to buy the best graphic card and mid-range processor, while with Fusion you won't have that choice you can only get a CPU that comes with pre-determined GPU"
- Please don't make me say this again: AMD is not trying to replace discrete graphics solutions with this product. They are trying to create a better and more efficient alternative to integrated graphics, and MAYBE the low-end discrete graphics segment. They won't be selling a new FX processor with an FX-level graphics core, for all of the reasons you mentioned (except power consumption - you're just plain wrong there).
The people at AMD aren't any dumber than you are, and they're probably a little better at this whole business than you, since they've been doing it a while and doing pretty well considering the competition. If you think to yourself, "Wow, that's dumb because...", they've probably already realized it and adjusted their plans accordingly a long, long time ago. Seriously, if you look at all the different threads about Fusion, you'll notice (shockingly) that everyone is making the same negative comments about the idea without really knowing what the idea is. If hundreds of people like us can come up with these criticisms, don't you think it's vaguely possible that someone at AMD did too? Maybe it's possible that AMD realized there are markets where a product like this makes sense and other markets where it doesn't?
I guess not. I mean, they only designed one of the most successful performance processors ever, one that had their main competitor beat for four years, which is a long time in this industry. What about that accomplishment would even suggest that they have the barest of cognitive powers? And yes, that was rhetorical.
By the by, your spelling is atrocious. Firefox 2.0 has a built-in spell checker, and correct spelling and grammar go a long way toward making even foolish arguments look better.
"Game reviewers fought each other to write the most glowing coverage possible for the powerhouse Sony, MS systems. Reviewers flipped coins to see who would review the Nintendo Wii. The losers got stuck with the job." -- Andy Marken