AMD Announces "Fusion" CPU/GPU Program
Anh Tuan Huynh
October 25, 2006 4:46 AM
The future of CPU/GPU computing
Following the completion of AMD's acquisition of ATI, AMD has announced it is working on new silicon that integrates the CPU and graphics processor into a single unit. The upcoming silicon is currently codenamed Fusion and is expected in the late 2008 or early 2009 time frame. AMD claims Fusion will bring:
AMD intends to design Fusion processors to provide step-function increases in performance-per-watt relative to today’s CPU-only architectures, and to provide the best customer experience in a world increasingly reliant upon 3D graphics, digital media and high-performance computing. With Fusion processors, AMD will continue to promote an open platform and encourage companies throughout the ecosystem to create innovative new co-processing solutions aimed at further optimizing specific workloads. AMD-powered Fusion platforms will continue to fully support high-end discrete graphics, physics accelerators, and other PCI Express-based solutions to meet the ever-increasing needs of the most demanding enthusiast end-users.
AMD expects to integrate Fusion across all its product categories, including laptops, desktops, workstations, servers and consumer electronics products. Judging by the inclusion of PCI Express support, it would appear the integrated GPU is more of a value solution, similar to Intel's cancelled Timna processor. It is unknown if AMD will retain the current Athlon and Opteron names with the launch of Fusion. This isn't too surprising, as AMD and ATI previously promised unified product development, including vague mentions of hybrid CPU and GPU products.
AMD also previously announced its Torrenza open architecture.
In addition to Fusion, AMD expects to ship integrated platforms with ATI chipsets in 2007. The platforms are expected to power commercial client, notebook, gaming and media computing segments. AMD expects users will benefit from greater battery life on next-generation Turion platforms and further enhancements to AMD Live! systems.
DailyTech previously reported on ATI's chipset roadmap, which outlined various integrated graphics and enthusiast products.
With the development of Fusion and upcoming integrated AMD platforms, it is unclear what will happen to NVIDIA's chipset business, which currently relies mainly on AMD chipset sales.
RE: Vertex Shaders
10/25/2006 1:56:33 PM
Actually, what I find interesting about this concept is that you could have a basic GPU core integrated into the CPU which would be sufficient for everyday business applications, basic workstations, business laptops and barebones computers, which should cut costs for over 75% of all systems sold.
But what I find really smart about this concept is that, with the Torrenza initiative, the CPU will now be able to communicate directly over the HyperTransport link with a range of add-on cards. Most people so far have envisioned adding a second or third GPU, but what I see happening is actually a breaking down of the GPU into separate components. Apart from the obvious idea of increasing VRAM through an add-on card, think about being able to customize your GPU according to your usage scenario with specialized shader cards, geometry cards, clock-speed boosts, etc.
This system would be the ultimate in customization and would be much more cost-efficient for customers, who would be able to get exactly what they need. And instead of replacing the whole GPU when a new technology comes out, you could just swap that particular add-on card, giving a much longer lifespan to your video card, hence your system. Imagine being able to upgrade to Shader Model 5.0 (or whatever it is by then) just by changing your $50 shader card instead of your whole video card, as we have to today!
Also, assuming the technical hurdles can be overcome, AMD would be alone with this technology for a few product cycles, creating a totally new market (a bit like Nintendo is trying to do with its Wii) and taking total control of it by catching the competition off guard, because it would take Intel a year to develop a competing product in the best-case scenario. Disruption of an established market to gain leadership in both CPU architecture and GPU add-on cards in one fell swoop. Quite a business strategy.
RE: Vertex Shaders
10/25/2006 4:24:23 PM
Integrating the GPU with the CPU is not all about graphics; it's about making the tremendous parallel processing power of the GPU available for general computation, including graphics. Admittedly, I cannot imagine all the different applications for such parallelism any more than you can. Scientific computation will use it, at least, but it goes far beyond that. The belief is that the general-purpose GPU is inherently, fundamentally such a sound concept that people like you and me will soon come up with a thousand creative ways to put it to work, given the chance.
Readers who have written assembly code or programmed microcontrollers will best understand the point I am trying to make, because at the lowest programming level, GPU programming differs radically from traditional CPU programming. The CPU is code-oriented; the GPU, data-oriented. Wherever the quantity of data is large and the parallel transformation to be applied en masse to the data is relatively simple, the general-purpose GPU can, at least in theory, greatly outperform any traditional CPU. The CPU, of course, is far more flexible, and still offers by far the best way to chain calculations together. The marriage of the CPU to a general-purpose GPU is thus a profound concept, indeed.
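The code-oriented versus data-oriented distinction drawn above can be sketched in plain Python. This is an illustrative analogy only, not GPU code: a simple "kernel" is applied to every element of a large data set, once sequentially and once dispatched across worker lanes (all names here are invented for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

def brighten(pixel):
    # The "kernel": simple, branch-light work applied to one datum.
    return min(pixel + 40, 255)

def brighten_cpu_style(pixels):
    # CPU-style: one control thread walks the data in order.
    return [brighten(p) for p in pixels]

def brighten_gpu_style(pixels, lanes=4):
    # GPU-style in spirit: the same kernel dispatched across many lanes.
    # (Threads stand in for hardware lanes; CPython's GIL means this
    # sketch illustrates the programming model, not a real speedup.)
    with ThreadPoolExecutor(max_workers=lanes) as pool:
        return list(pool.map(brighten, pixels))

image = [0, 100, 200, 250] * 1000
assert brighten_cpu_style(image) == brighten_gpu_style(image)
```

The key property is that `brighten` depends only on its own element, so the work can be split across any number of lanes without coordination, which is exactly the kind of workload where a GPU's thousands of hardware lanes pay off.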
The general-purpose GPU is an idea whose time has come. By acquiring ATI, AMD makes a serious attempt to dominate the coming generation of computer technology, taking over Intel's accustomed role as pacesetter and standard bearer. Of course there is no reason to expect Intel to sleep through this transition. If Intel responds competently, as one assumes that it will, then we are in for some very interesting times in the coming few years, I think.
There is a third element, besides the CPU and the GPU, which will emerge soon to complement both, I think. This is the FPGA or field-programmable gate array. Close on the heels of the CPU-GPU marriage, the integration of the FPGA will make it a triumvirate, opening further capabilities to the user at modest additional cost.
AMD/ATI will not be able to ignore this development, even if their general-purpose GPU initiative succeeds, as I think it will. Interesting times are coming, indeed.
RE: Vertex Shaders
10/25/2006 5:01:56 PM
The triumvirate system you outline is truly a very interesting concept. Seen from a hardware point of view, it is all you could dream of: the CPU, incredibly optimized for sequential execution; the GPU, incredibly optimized for parallel execution; and the FPGA, harnessing the power of custom logic to implement time-critical operations and never-before-conceived functionality.
As much as I would want to see this happen, one should recall that hardware is only part of the game. The software aspect is just as important. I hope a solution can be found here, because it's a big challenge. Software engineers need to learn parallel processing techniques, and they need to learn hardware/software co-design concepts so that they can utilize the FPGA; they will need to ally with hardware engineers or learn a hardware description language themselves.
All these splendid hardware ideas will fall short if the software side doesn't know exactly what it is dealing with and how to utilize it. Completely new programming paradigms may need to be conceived.
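One concrete flavor of the paradigm shift described above is moving from explicit, order-dependent loops to declarative map/reduce building blocks that a compiler or runtime could map onto parallel hardware. A minimal Python sketch (function names are invented for illustration):

```python
from functools import reduce
import operator

def dot_imperative(xs, ys):
    # Imperative style: a mutable accumulator and a fixed loop order
    # make automatic parallelization hard.
    total = 0.0
    for i in range(len(xs)):
        total += xs[i] * ys[i]
    return total

def dot_dataparallel(xs, ys):
    # Data-parallel style: a map (independent per-element work) feeding
    # a reduce (an associative combine). Each stage is a natural fit for
    # parallel hardware because no step depends on loop order.
    products = map(operator.mul, xs, ys)
    return reduce(operator.add, products, 0.0)

assert dot_imperative([1, 2, 3], [4, 5, 6]) == dot_dataparallel([1, 2, 3], [4, 5, 6])
```

Both functions compute the same dot product; the second form simply exposes its parallelism to whatever runs it, which is the kind of rethinking the comment argues software engineers will need.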
Farewell ATI: AMD Acquisition Completed Today
October 25, 2006, 4:46 AM
ATI AMD Chipset Roadmap Detailed
October 9, 2006, 12:30 PM
AMD and ATI Promise Unified Development by 2008
July 24, 2006, 8:11 AM
Copyright 2015 DailyTech LLC.