


The future of CPU/GPU computing

With the completion of its acquisition of ATI, AMD has announced it is working on new silicon that integrates the CPU and graphics processor into a single unit. The upcoming silicon is currently codenamed Fusion and is expected in the late-2008 or early-2009 time frame. AMD claims Fusion will bring:

AMD intends to design Fusion processors to provide step-function increases in performance-per-watt relative to today’s CPU-only architectures, and to provide the best customer experience in a world increasingly reliant upon 3D graphics, digital media and high-performance computing. With Fusion processors, AMD will continue to promote an open platform and encourage companies throughout the ecosystem to create innovative new co-processing solutions aimed at further optimizing specific workloads. AMD-powered Fusion platforms will continue to fully support high-end discrete graphics, physics accelerators, and other PCI Express-based solutions to meet the ever-increasing needs of the most demanding enthusiast end-users.

AMD expects to bring Fusion to all of its product categories, including laptops, desktops, workstations, servers and consumer electronics products. Judging by the continued support for discrete PCI Express graphics, it would appear the integrated GPU is more of a value solution, similar in spirit to Intel's cancelled Timna processor. It is unknown whether AMD will retain the current Athlon and Opteron names when Fusion launches. The announcement isn't too surprising, as AMD and ATI previously promised unified product development, including vague mentions of hybrid CPU/GPU products; AMD has also previously announced its Torrenza open architecture.

In addition to Fusion, AMD expects to ship integrated platforms with ATI chipsets in 2007. The platforms are expected to target commercial clients, notebooks, gaming and media computing. AMD expects users will benefit from greater battery life on next-generation Turion platforms and further enhancements to AMD Live! systems. DailyTech previously reported on ATI's chipset roadmap, which outlined various integrated-graphics and enthusiast products.

With the development of Fusion and the upcoming integrated AMD platforms, it is unclear what will happen to NVIDIA's chipset business, which currently depends largely on sales of chipsets for AMD platforms.


Comments

RE: Vertex Shaders
By MarcLeFou on 10/25/2006 1:56:33 PM, Rating: 3
Actually, what I find interesting about this concept is that you can have a basic GPU core integrated into the CPU which would be sufficient for everyday business applications, basic workstations, business laptops and barebones computers, and that should cut costs for over 75% of all systems sold.

But what I find really smart about this concept is that, with the Torrenza initiative, the CPU will now be able to communicate directly over the HyperTransport link with a bunch of add-on cards. Most people so far have envisioned putting in a second or third GPU, but what I see happening is actually a breaking down of the GPU into separate components. Apart from the obvious idea of increasing VRAM through an add-on card, think about being able to customize your GPU to your usage scenario with specialized shader cards, geometry cards, MHz boosts, etc.

This system would be the ultimate in customization and would be much more price-efficient for customers, who would be able to get exactly what they need. And instead of replacing a whole GPU when a new technology comes out, you could just swap that particular add-on card, giving a much longer lifespan to your video card and hence your system. Imagine being able to upgrade to Shader Model 5.0 (or whatever it is then) just by changing your $50 shader card instead of your whole video card like we have to today!

Also, assuming the technical hurdles can be overcome, AMD would be the only one with this tech for a few product cycles, creating a totally new market, a bit like Nintendo is trying to do with its Wii, and taking control of it by catching the competition off guard; it would take Intel at least a year to develop a competing product in the best case. Disruption of an established market to gain leadership in both CPU architecture and GPU add-on cards in one fell swoop. Quite a business strategy.


RE: Vertex Shaders
By tbtkorg on 10/25/2006 4:24:23 PM, Rating: 5
Interesting thread.

Integrating the GPU with the CPU is not all about graphics; it's about making the tremendous parallel processing power of the GPU available for general computation, including graphics. Admittedly, I cannot imagine all the different applications for such parallelism any more than you can. Scientific computation will use it, at least, but it goes far beyond that. The belief is that the general-purpose GPU is inherently, fundamentally such a sound concept that people like you and me will soon come up with a thousand creative ways to put it to work, given the chance.

Readers who have written assembly code or programmed microcontrollers will best understand the point I am trying to make, because at the lowest programming level, GPU programming differs radically from traditional CPU programming. The CPU is code-oriented; the GPU, data-oriented. Wherever the quantity of data is large and the parallel transformation to be applied en masse to the data is relatively simple, the general-purpose GPU can, at least in theory, greatly outperform any traditional CPU. The CPU, of course, is far more flexible, and still offers by far the best way to chain sequential calculations together. The marriage of the CPU to a general-purpose GPU is thus a profound concept, indeed.
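To make the contrast concrete, here is a minimal sketch of that data-oriented style, written against NVIDIA's CUDA model purely as a convenient stand-in (neither the article nor AMD's announcement names any programming interface, and the saxpy_cpu/saxpy_gpu names are illustrative only). The CPU routine walks the array one element at a time; the GPU version assigns one lightweight thread to each element and applies the same simple transformation to all of them at once.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// CPU version: one core walks the data sequentially.
void saxpy_cpu(int n, float a, const float *x, float *y) {
    for (int i = 0; i < n; ++i)
        y[i] = a * x[i] + y[i];
}

// GPU version: thousands of threads each transform a single element.
__global__ void saxpy_gpu(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    // Host buffers with simple test data.
    float *x = (float *)malloc(bytes);
    float *y = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Device buffers and copies.
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, x, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, y, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy_gpu<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaMemcpy(y, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", y[0]);   // expect 4.0
    cudaFree(dx); cudaFree(dy); free(x); free(y);
    return 0;
}
```

A workload this regular is exactly the "large data, simple transformation" case described above: the GPU version is limited mostly by memory bandwidth rather than by instruction throughput.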

The general-purpose GPU is an idea whose time has come. By acquiring ATI, AMD makes a serious attempt to dominate the coming generation of computer technology, taking over Intel's accustomed role as pacesetter and standard bearer. Of course there is no reason to expect Intel to sleep through this transition. If Intel responds competently, as one assumes that it will, then we are in for some very interesting times in the coming few years, I think.

There is a third element, besides the CPU and the GPU, which will emerge soon to complement both, I think. This is the FPGA or field-programmable gate array. Close on the heels of the CPU-GPU marriage, the integration of the FPGA will make it a triumvirate, opening further capabilities to the user at modest additional cost.
AMD/ATI will not be able to ignore this development, even if their general-purpose GPU initiative succeeds, as I think it will. Interesting times are coming, indeed.


RE: Vertex Shaders
By Larso on 10/25/2006 5:01:56 PM, Rating: 2
The triumvirate system you outline is truly a very interesting concept. From a hardware point of view it is all you could dream of: the CPU, incredibly optimized for sequential execution; the GPU, incredibly optimized for parallel execution; and the FPGA, harnessing the power of custom logic to implement time-critical operations and never-thought-of-before stuff.

As much as I would want to see this happen, one should recall that hardware is only part of the game; the software aspect is just as important. I hope a solution can be found here, because it's a big challenge. Software engineers need to learn parallel processing techniques, and they need to learn hardware/software co-design concepts so they can utilize the FPGA, which means they will either have to ally with hardware engineers or learn a hardware description language themselves.

All these splendid hardware ideas will fall short if the software guys don't know exactly what they are dealing with and how to utilise it. Completely new programming paradigms might need to be conceived.
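As a hypothetical illustration of the habit shift this comment is pointing at (again using CUDA as a stand-in; nothing here comes from AMD's announcement), the sketch below sums an array two ways. The first kernel carries the sequential mindset over directly and races on a shared accumulator, so the answer comes out wrong; the second serializes the conflicting updates with an atomic, which is correct but still slow compared with a proper tree reduction.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// BUGGY: every thread performs "*sum += x[i]" concurrently, so the
// read-modify-write updates overwrite each other (a data race) and
// the total is usually far smaller than expected.
__global__ void sum_racy(int n, const int *x, int *sum) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        *sum += x[i];
}

// CORRECT (but simplistic): serialize the updates with an atomic.
// Real GPU codes use tree reductions in shared memory for speed.
__global__ void sum_atomic(int n, const int *x, int *sum) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        atomicAdd(sum, x[i]);
}

int main() {
    const int n = 1 << 16;
    int *x, *sum;
    cudaMallocManaged(&x, n * sizeof(int));   // unified memory for brevity
    cudaMallocManaged(&sum, sizeof(int));
    for (int i = 0; i < n; ++i) x[i] = 1;

    *sum = 0;
    sum_racy<<<(n + 255) / 256, 256>>>(n, x, sum);
    cudaDeviceSynchronize();
    printf("racy sum   = %d (expected %d)\n", *sum, n);

    *sum = 0;
    sum_atomic<<<(n + 255) / 256, 256>>>(n, x, sum);
    cudaDeviceSynchronize();
    printf("atomic sum = %d (expected %d)\n", *sum, n);

    cudaFree(x); cudaFree(sum);
    return 0;
}
```

Run on typical hardware, the first print usually reports far fewer than 65536, which is exactly the kind of surprise that makes the parallel-programming learning curve real.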


"We’re Apple. We don’t wear suits. We don’t even own suits." -- Apple CEO Steve Jobs

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki