
AMD's Giuseppe Amato dispels rumors and misinterpreted statements about "Fusion," GPGPU and the company

In an interview with Italian media, AMD Executive Giuseppe Amato, Technical Director of Sales & Marketing EMEA, discussed AMD's current market position and future products.

In the interview, Amato shed more light on the structure of AMD's upcoming Fusion processors. One misconception Amato addressed is the belief that Fusion processors will only be available in single-chip flavors; in fact, they will also come in multi-chip formats. Two Fusion processors linked together would allow for parallel GPUs. He said that AMD has not yet solidified its future plans for Fusion, but indicated it is very likely we will see a Fusion processor with a GPU and CPU connected through a CrossFire-like interface -- and with a total TDP of less than 120 watts.

Amato also praised the flexibility of the Fusion processor in the interview and told Hardware Upgrade that it will allow AMD to "integrate a specific number of GPU and CPU cores depending on the customer and the uses for which they will use the chip." 

"AMD isn't just a microprocessor company anymore", he stated. After the acquisition of ATI, "AMD changed from a processor company to a platform company." This is where Fusion ties in. Its high grade of flexibility will combine GPUs and CPUs into one product. Amato believes that Fusion platforms will be able to specifically match the needs of its customers.

AMD's Fusion processors will also be closely tied to GPGPU. Using a GPGPU platform based on Fusion, AMD will be able to offer HPC systems that handle a wide range of workloads: code better suited to a CPU will execute on the CPU portion of the Fusion processor, while code that runs more efficiently on a GPGPU will run on the GPU portion. In short, Fusion processors will be able to handle a variety of work, allowing them to better meet the needs of AMD's customers.
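The split described above can be sketched in a few lines. This is purely illustrative — the task format and the two "units" are invented for the example, not AMD's actual CTM/Fusion API:

```python
# Illustrative sketch of heterogeneous dispatch on a Fusion-style chip.
# Task shapes and unit names are hypothetical, not an AMD interface.

def run_on_cpu(task):
    # Serial, branch-heavy work stays on the CPU cores.
    return sum(task["data"])

def run_on_gpu(task):
    # Data-parallel (stream) work goes to the GPU cores; here we simply
    # simulate an element-wise operation a GPU would run in parallel.
    return [x * 2 for x in task["data"]]

def dispatch(task):
    # A Fusion-style runtime would route each task to the unit it suits.
    if task["kind"] == "stream":
        return run_on_gpu(task)
    return run_on_cpu(task)

print(dispatch({"kind": "stream", "data": [1, 2, 3]}))  # [2, 4, 6]
print(dispatch({"kind": "serial", "data": [1, 2, 3]}))  # 6
```

The point of the sketch is only the routing decision: the same task description can land on either execution unit, which is what would let one Fusion package serve both kinds of workload.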

Amato also dispelled rumors that AMD will be going completely fabless, attributing them to a misinterpretation of a speech Hector Ruiz gave. AMD does, however, plan to stick with a fabless manufacturing model for its GPU and chipset products.

The full interview can be viewed at Hardware Upgrade.

Comments

sounds good
By omnicronx on 7/16/2007 3:07:06 PM , Rating: 3
This sounds pretty damn cool; I've been waiting on information about Fusion for a while. I can see how this would be great in the OEM market especially, but I wonder: will the chip be integrated into the board, or will it be socketed, keeping it upgradeable? I think an upgradeable CPU/GPU configuration would be great... although gone would be the days of upgrading part by part, haha.

RE: sounds good
By Regs on 7/16/2007 3:20:19 PM , Rating: 2
Yes, it's good to hear. I like that we actually heard from someone from AMD's marketing team and not from some unknown source reporting from another unknown source.

It's something AMD should do more often; looking back over the past few months, AMD is on the right track in opening up to the public by speaking to them directly.

RE: sounds good
By Khato on 7/16/2007 3:36:53 PM , Rating: 3
... I'd trust some unknown source more than someone from marketing >.>

RE: sounds good
By kenji4life on 7/16/2007 4:25:15 PM , Rating: 5
Don't know why you'd say that here. While the statement has merit, all he is doing here is dispelling rumors, not creating them.

I won't argue that marketing has never spread rumors. But this doesn't seem to be the case here; especially when he's talking theory & r&d at this point. Not promising anything like release dates or benchmarks. Everything that he's saying is entirely plausible and there's no reason as of yet to think it's all hot air. Don't forget how much money AMD sunk into this.

RE: sounds good
By Khato on 7/16/2007 5:14:02 PM , Rating: 1
Eh, the only rumor I see being dispelled here is that of AMD going fabless. In which case he states what most who follow such things already guessed: AMD's going to continue outsourcing the fabrication of graphics chips and chipsets.

As to the rest of the article, eh, it's AMD marketing promises/statements to generate interest in their 'products' that are under development. One of my favorite quotes from the article: "We prefer not to spread information on operating clock speeds and performance as of right now..."

RE: sounds good
By 91TTZ on 7/16/2007 5:38:22 PM , Rating: 1
Don't know why you'd say that here. While the statement has merit, all he is doing here is dispelling rumors, not creating them.

You should never believe the "official" word of a company. They have no incentive to tell the truth.

Remember a week before the PS3 price cuts? Sony's official position was that there were "no plans" to cut the price of the PS3. Yet a week later the price was dropped. Things like price changes take time for a company to approve and inform their distributors, so there's no way that Sony didn't have plans when their exec made that statement. They were busy working on the price drop and their exec knew it, yet he still lied through his teeth.

RE: sounds good
By Ringold on 7/16/2007 6:53:34 PM , Rating: 4
Give them a little slack. "Lie" is pretty tough language. Businesses and their execs are on a leash of a certain length about what information can be disclosed, how it can be disclosed and when, for legal and ethical reasons, probably as it relates to the owners (investors). Being careless about spilling the beans could land someone in jail, and if it weren't this way there'd be nothing illegal about leaking information to hedge fund guys before issuing a proper press release. Perhaps the exec in question could've phrased it better, but saying he lied is... a little stronger language than I'd use.

RE: sounds good
By TomZ on 7/16/2007 7:01:24 PM , Rating: 5
How can you tell when a marketing executive is lying?

When he moves his lips. :o)

RE: sounds good
By theapparition on 7/17/2007 8:15:29 AM , Rating: 3
If you look at Sony's (or Microsoft's, or anyone else's, for that matter) press releases regarding price cuts, product announcements, etc., they all go something like this:

"We have no plans on announcing a price cut at this time"

It's a carefully crafted factual statement. So, technically, no lie. Sony didn't plan on "announcing" the price cut "at this time". They very well may have been planning the actual price cut for several months.

Microsoft has no plans to "announce" a new xbox product "at this time". We all know they have 65nm redesign in the works.

Try it, I think you'll like it.

"I have no plans to announce to my wife that I'm sleeping with her sister."

See, just rolls off the tongue.

RE: sounds good
By JeffDM on 7/20/2007 11:04:32 AM , Rating: 2
"It's a carefully crafted factual statement. So, technically no lie. Sony didn't plan on "announcing" the price cut "at this time". They very well may have been planning the actual price cut for several months."

That still doesn't argue why I should give this AMD guy any credibility. I don't care if it's not technically a lie, it's still a reason why no one should pay attention to them.

RE: sounds good
By Hawkido on 7/17/2007 2:15:59 PM , Rating: 2
You should never believe the "official" word of a company. They have no incentive to tell the truth.

Incentive? No, no real incentive. However, as a publicly traded company there are severe penalties for lying to investors in official press releases.

Forward-looking statements are given room for error (such as release dates slipping for unforeseeable reasons, or clock speeds not meeting the benchmarks due to poorer-than-expected yields on initial batches). So long as the stated goal of releasing the product, or eventually meeting the target, happens, there's no problem. Investor confidence may suffer a bit for such grandiose statements, but not the core of investors. They realize you can't get rich all at once; you get rich a little bit at a time.

How much would you hate your dad if he said "we'll go to the circus," but didn't? What if he has to postpone it a weekend or two because of surgery, or work interferes?

How spoiled are you if you can't be a little bit flexible about something you haven't even paid for yet?

RE: sounds good
By JeffDM on 7/20/2007 11:00:06 AM , Rating: 2
I don't think it matters; corporate mouthpieces are not going to give out useful pieces of information. I don't care if these aren't technically lies; carefully crafted "tidbits" intended to give the opposite impression of what's really being said are quite snake-like. I hope they don't wonder why people don't trust corporate PR when they do this.

RE: sounds good
By creathir on 7/16/2007 3:27:48 PM , Rating: 3
With SLI/Crossfire, those days are already gone. (If you want to take advantage of those techs)
- Creathir

RE: sounds good
By robrbecker on 7/16/2007 3:52:55 PM , Rating: 2
I think, from what I've read at the Inquirer and other tech sites, their approach will most likely be to have multiple sockets on a board that different types of "processors" plug into. Think of a mobo with two or more AM3 sockets connected by HyperTransport. You could put any combination of things into these sockets: one or more dual- or quad-core AMD general-purpose CPUs, or one or more AMD (ATI) GPGPUs. Workstation or scientific types could add floating-point powerhouses. And the best part is that the system will figure out which processor would perform the operations fastest and delegate the work to the appropriate chip.
It's like a heterogeneous multi-core processor where each core is physically replaceable since it is its own chip. AMD could put these on the same die to cut cost and power further (and probably will) for mass-market models.

I think this approach will make huge progress for AMD in the mobile space: better graphics with lower power consumption!

RE: sounds good
By cobalt42 on 7/16/2007 4:09:14 PM , Rating: 3
I think you're thinking of Torrenza. Similar in concept, but one of the differences is that Fusion puts more emphasis on heterogeneity in a single chip, while Torrenza is an approach for plugging into HT sockets, e.g. with miscellaneous accelerators including FPGAs.

RE: sounds good
By GabrielIkram on 7/16/2007 4:27:35 PM , Rating: 4
Well, Fusion can be a part of Torrenza. It will most likely be working as a part of a Torrenza platform. Remember, Torrenza is a new AMD platform, and Fusion will probably be able to fit right into the platform.

A Torrenza platform's motherboard consists of two accelerator sockets and a PCIe accelerator.

What this means is that a highly specific system can be built. For example, for one accelerator socket I can decide to use a Fusion processor. For the other socket, I can use a third-party dedicated math coprocessor. Another highly-specific accelerator can be added using the PCI-Express interface. So basically, I can build a complete platform tailored specifically for my needs.

This is what I see as AMD's new goal, and it's what they will probably be targeting the corporate world with: extreme flexibility. If you need any further explanation, go ahead and ask.

RE: sounds good
By Treckin on 7/16/07, Rating: 0
RE: sounds good
By Andrwken on 7/16/2007 11:14:34 PM , Rating: 2
Maybe someone can explain the long-term benefits of this to me. I like the sound of the technology, but with 16-core designs already on the roadmap, doesn't this seem like a stopgap? Isn't the real goal to have multiple cores that can be assigned to specific tasks, i.e. video rendering or math? I can't find the article anymore, but wasn't there just a write-up on a quad core handling some pretty intensive 3D rendering on a new game engine? I would think, other than the programming being the bottleneck, a 16-core Nehalem could probably have half the cores do a pretty impressive job of rendering, 4 cores handling physics, and 4 for regular CPU function. Or are we too far out programming-wise for that, and will the cores have to be specialized for some time yet?

RE: sounds good
By mmarq on 7/17/2007 7:00:49 PM , Rating: 3
but i wonder, will the chip be integrated into the board, or will it be a slot keeping it upgradeable

In the first implementation it will be like today's CPU + PCIe GPU, plus a (hardware circuit + thin software layer) called CTM, for application-specific stream task acceleration.

Reading the interview: Amato said they were able to accelerate a virus scanner to performance not possible with the CPU alone. It surely *can* be done with the majority of applications, if software developers program for it, because nearly all of them have stream-friendly tasks, except the purely integer ones.

An example that we have shown that uses our previous video architectures is Tarari. By recompiling its antivirus scanner and using an AMD GPU, Tarari was able to reach significantly higher performance compared to what would have been obtainable using only a CPU

One honest comment I have: thank god we won't have to climb all the way up to SSE50, because it would be useless and wouldn't help software developers; right now they are only just moving on from SSE2.

It would be a revolution if, to get the best performance, we no longer needed the best CPU. At least for streaming and FP.

Meaning: the more applications go for CTM, the more irrelevant it becomes which CPU, Intel or AMD, is a little more performant at general-purpose work than the other.

I think a core/gpu configuration thats upgradeable would be great.. although gone would be the days of upgrading part by part haha.

No. You would still be able to upgrade part by part.

A large error that has been made regarding Fusion is that people are thinking that this type of architecture will only be a single chip package architecture, meaning both the CPU(!) and GPU are to be integrated on the same die.

A) - General stream computing will come to the GPU or GPGPU via CTM (hardware circuit + thin software layer) and/or other schemes. Here we already have propositions from ATI and Nvidia (Tesla) for GPGPUs capable of more general computing tasks, like physics acceleration and others.

ATI is further along in the game, but Nvidia has implemented a similar thing, and it will be possible to use Intel CPUs with either one, ATI or Nvidia.

B) - Streaming will come to the CPU by these means:
__ B1) Making functional units inside the CPU capable of it, as is the case with today's SSE units in all CPUs (more or less), requiring only additional logic and a software layer for load balancing with stream units outside a particular CPU die, and/or for general application stream task acceleration, as in CTM or Intel's mostly-software derivative of the Larrabee project.

Those functional units could possibly(?) also take on more GPU-centric tasks like vertex and shader processing, turning the CPU into something GPU-like. Only AMD, it seems to me, has any intention of doing this in the shorter term, because of the advantage of better 'clustering' in their CPU designs(?). Will it happen?... we'll have to see.

Thanks to the availability of a higher number of registers, general-purpose GPU computing will also be made much easier in 2009 when Microsoft releases the DirectX 11 API.
(In that case CPU registers too, IMO, because x86-64 could have 32 GPRs, double today's count, without breaking applications(?))

__ B2) Having, inside the CPU die, *units separate from the traditional cores*, like additional cores connected through a crossbar or other link, or sharing an L2/L3, and capable of streaming, as in the IBM Cell processor implementation in the PS3 and the derivative in the Xbox.

But for load balancing with stream units outside a particular CPU die, and/or for general application stream task acceleration, something like CTM or Intel's mostly-software 'Larrabee' derivative would still be needed.

Again, those separate units could surely also take on more GPU-centric tasks like vertex and shader processing, turning the CPU into something GPU-like.

__ B3) Having, inside the CPU package in an MCM configuration (as in the Core 2 Quad), a traditional CPU die and a GPU die. In this example a 'C2Quad'(?) would be a C2 core + a GPU core. Here the CPU die and GPU die communicate through a link or by sharing an L2/L3.

Again, for load balancing with stream units outside a particular CPU MCM package, and/or for general application stream task acceleration, something like CTM or Intel's mostly-software 'Larrabee' derivative would still be needed.

AMD now possesses all of the technologies it needs to develop Fusion architectures. Whether it is a native solution with several cores integrated in the same die, similar to what we are using for Barcelona, or a multi-die package (author's note: the same architecture used by Intel for its Core 2 Quad chips) composed of two separate silicon dies installed on the same package, AMD is open to all technological evolutions that the market requires.


This hardware circuit + thin software layer is imperative, be it CTM, which seems to me to have become the central part of CrossFire 2 for ATI, or Larrabee, which relies much more on software.

All those GPGPUs and CPU/GPUs can be connected together very effectively through a cache-coherent protocol, as used in massively parallel mainframe machines and clusters, of which IBM, AMD and others have very solid implementations, and which Intel will have when CSI arrives, if and however it implements cache coherency. That would make the choice of CPU even more irrelevant for the large group of stream-capable applications.

Closer to 'us' in the enthusiast market, that is precisely where AMD, with HT3, DC 2, and HTX slots like those in the upcoming RD790 and the chipsets after it, will absolutely rock!!...

I'm not paid to post, nor a fanboy, nor defending anyone, but it seems to me that AMD has a clear advantage, and if CSI doesn't come out relatively soon, Intel will be in trouble if it stays stuck with the same old shared FSB.

RE: sounds good
By indianpunk on 7/18/2007 4:52:56 AM , Rating: 2
I agree with you, bro. At most you'd upgrade part by part, maybe, and just maybe the RAM; but with 4 GB of RAM already possible and PC upgrades happening every 12-15 months (see the AnandTech article on Intel price drops and you'll see the trend)...

I'll be going for a cheaper AMD Athlon X2 for the time being, but I'm sure that come next August quad cores will be cheaper by the dozen, 1 TB drives will be common, and maybe, just maybe, DDR3 will be flowing freely as well.

RE: sounds good
By mmarq on 7/20/2007 12:48:40 AM , Rating: 2
"" but i am sure come next august quad core will be cheaper by dozen and 1 tb's will be common as well and maybe just maybe ddr3 will be free flowing as well ""

Without wanting to be as ironic as you: it seems you missed the point completely.

You can have a rig ready for fully connected cache-coherent GPGPUs and CPU/GPUs this year, for Vista Ultimate or Linux!!... there are tradeoffs, there are always tradeoffs... (see here)

But you don't have to wait for 1 TB of RAM, free DDR3, quads or other pointless quirks...

Forget about 16-80 core CPUs, Intel, AMD or whatever, terabytes of RAM or other bull meant to make your pockets lighter... this is a revolution!

"" By contrast, ad-hoc stream processors easily reach over 10x performance ""

Yeah, and the Future of XBOX most likely...
By RupanIII on 7/16/2007 6:03:32 PM , Rating: 1
Well, so far the guy over at another site won't post this, so I'm posting it here.
How about a Fusion Xbox successor?
Microsoft owns the IP for the tri-core processor in the Xbox 360.
It uses an ATI video chip.
Nvidia seems pretty happy with Sony.
Um... going with the upcoming Intel tech doesn't seem logical for MS.
So: fuse an AMD dual-core Phenom with a modified tri-core onboard the Fusion processor, and HyperTransport it to the new ATI video/sound hardware. Oh... now this machine can execute x86 code again... it's better at running Xbox games than the 360 is, since the DX Metal variant can execute the old code with a compatibility layer and native CPU processing. The 360 games run fine since the 360 CPU is onboard the Fusion chip.
Remember, Fusion is AMD's way of meeting specific customer design goals. IBM will very likely manufacture it, of course, as they have for AMD and Microsoft before and now.
Intel will NEVER help Microsoft run non-Intel component code. Drop in 2 GB of RAM and this works really well.
Oh, it should also run Games for Windows, be an MCE, and have a version of Windows Home Server onboard.
Maybe I'm nuts, but since desktops are a dying breed that earn very little profit, and Microsoft wants to sell its own machines anyway, it SHOULD be on the drawing board.
With a 1080p TV this is a killer living-room digital media hub, Internet browser and game system... everything the desktop has strived to be but failed.

And Sony... well, they aren't in this position; lord knows what they'll use after Cell. Nintendo, they're good. If they partnered with Apple and had a Mac Mini game system with a WiiMote and HD graphics, that would be good too. But again, the only likely thing is what I posted above.


By kenji4life on 7/16/2007 6:49:48 PM , Rating: 3
If ms actually built a "media center" style xbox with more computer features, I think it would be a huge success. My biggest pet peeve with the 360 is lack of mouse support and limited keyboard support.

By jachristie79 on 7/16/2007 6:51:34 PM , Rating: 2
That just really hurt my head to read.....:(

RE: Yeah, and the Future of XBOX most likely...
By Treckin on 7/16/2007 7:38:21 PM , Rating: 1
Dude, didn't anyone explain paragraphs to you in middle school?
Just so you know, I'm sure the execs at Dell would agree that there is absolutely NO money in desktops.... or that almost 65% of homes in the US have one... that's more than have waffle irons!

Seriously, Microsoft could do the media center Xbox. But why the fuck would it have Server '03 on it? Are you trying to make me laugh? Do you think MS FORGOT about DRM? I suppose we should allow people to distribute all of their music and movies over broadband... (playing devil's advocate here; I think we should). Point being, MS would never go for that, especially after sinking millions into the development of Media Center.

The funniest part of this is that it would blow Apple TV out of the water.... aTV doesn't play HALO, ROFL

By RupanIII on 7/16/2007 7:53:28 PM , Rating: 1
Hmm, yeah, it should be Media Center. Home Server is different. Again, too much stuff running through my head these days, and the root canal I'm getting tomorrow can't come soon enough.

By RupanIII on 7/16/2007 8:06:49 PM , Rating: 1
The sales trend is towards notebooks.
Mostly older people and gamers are buying desktops.
Plenty of those older people are buying notebooks as well.
The trend of the PC becoming a non-shared, individual device has grown obvious. A desktop is just the cheaper way to get a computer.
I ONLY build my own desktops, and of course buy notebooks, because building one isn't worth the trouble.
I prefer to share the desktop with the family, but everyone wants their own notebook; my neighbors are the same way, as are the people I sell to in stores. I have people come in and buy 3 cheap notebooks so all the kids have their own. I have couples buying notebooks so they can do their own things WHILE they talk to each other in the same room, and so they can move around the house. All of the feedback I get from the public, and I get around as a manufacturer's rep, is that people want notebooks. And I sell BOTH.
Desktop sales are primarily going to be a business expense in the future. There will always be desktops, but if Microsoft takes a shot at the living-room hub media center idea with the next Xbox, as I described, tons of people are going to buy it just for movies and Internet on an HDTV, if it's cheap enough and simpler than a PC.
I know a few people, like my mom, who have a blast with the Wii Internet Channel. Wireless remote + TV is a good experience. I know someone with a Gyration mouse who feels the same way.
Just typing what I see; after all, it's a forum.

By RupanIII on 7/17/2007 11:57:08 AM , Rating: 2
OH, I knew I had an answer brewing. I couldn't get it out yesterday because I was working on a paper proposal for work. Notebook computers are getting more popular. They are also hardly upgradable, and don't last as long due to heat, motherboard power connectors, and being dropped and toted around. Yeah, Dell will happily sell you one before a desktop. Why? Because it's more likely ONLY YOU will use it, so they can sell your family members one too. And, since you are moving it around all the time, eating food on and near it, and maybe want to play games on it, you will upgrade it. Yep, you will.
Me, I'm a desktop man myself, but I SEE THIS. I try to convince people to get a powerful desktop all the time. I even teach people to use workgroups and do live demos with it. THAT does impress them a bit. They have no idea you can do that. But most people don't care. They want their own computer and want to keep their information that private.

RE: Yeah, and the Future of XBOX most likely...
By mmarq on 7/18/2007 3:20:07 AM , Rating: 2
So, Fusion an AMD Dual Core Phenom with a modified TriCore onboard the Fusion processor, and Hypertransport it to the new ATi video/sound hardware. Oh...


The Xbox is an IBM Power architecture and Phenom is x86. You can't have the *same* OS, MS or Linux, running two micro-architectures. You'd have to use a VM like VMware for separate OSes, but then say goodbye to Fusion on the VM OS.

RE: Yeah, and the Future of XBOX most likely...
By rupaniii on 7/20/2007 11:00:23 AM , Rating: 2
Uhm... PS2 EE/GS chips onboard the motherboard of a PS3...
PS1 chips onboard the PS2... You're going from MIPS to PPC architectures there! The new OS just forwards the code to the old processors for execution.
Just because it's on-die doesn't change anything.
You're way off there.

By mmarq on 7/20/2007 3:34:48 PM , Rating: 2
That must be (I never looked into it) for compatibility with the older models. The OS, so to speak, is only one, but has some kind of VM resource partitioning, with virtualization or other mechanisms.

The idea is that one OS commands all the hardware, all the other OSes or layers in VM mode, and the partition accesses, because otherwise you wouldn't get anywhere fast if two different 'OSes' or 'layers' tried to lock, even temporarily, the same resource for themselves at the same time...

... and it would be exactly like dual-booting if the two 'OSes' or 'layers' were completely isolated from one another, even if there is a MIPS chip and a PPC chip.

As long as there is hardware sharing of some sort, one 'OS' must arbitrate. And with no hardware sharing at all, it can only work as two different machines... like a PS3 and a PS1 on a network...

OK... it's not IMPOSSIBLE...

... if you have resource partitioning with a VM layer inside the arbitrating OS.

By TomCorelis on 7/16/2007 3:16:14 PM , Rating: 5
This approach has the capability to displace integrated graphics chips if done right, and I'd very much like to see that.

What would be even cooler would be if you could combine the GPU chip in Fusion with that of your dedicated GPU card.

By Mitch101 on 7/16/2007 3:29:31 PM , Rating: 2
Also think about ScienceMark scores, and how labs and universities could take advantage of having physics processing ability in the CPU as the norm.

By Christobevii3 on 7/16/2007 3:36:27 PM , Rating: 2
I think it would be cool if they could combine the cpu, graphics, and northbridge into a single chip that uses like 40 watts of power under load and like 5w on idle for laptops.

Something on par with the 7300gts would be enough.

By rdeegvainl on 7/16/07, Rating: -1
By OrSin on 7/16/2007 3:48:11 PM , Rating: 1
I think it would be cool to have sex with triplets all at once too. Really, making a statement about something that is nowhere near happening is worthless. Unless you are talking about mini-ITX type systems; but my guess, since it's posted here, is you are talking about real usable systems.

By kenji4life on 7/16/2007 4:19:52 PM , Rating: 1
Wait, you mean you've never had sex with triplets? It's almost as much fun as having sex with three unrelated girls (no incest going on and all). But seriously, this kind of thinking is not much different than "omg this 800 mb hard drive will never get full, I'll never have to buy another one for the rest of my life!" <- Or are you too young to remember when a 5.25" hdd was lucky to hold more than a few mb?

By MonkeyPaw on 7/16/2007 4:57:52 PM , Rating: 2
As we already see in the Xbox360, connecting the CPU and GPU together via a high-speed, low-latency bus while having a shared pool of system memory can produce some fairly effective 3D power. By the time AMD brings about this concept, dual-channel DDR3-1333/1600 should be quite common, so memory bandwidth shouldn't be a problem either.

I think this technology will improve notebook graphics power while offering more battery savings, too.

By mmarq on 7/18/2007 2:51:05 AM , Rating: 2
What would be even cooler would be if you could combine the GPU chip in Fusion with that of your dedicated GPU card.

You can.

Buy a board with an RD790 chipset from AMD and an ATI card ready for Close-to-Metal with an R580 chip (or the announced future R650), and then you'll have to wait for a CPU with CTM readiness, which I believe all Barcelona/Phenom parts have, as claimed and demonstrated (support for CTM)... the teraflop FASN8 system...

With Intel, you would have to buy a chipset ready for CTM or Nvidia Tesla, and wait for Intel to deliver a CPU ready for that CTM or Tesla. No roadmaps indicate that for now.

For now the CPUs won't have GPU acceleration; that comes only in 2008, either from AMD (K10.5) or Intel (Nehalem). For now only the K10 Phenom, I believe, can be part of the stream computing scheme by means of CTM.

But with a ready chipset (RD790) and a ready GPU card (R580), you could wait for a 2008 CTM-ready CPU upgrade with GPU acceleration, running in a CrossFire configuration with your PCIe GPU card... that seems to be what they claim...

By mmarq on 7/18/2007 3:58:46 AM , Rating: 2
I forgot to mention.

You'd have to have a ready OS. Vista Ultimate will have upgrade paths, but other versions I believe will not. Linux, Solaris, FreeBSD and others will have drivers as well.

But then again, applications would need to be modified and recompiled to take advantage of stream computing. In the Windows arena, beyond games and little else, uptake will be quite muted, I'm afraid... for now. For open source it will be much, much easier... I believe.

Stream computing exists in the world of HPC (high-performance computing), big-iron mainframes, clusters and 'big' workstations, where applications were and are tailor-made, and hugely expensive, to take advantage of these capabilities.

That is where AMD was and is positioning its K10 Barcelona in the first place (servers and HPC).

The desktop arena still has to wait a little longer for stream computing, for hardware resources and software capable of it, though there are more and more announcements.

Re: Crossfire
By EarthsDM on 7/17/2007 9:05:55 AM , Rating: 2
Let's hope it isn't so much 'Crossfire' as HyperTransport 3.0. I mean, DAAMIT has, what, 20,800 MB/s of bandwidth each way? Let's see them use that instead of some sissy dongle interconnect.

RE: Re: Crossfire
By Hawkido on 7/17/2007 2:40:39 PM , Rating: 2
The Xfire dongle went away with the X1950 PRO I believe. There may have been a few sample boards of different makes before that, but the X1950 PRO was the showcase piece of the Xfire Interconnect W/O Dongle.

Platform company
By Jellodyne on 7/17/2007 10:42:45 AM , Rating: 2
You want to be a platform company? You need to push for standards, otherwise you're just a company with weird platforms. For instance, AMD needs to get on the horn with their longtime partner nVidia and agree to one standard platform for multiple PCIe GPU slot systems. The whole SLI/Crossfire standard split is bad business for everyone, and AMD is in a position to fix it because they've always worked with nVidia. It's two gpus in PCIe slots, how difficult can it be to make the platform gpu brand agnostic? Both sides would benefit from a common platform.

RE: Platform company
By SexyK on 7/17/2007 1:23:35 PM , Rating: 2
I'm not sure I see how "both sides benefit from a common platform." Hypothetically, if nVidia has 10,000 SLi users and AMD has 1,000 crossfire users, how does it benefit nVidia to open up their dominant standard and allow the use of their competitors' cards? Also, a unified standard would mean that all changes to the platform would have to be developed, what, together? So if nVidia wants to allow 4-card SLi they have to work with AMD to expand the standard rather than just developing and implementing their own system? Seems inefficient and counter productive to me. Just give me the best each company has to offer and whichever one is stronger will win out in the marketplace.

Great News !
By swizeus on 7/17/2007 10:43:24 PM , Rating: 1
Great news...


Can you beat Intel with that?
Or is this just more propaganda by AMD to prop up their stock?

I AM an AMD fanboy, but when DailyTech posted a blog titled "A Letter to Hector Ruiz," I became pessimistic about their ability to manufacture processors. Keep in mind, AMD has skipped a generation of processor releases and still hasn't delivered any new breakthrough compared to Intel. How are they supposed to beat the giant?

Longer acronyms!!!
By Boottothehead on 7/18/2007 2:00:30 PM , Rating: 1
So when can I get my hands on a ROTBMMOIHDDSCGPCPUPPUOC (a Ready Out of The Box Massively Multiplayer Online Integrated Hard Drive and Sound Card Graphics Processor CPU Physics Processor (Over Clocked))?

what happens if AMD doesn't deliver on Torrenza?
By HardwareD00d on 7/16/07, Rating: -1
By waseem on 7/17/2007 11:09:55 AM , Rating: 2
ya gud news

By borismkv on 7/17/2007 11:59:15 AM , Rating: 2
Wokka wokka wokka...</Fozzy>

"Well, there may be a reason why they call them 'Mac' trucks! Windows machines will not be trucks." -- Microsoft CEO Steve Ballmer

Copyright 2016 DailyTech LLC.