
AMD's 18 W dual-core Zacate packs two Bobcat cores and a dedicated on-chip GPU capable of handling gaming and 1080p video.  (Source: Slashgear)

NVIDIA recently announced a dual-core version of its ARM Tegra APU. NVIDIA was the first to release an APU, but the ARM core aboard Tegra is incompatible with Windows 7.  (Source: Reuters)
Look out AMD, yours isn't the only SoC solution on the way anymore


AMD looks to capitalize on its success as the new sales king of the GPU market by launching its "Fusion" products, which put a GPU and CPU together on a single die, in early 2011.

At the IFA 2010 trade show in Germany this week, AMD showed off an 18 W TDP Fusion system-on-a-chip (SoC) solution.  The chip combines dual Bobcat cores with AMD graphics in what AMD calls an Accelerated Processing Unit (APU).

The product is codenamed "Zacate" and looks like it could make a splash on the notebook scene thanks to its ability to decode 1080p video and play modern video games, all on a lean power budget.  Such a processor would be particularly desirable for ultra-portable designs.

Unfortunately for AMD it isn't the only one cooking up an APU.  
Bloomberg is reporting that Intel Chief Executive Officer Paul Otellini will show off his own company's take on a GPU+CPU SoC at the Intel Developer Forum in San Francisco next week.

The announcement creates an interesting competitive scenario -- AMD arguably has more GPU experience and the better graphics hardware technology, but Intel has offered superior CPU performance per dollar for some time now.

John Taylor, a spokesman for the Sunnyvale, California-based AMD, is quick to note his company's graphics edge, stating, "There are decades of research and design that goes into our discrete graphics.  Intel has yet to deliver a product that has discrete-level performance. Right now, it’s just claims."

Of course those are bold words coming from a company that has experienced plenty of delays of its own in the past.

Intel is reportedly confident that it can outcompete AMD on price.  But its integrated graphics processors thus far have been far from stellar performers, to say the least.  So who will pull off the APU upset?  The CPU champion, or the GPU grandmaster?  The financial stakes are high and the market is wide open; customers can eagerly await a hard-fought battle and the release of some exciting new options in 2011.


Comments

RE: Intelol
By Reclaimer77 on 9/11/2010 8:46:01 AM , Rating: 0
Ok I'm probably asking for another -1 here, which anyone even slightly defending Intel seems to get these days. But what are you talking about?

I don't know what qualifies in your mind as an "amazing" product, but the Core2's were far more advanced than the Pentium 4 and whatever AMD had at the time, in performance, power consumption, and a host of other areas far beyond my understanding.

A Q9550/9650 owner to this day still has NO reason to change to an i7 or i5.

I actually agree, to a certain extent. I personally still use an old Core2, and it's "good enough" for what I do. But it wasn't so long ago that it was totally unheard of to use a 5-year-old CPU and still have a system capable of handling modern games, apps, etc etc. Don't you think that's a testament to what Intel did with the Core2/Quad architecture?

I don't remember the i7 being "hyped" by Intel. It was delivered as a server solution, where you can actually use 8 cores. If anyone hyped it, it's the enthusiast crowd.

You're just talking a lot of FUD without anything to back it up. Please show me these "overhyped" benchmarks and fluffy scores that Intel used to "overhype" their products. Seriously, what are you talking about?

RE: Intelol
By dark matter on 9/12/2010 7:53:49 AM , Rating: 4
Here you go again, instantly on the defensive. You're not getting the -1 for defending Intel, you're getting the -1 for attacking other posters. Make your point, don't attack people.

Take a look at your last paragraph as an example. In case you don't understand how people work, let me answer your final question. He is talking about his opinion on the i7. He believes it was hyped up at the time.

Demanding he give you proof of the hype in a patronising way isn't defending Intel. And now you know why you get the -1's. Have your opinion, even have a strong opinion, but never for one moment think it is better than someone else's.

RE: Intelol
By Reclaimer77 on 9/13/2010 9:12:13 AM , Rating: 1
You're not getting the -1 for defending Intel, you're getting the -1 for attacking other posters

That wasn't an attack.

Take a look at your last paragraph as an example. In case you don't understand how people work, let me answer your final question. He is talking about his opinion on the i7. He believes it was hyped up at the time.

But it wasn't hyped up. He's wrong. I don't see why I shouldn't point this out just because it's his "opinion". And where in that post did he say "imo" or "my opinion"? Everything he's saying is put forth as if it were absolute fact. I'm just calling him out on it. There's no "attack" here, at all. Stop being so melodramatic.

Saying Intel hasn't innovated anything is FUD, and as a tech community we're SUPPOSED to stand up and take issue with that. If we don't, we've become a mass of fanbois and trolls. Is that what you want?

Everyone has a right to an opinion, and everyone else has a right to say your opinion is wrong.



Copyright 2016 DailyTech LLC.