
AMD's 18 W dual-core Zacate packs two Bobcat cores and a dedicated on-chip GPU capable of handling gaming and 1080p video.  (Source: Slashgear)

NVIDIA recently announced a dual-core version of its ARM Tegra APU. NVIDIA was the first to release an APU, but the ARM core aboard Tegra is incompatible with Windows 7.  (Source: Reuters)
Look out, AMD: yours isn't the only incoming SoC solution anymore


AMD looks to capitalize on its recent ascent to sales king of the GPU market by launching its "Fusion" products, which put a CPU and GPU together on a single die, in early 2011.

At the IFA 2010 trade show in Germany this week, AMD showed off an 18 W TDP Fusion system-on-a-chip (SoC) solution.  The chip combines dual Bobcat cores with AMD graphics in what AMD calls an Accelerated Processing Unit (APU).

The product is codenamed "Zacate" and looks like it could make a splash on the notebook scene thanks to its ability to decode 1080p video and play modern video games, all on a lean power budget.  Such a processor would be particularly desirable for ultra-portable designs.

Unfortunately for AMD it isn't the only one cooking up an APU.  
Bloomberg is reporting that Intel Chief Executive Officer Paul Otellini will show off his own company's take on a GPU+CPU SOC at the Intel Developer Forum in San Francisco next week.

The announcement creates an interesting competitive scenario -- AMD arguably has more GPU experience and the better graphics hardware technology, but Intel has offered superior CPU processing per dollar for some time now.

John Taylor, a spokesman for the Sunnyvale, California-based AMD is quick to note his company's graphics edge, stating, "There are decades of research and design that goes into our discrete graphics.  Intel has yet to deliver a product that has discrete-level performance. Right now, it’s just claims."

Of course those are bold words coming from a company that has experienced plenty of delays of its own in the past.

Intel is reportedly confident that it can outcompete AMD on price.  But its integrated graphics processors thus far have been far from stellar performers, to say the least.  So who will pull off the APU upset: the CPU champion, or the GPU grandmaster?  The financial stakes are high and the market is wide open; customers can eagerly await a hard-fought battle and the release of some exciting new options in 2011.




By EricMartello on 9/11/2010 3:58:34 AM , Rating: 2
Intel has utterly failed on the dedicated GPU front. In terms of actual video chipset units sold, Intel is the leader. Their integrated graphics solutions ship more units than ATI and nVidia combined. I agree with you that Intel has failed laughably at the mid- and high-end GPU market. But at the low end where low cost, value, and low power draw are priorities, Intel is king.

I don't think Intel even offers a discrete GPU anymore after their "XTREME GRAPHICS 3D" or whatever it was got put to shame by Matrox back in the day. What I mean by failed is not in terms of sales, rather performance and technology. Yes, you'll find Intel's garbage video chipsets in just about every integrated solution out there - it works, but most people are unaware of just how poorly it performs. I'd rather have a low end Radeon or Geforce any day over Intel's integrated graphics.

I really wish AMD and nVidia would make more than a token effort to compete in this low-end low-power market. I suspect with their expertise they could make a much better-performing integrated video card than Intel with the same power consumption. Which I suppose is what this move by AMD is all about - challenging Intel's dominance of the integrated video chipset market.

There are a handful of motherboards with AMD and NVIDIA graphics chipsets embedded. I have one with an integrated NVIDIA 9400 which is far better than whatever Intel has. Perhaps in a laptop where battery life is an issue, Intel may have an advantage, but now there is "Optimus" or whatever from NVIDIA that lets you use Intel graphics for 2D and a discrete NVIDIA GPU for gaming.

Quite frankly I don't know why they don't just have the GPU go into a standby state when no 3D is being used, and have 2D processed by a secondary processor. That would eliminate excessive power draw during idle, and for desktops, could reduce case temps and noise.



Copyright 2016 DailyTech LLC.