



AMD's 18 W dual-core Zacate packs two Bobcat cores and a dedicated on-chip GPU capable of handling gaming and 1080p video.  (Source: Slashgear)

NVIDIA recently announced a dual-core version of its ARM Tegra APU. NVIDIA was the first to release an APU, but the ARM core aboard Tegra is incompatible with Windows 7.  (Source: Reuters)
Look out, AMD -- yours isn't the only incoming SoC solution anymore

 

AMD looks to capitalize on its recent success as the new sales king of the GPU market by launching its "Fusion" products, which put a CPU and GPU together on a single die, in early 2011.

At the IFA 2010 trade show in Germany this week, AMD showed off an 18 W TDP Fusion system-on-a-chip (SoC) solution.  The chip combines dual Bobcat cores with AMD graphics in what AMD calls an Accelerated Processing Unit (APU).

The product is codenamed "Zacate" and looks like it could make a splash on the notebook scene thanks to its ability to decode 1080p video and play modern video games, all on a lean power budget.  Such a processor would be particularly desirable for ultra-portable designs.

Unfortunately for AMD, it isn't the only one cooking up an APU.
Bloomberg is reporting that Intel Chief Executive Officer Paul Otellini will show off his own company's take on a CPU+GPU SoC at the Intel Developer Forum in San Francisco next week.

The announcement creates an interesting competitive scenario -- AMD arguably has more GPU experience and the better graphics hardware technology, but Intel has delivered superior CPU performance per dollar for some time now.

John Taylor, a spokesman for the Sunnyvale, California-based AMD, is quick to note his company's graphics edge, stating, "There are decades of research and design that goes into our discrete graphics.  Intel has yet to deliver a product that has discrete-level performance. Right now, it’s just claims."

Of course those are bold words coming from a company that has experienced plenty of delays of its own in the past.

Intel is reportedly confident that it can outcompete AMD on price.  But its integrated graphics processors have thus far been far from stellar performers, to say the least.  So who will pull off the APU upset -- the CPU champion, or the GPU grandmaster?  The financial stakes are high and the market is wide open; customers can eagerly await a hard-fought battle and the release of some exciting new options in 2011.

 



Comments



RE: AMD for the win... and easily.
By theapparition on 9/11/2010 12:32:22 PM , Rating: 1
As enthusiasts, we all know that Intel doesn't have a competitive high end graphics processor.

But in no way try to translate that into an Intel failure.

I have no doubt that if Intel's business case determined that they needed a high-end GPU, they'd either develop a competitive offering or purchase one (nVidia?). But as of now, looking at Intel's current market position and revenues, they seem to think there is no need. With good reason.

For most consumer-level computers, there is absolutely no need for faster graphics.....or even faster CPUs. Do you really need more than 1997 technology to browse the web, run Word and Excel, or play solitaire-esque games?

Of course power users and enthusiasts like ourselves will absolutely appreciate advances, but the average consumer won't. Demographics have shifted, and now it's all about mobile computing.

Intel is still the largest provider of GPUs on the planet. Yes, they all suck, but they don't suck enough to keep the majority from being content. Not only that, but by offering extremely low-cost, high-profit chipsets, Intel has not only fortified its market share but also deprived competitors of potential sales.

Does everyone really think that the largest and most successful chip manufacturer in history couldn't make a world-class GPU if need be? It's OK to play brand favorites, just don't be naive about it.


RE: AMD for the win... and easily.
By inighthawki on 9/11/2010 3:15:37 PM , Rating: 3
quote:
Does everyone seem to think that the largest and most successful chip manufacturer in history couldn't make a world class GPU if need be? It's ok to play brand favorites, just don't be naive about it.


I think that is mostly a result of "Larrabee" -- a disaster of an attempt at a GPU. But that doesn't necessarily translate into an inability to make a high-end GPU, just a failure at making a high-end x86-based GPU.


By MGSsancho on 9/12/2010 3:01:32 AM , Rating: 2
Intel still has them for development, but I do not think they care that much. Intel goes after the most profitable portions of markets. Intel will always have budget, mainstream, and enthusiast chipsets. Even in their integrated/budget offerings they still give you enough PCI-E lanes for a discrete add-on card. Maybe they just want a platform where even entry-level systems can have fully hardware-accelerated video/YouTube. I remember several years ago you almost needed at least a GeForce 2 if you wanted to watch a DVD or a downloaded file. With AMD's and Intel's upcoming offerings, every system will have fully hardware-accelerated video. This means even a cheap system can be formatted to be a multimedia machine without hunting down a mobo and parts to get it all working. All we would need to do is look for the cheapest mobo in the form factor we want.

Intel tried very hard and very long to build a dedicated video card, but it wasn't good enough to compete with their green and red rivals at running Crysis. Team blue's, green's, and red's budget parts were good enough for Baseline/Main profile H.264 playback and for HD YouTube, and all of their new products will be good enough for High Profile H.264 (Blu-ray).

I do not see why we are having a pissing war. The only people buying dedicated graphics cards will be gamers, animators, artists (built-in chips don't have many options for color correction; maybe the new ones will), developers, and those who run CUDA/OpenCL/GPGPU applications.
















Copyright 2014 DailyTech LLC.