


NVIDIA says designing a GPU for the new consoles wasn't worth the cost

Advanced Micro Devices, Inc. (AMD) has quietly dominated the market for commodity graphics chips and CPUs for console gaming systems, and the latest generation of consoles looks to be no exception.  Sony Corp.'s (TYO:6758) PS4, due to launch this holiday season, will feature an AMD GPU and CPU.  And a 550 MHz AMD Radeon "Latte" GPU sits aboard Nintendo Co., Ltd.'s (TYO:7974) popular Wii U.

So how does NVIDIA Corp. (NVDA), AMD's chief rival in the PC graphics market, feel about AMD's dominance of the increasingly PC-like consoles?  Not too bad, apparently.



NVIDIA's Senior Vice President of content and technology told GameSpot in a recent interview that his company is essentially letting AMD win.  While he's convinced his firm could beat AMD if it tried, he says it just isn't worth it, remarking:

I'm sure there was a negotiation that went on and we came to the conclusion that we didn't want to do the business at the price those guys were willing to pay. Having been through the original Xbox and PS3, we understand the economics of the development and the trade-offs.

If we, say, did a console, what other piece of our business would we put on hold to chase after that? In the end, you only have so many engineers and so much capability, and if you're going to go off and do chips for Sony or Microsoft, then that's probably a chip that you're not doing for some other portion of your business.

That statement seems a bit odd -- after all, hegemony over the console market could be a ticket for the financially struggling AMD to sell tens, if not hundreds, of millions of chips.

Wii U GPU
The Wii U packs an AMD GPU (blue: memory; red: stream processors; yellow: texture units).
[Image Source: Chipworks]

But NVIDIA's focus is trained on the mobile market, where it's looking to leverage pared-down versions of its GeForce GPUs alongside ARM CPU cores.  NVIDIA has its work cut out for it in that market; it largely lost the last round to Qualcomm, Inc. (QCOM) because its chips were too power-hungry.
 

NVIDIA is focused on its mobile processor war with Qualcomm.

NVIDIA is looking to change that later this year with a refresh of the Tegra 4 that will include an on-die LTE modem.  Between Tegra and its traditional PC GPU development, NVIDIA sounds content to let AMD freely dominate the console market -- or so it says.

Source: GameSpot



Comments

RE: Nvidia didn't have a chance...
By nikon133 on 3/17/2013 10:55:31 PM , Rating: 5
Personally, I'd rather stay with 30 - 40 million consoles a year than go for a questionable share of Android's mobile market while being completely out of Apple's ecosystem and the Windows Phone/Windows Pro tablet space. Considering that the only company that moves real volume in the Android market, Samsung, prefers to stick with its own hardware, and Nvidia is fighting Qualcomm and TI - even Intel is pushing its x86 chips on Android - it doesn't really look like Nvidia has anything set in stone in the mobile market either.

Even in the PC market, considering that the majority of units ship with integrated Intel GPUs, and a big chunk of the dedicated solutions are entry-level (and dirt cheap) Nvidia and AMD parts... the console market might be a healthier source of income than anything else either Nvidia or AMD has right now.

Yes, the mobile market is bigger and will keep growing compared to the console market, but like I said - there are no guarantees that Nvidia will be particularly successful there, nor that fierce competition with the other players will let them make as much money as they could otherwise.

At the end of the day, Nvidia is just doing what seems to be common practice nowadays - diminishing a competitor's success by downplaying its importance. Not unlike what Apple and HTC are trying to do with the Samsung Galaxy S4.

Damage control, nothing else.


RE: Nvidia didn't have a chance...
By FlyTexas on 3/18/2013 2:39:15 AM , Rating: 2
nVidia's biggest problem is the lack of an x86 CPU to pair with their GPUs.

So they're going in another direction, trying to get away from Intel as much as they can.

Is there a future in add-on desktop graphics cards? Probably not; give it a few more years and the built-in GPUs on Intel CPUs will become "good enough" for most people.

Hardcore people will want add-on cards, but their numbers will shrink as Intel's GPUs improve.

Quite frankly, I'm shocked that Intel hasn't purchased nVidia; it made perfect sense after AMD bought ATI.


By FITCamaro on 3/18/2013 8:41:21 AM , Rating: 2
Intel and AMD's integrated GPUs have been "good enough" for average people for years.

The discrete market has always been for hardcore gamers, and Intel is nowhere close to being "good enough" for them. Sure, Ivy Bridge can play older games at decent resolutions and newer games at 720p with low/mid details. But that isn't going to take away sales from people who would have bought a dedicated GPU to start with.


RE: Nvidia didn't have a chance...
By Bateluer on 3/19/2013 11:32:46 PM , Rating: 2
If you want to play games at anything higher than minimum detail settings, you're going to get a PCIe card. Even the HD 4600 in the Haswell i7-4770K doesn't deliver playable frame rates in yesterday's games. Not exactly future-proof there.

http://www.tomshardware.com/reviews/core-i7-4770k-...


"The Space Elevator will be built about 50 years after everyone stops laughing" -- Sir Arthur C. Clarke














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki