


Marchitecture at its finest

NVIDIA has the fastest single GPU for the desktop in the GTX 285, a 55nm die shrink of its predecessor, the GTX 280. However, ATI has gained market share through aggressive pricing and a faster ramp to smaller process geometries. This has put price pressure on NVIDIA, especially in the performance mainstream segment.

NVIDIA's original GT200 chip -- used in the GTX 280 and GTX 260 -- is too big, too costly, and consumes too much power to be used effectively in a mobile solution. NVIDIA has already moved from the baseline 65nm node to TSMC's 55nm process to address these issues in the GTX 285, but the result is still not suitable for the majority of laptop users: battery life is too short, the cooling fan is too loud, and the cost is too high.

One solution was to begin manufacturing on the 40nm bulk process like ATI has done. According to our sources, NVIDIA's attempts to produce a die-shrunk 40nm GT200 chip were "disastrous at best". Design problems became evident, since the GT200 was originally designed for the 65nm node. Two shrinks in a row without a major redesign was just too much for NVIDIA, and our most recent information from Taiwan is that the first 40nm chips from NVIDIA will be in the GeForce 300 series.

Without a power-efficient GT200-based GPU for the mobile and mainstream value markets, NVIDIA is rebranding the 55nm G92b chip yet again to cover these critical segments. The original 65nm G92 chip was used in the GeForce 8800 GT, but you can only do so much with an older design. The chip was respun as the G92b with a 55nm die shrink and is currently used in the 9800 GTX+. All G92 chips are only DirectX 10 capable and will not support the full DirectX 10.1 or DirectX 11 feature set that will come with Windows 7.
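For readers who want to check what a given card actually exposes, rather than trusting the model number, the sketch below shows how an application can ask the Direct3D runtime for the highest feature level the hardware supports. It assumes the Direct3D 11 runtime that arrives with Windows 7 and is a hypothetical standalone C++ test, not anything NVIDIA provides; a G92-based board would be expected to report feature level 10_0 (0xA000), while genuine DirectX 10.1 or 11 parts would report 10_1 or 11_0.

// Minimal sketch (hypothetical): query the highest Direct3D feature level
// the default adapter supports, without creating a full device.
#include <d3d11.h>
#include <cstdio>

int main() {
    // Feature levels to test, highest first.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,   // DirectX 11 hardware
        D3D_FEATURE_LEVEL_10_1,   // DirectX 10.1 hardware
        D3D_FEATURE_LEVEL_10_0,   // DirectX 10 hardware (e.g. G92-based cards)
    };
    const UINT count = sizeof(requested) / sizeof(requested[0]);
    D3D_FEATURE_LEVEL obtained = D3D_FEATURE_LEVEL_10_0;

    // Passing null device/context pointers asks the runtime only for the
    // highest supported feature level, without keeping a device around.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        requested, count, D3D11_SDK_VERSION,
        nullptr, &obtained, nullptr);

    if (FAILED(hr)) {
        std::printf("Could not determine feature level (hr=0x%08lX)\n",
                    static_cast<unsigned long>(hr));
        return 1;
    }
    std::printf("Highest supported feature level: 0x%04X\n",
                static_cast<unsigned>(obtained));
    return 0;
}

Linking against d3d11.lib is required; this is the same kind of query games and benchmarks use to decide which DirectX code path to take.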

The problem is that many consumers will pick up a GTX 280M or GTX 260M thinking it is the same as, or similar to, the GTX 280, when it is actually just a 9800 GTX+.

There is currently no GeForce 100 series for the desktop market, but NVIDIA launched the GeForce GTS 160M and 150M for notebooks at the same time as the GTX 280M and GTX 260M.



Comments



RE: Stupid Stupid
By kattanna on 3/4/2009 11:36:51 AM , Rating: 5
The problem with that line of thought, though, is that people looking at your 300C from 2008 are not being told that in 2009 it's a new 400C model, when in reality it's the same 300C.

GPU makers have for many years now used advancing model numbers to signal new technology generations (GeForce 5XXX, 6XXX, 7XXX, 8XXX...), with numbering within each range to show higher- and lower-end versions of that generation, e.g. a 6800 being a higher-end model than a 6600.

Now NVIDIA has released new GPU tech in the GTX 260/280 parts; the technology is even called the GT200 chip. And now they try to introduce "new" cards based on older tech and call one the GTS 250, clearly implying it's a lower-end part built on the new GT200 tech.

Is it the fault of the consumer for not fully researching their purchase so they don't get "conned"? Yes, it is.

But it's also clearly deceptive advertising on NVIDIA's part, trying to confuse what its products are.


RE: Stupid Stupid
By Reclaimer77 on 3/4/09, Rating: -1
RE: Stupid Stupid
By ninjaquick on 3/4/09, Rating: -1
RE: Stupid Stupid
By afkrotch on 3/4/2009 2:57:21 PM , Rating: 1
So what exactly do you consider a performance difference? The new card is physically smaller (if you get the 1 GB card), and the better GPU yields mean it runs cooler and consumes less power. Under idle and full load it uses roughly 40W less.

I personally couldn't care less that they named it a GTS 250. I know what it is, you know what it is. Morons won't know, but hell, they wouldn't know what a 9800 GTX+ is anyway.


RE: Stupid Stupid
By therealnickdanger on 3/4/2009 12:14:26 PM , Rating: 3
I dunno, I think there are a lot of things that "don't make sense" to us on the outside looking in. To assume it's a con game or intentional deception is too pessimistic for my tastes. NVIDIA doesn't want to have three cards on the market (8800, 9800, GTS 250); it wants its brand to have one (perceived) line at a time.

Look at it this way:

NVIDIA makes a GREAT card, the 8800, then continues (as usual) to improve the manufacturing side of it, and then wants to introduce some better mid- and low-end parts for a new 9000-series lineup. Well, do they bring out the 9600GT and leave the 8800GT on the market? Think of all the ignorant customers who would buy a 9600GT instead of an 8800GT because the number is higher! That would hurt not only NVIDIA's reputation but also their bottom line (high-end cards provide more margin).

The solution? "Well, we don't have a new architecture for our high-end card yet, so let's rebadge the 8800GT as the 9800GT so that our customers know that 9800 > 9600 instead of 9600 > 8800."

So that carried them through the 9000 series until the GT series hit. OK, so now they have these huge, badass graphics cards on the high end, but nothing in the GT series for the mid or low end. Once again, they have a numbering problem: in ignorant consumers' eyes, 8800/9800 > 260/280. Why not clear up the confusion and rebadge a clear and solid winner as a placeholder in the GT lineup? It helps ensure that consumers get what they expect. Let's face it, they assume a higher number = a better product.

Obviously, NVIDIA knows that you and I know the difference (or lack thereof) between the 8800/9800/GT250. NVIDIA also has shareholders that demand performance. If the GT line isn't successful because they don't have a mainstream part ON PAPER, then it looks bad ON PAPER, and that could put NVIDIA in a difficult situation for their next architecture.

NVIDIA's only error in all this, IMO, is failing to discontinue previous models when the new ones hit.

For the record, I've been using and preferring mostly ATI cards for the past 5 years.


RE: Stupid Stupid
By Mr JB on 3/4/2009 12:27:20 PM , Rating: 2
We live in a world of small margins, so I can't blame NVIDIA for trying to get more return on their investment. I do feel, though, that I would hate for it not to go to a good cause. Before anyone points it out, I don't mean the fat cats' pockets.


"Let's face it, we're not changing the world. We're building a product that helps people buy more crap - and watch porn." -- Seagate CEO Bill Watkins













