
Marchitecture at its finest

NVIDIA has the fastest single GPU on the desktop in the GTX 285, a 55nm die shrink of its predecessor, the GTX 280. However, ATI has been gaining market share through aggressive pricing and a faster ramp to smaller process geometries, putting price pressure on NVIDIA, especially in the performance mainstream segment.

NVIDIA's original GT200 chip -- which is used in the GTX 280 and GTX 260 -- is too big, too costly, and too power-hungry to work effectively in a mobile solution. NVIDIA has already moved from the baseline 65nm node to TSMC's 55nm process for the GTX 285 to address these issues, but the chip is still unsuitable for the majority of laptop users: battery life is too short, the cooling fan is too loud, and the cost is too high.

One solution was to move manufacturing to the 40nm bulk process, as ATI has done. According to our sources, NVIDIA's attempts to produce a die-shrunk 40nm GT200 were "disastrous at best". Design problems became evident, since the GT200 was originally laid out for the 65nm node; two shrinks in a row without a major redesign proved too much. Our most recent information from Taiwan is that NVIDIA's first 40nm chips will arrive in the GeForce 300 series.

Without a power-efficient GT200-based GPU for the mobile and mainstream value markets, NVIDIA is rebranding the 55nm G92b chip yet again to cover these critical segments. The original 65nm G92 was used in the GeForce 8800 GT, but you can only do so much with an older design. The chip was respun as the G92b with a 55nm die shrink and is currently used in the 9800 GTX+. All G92 variants are DirectX 10 parts only; they will not support the full feature set of DirectX 10.1, or of the DirectX 11 that will arrive with Windows 7.

The problem is that many consumers will pick up a GTX 280M or GTX 260M thinking that it is the same as, or similar to, the GTX 280, when it is actually just a rebranded 9800 GTX+.
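
For readers who want to check what a chip actually supports, the Direct3D feature level gives the rebrand away. The following is a minimal sketch, not from the article, of how a developer could query the highest feature level of the default adapter with D3D11CreateDevice; it assumes a Windows SDK with the Direct3D 11 headers and linking against d3d11.lib. A G92-based card such as the GTX 280M would top out at feature level 10_0, while genuine DirectX 10.1 or 11 hardware reports 10_1 or 11_0.

#include <d3d11.h>
#include <cstdio>

int main() {
    // Feature levels to probe, highest first. A rebranded G92 part
    // (GTX 280M/260M, 9800 GTX+) would stop at 10_0.
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0,  // full DirectX 11 hardware
        D3D_FEATURE_LEVEL_10_1,  // DirectX 10.1 hardware
        D3D_FEATURE_LEVEL_10_0,  // DirectX 10 only
    };
    const UINT numLevels = sizeof(levels) / sizeof(levels[0]);
    D3D_FEATURE_LEVEL supported;

    // Ask the runtime for the highest level the default hardware adapter
    // supports. Passing null device/context pointers means "just report
    // the level, don't actually create a device".
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        levels, numLevels, D3D11_SDK_VERSION,
        nullptr, &supported, nullptr);

    if (FAILED(hr)) {
        std::puts("Direct3D 11 runtime or driver not available.");
        return 1;
    }
    // 0xA000 = 10_0, 0xA100 = 10_1, 0xB000 = 11_0
    std::printf("Highest supported feature level: 0x%X\n",
                static_cast<unsigned>(supported));
    return 0;
}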

There is currently no GeForce 100 series for the desktop market, but NVIDIA launched the GeForce GTS 160M and GTS 150M for notebooks at the same time as the GTX 280M and the GTX 260M.




RE: Stupid Stupid
By therealnickdanger on 3/4/2009 12:14:26 PM, Rating: 3
I dunno, I think there's a lot of things that "don't make sense" to us on the outside looking in. To assume it's a con game or intentional deception is too pessimistic for my tastes. NVIDIA doesn't want to have three cards on the market (8800, 9800, GT250); it wants its brand to have one (perceived) line at a time.

Look at it this way:

NVIDIA makes a GREAT card, the 8800, then continues (as usual) to improve the manufacturing side of it and then wants to introduce some better mid and low end parts for a new 9000-series lineup. Well, do they bring out the 9600GT and leave the 8800GT on the market? Think of all the ignorant customers that will buy a 9600GT instead of an 8800GT because the number is higher! That would not only hurt NVIDIA's reputation, but also their bottom line (high end cards provide more margin).

The solution? "Well, we don't have a new architecture for our high-end card yet, so let's rebadge the 8800GT as the 9800GT so that our customers know that 9800 > 9600 instead of 9600 > 8800."

So that carried them through the 9000-series until the GT-series hit. OK, so now they have these huge, badass graphics cards on the high end, but they have nothing in the GT-series for the mid or low end. Once again, they have a numbering problem. In the ignorant consumers' eyes, they see 8800/9800 > 260/280. Why not clear up the confusion and rebadge a clear and solid winner to be a placeholder in their GT lineup? It helps ensure that consumers get what they expect. Let's face it, they assume a higher number = a better product.

Obviously, NVIDIA knows that you and I know the difference (or lack thereof) between the 8800/9800/GT250. NVIDIA also has shareholders that demand performance. If the GT line isn't successful because they don't have a mainstream part ON PAPER, then it looks bad ON PAPER, and that could put NVIDIA in a difficult situation for their next architecture.

NVIDIA's only error in all this, IMO, is failing to discontinue previous models when the new ones hit.

For the record, I've been using and preferring mostly ATI cards for the past 5 years.


"DailyTech is the best kept secret on the Internet." -- Larry Barber














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki