
Marchitecture at its finest

NVIDIA has the fastest single GPU for the desktop in the GTX 285, a 55nm die-shrunk version of its predecessor, the GTX 280. However, ATI has been able to gain market share through aggressive pricing and a faster ramp to smaller process geometries. This has put price pressure on NVIDIA, especially in the performance mainstream segment.

NVIDIA's original GT200 chip -- which is used in the GTX 280 and GTX 260 -- is too big, too costly, and consumes too much power to be used effectively in a mobile solution. NVIDIA has already switched from the baseline 65nm node to TSMC's 55nm process to address these issues in the GTX 285, but the chip is still not suitable for the majority of laptop users. Battery life is too short, the cooling fan is too loud, and the cost is too high.

One solution would have been to move to the 40nm bulk process, as ATI has done. According to our sources, NVIDIA's attempts to produce a die-shrunk 40nm GT200 chip were "disastrous at best". Design problems became evident, since the GT200 was originally designed for the 65nm node; two shrinks in a row without a major redesign proved too much. Our most recent information from Taiwan is that the first 40nm chips from NVIDIA will arrive in the GeForce 300 series.

Without a power-efficient GT200-based GPU for the mobile and mainstream value markets, NVIDIA is rebranding the 55nm G92b chip yet again to serve these critical segments. The original 65nm G92 chip was used in the GeForce 8800 GT, but you can only do so much with an older design. The chip was respun as the G92b with a 55nm die shrink and is currently used in the 9800 GTX+. All G92 chips are DirectX 10 capable only, and will not support the full DirectX 10.1 or DirectX 11 feature sets coming with Windows 7.

The problem is that many consumers will pick up a GTX 280M or GTX 260M thinking that it is the same or similar to the GTX 280, when it is actually just a 9800 GTX+.

There is currently no GeForce 100 series for the desktop market, but NVIDIA launched the GeForce GTS 160M and 150M for notebooks at the same time as the GTX 280M and GTX 260M.




Ignorance is bliss
By Schrag4 on 3/4/2009 10:11:51 AM , Rating: 5
Don't you all get tired of explaining to your family and friends how a mobile graphics solution is inferior to its similarly named desktop part? I know I do. Using the same architecture for 3 named generations??? C'mon....

I had similar gripes with names like SuperSpeed USB and ExtremeFS, terrible names. Just terrible. What's the next version? Super-Duper speed? Ultra-Mega-ExtremeFS?




RE: Ignorance is bliss
By ExarKun333 on 3/4/2009 11:29:59 AM , Rating: 2
That's a good point. It is hard to explain to the computer-illiterate that their "GTX280M" chip is really a couple-year old 8800-series GPU that you can get for around $75 on the desktop. They will say "it's better than your GTX 260 C216 GPU...look, 280 > 260!!!".


RE: Ignorance is bliss
By Bateluer on 3/4/2009 11:49:54 AM , Rating: 2
But you can respond with My 260 + 216 is greater than your 280M.

Or you could just run them through a loop of 3Dmark 06 or Vantage?


RE: Ignorance is bliss
By ZipSpeed on 3/4/2009 12:37:47 PM , Rating: 3
And the scary thing is that Nvidia will also be able to get away with charging a premium on the chip just because it says 280M.

Don't get me wrong, Nvidia still makes some great products but I've lost some respect for them these last couple years.


RE: Ignorance is bliss
By ExarKun333 on 3/4/2009 12:42:45 PM , Rating: 5
You are right on, they will charge a lot for this chip. I don't have an issue with the "M" version being slightly neutered for mobile use, that's not uncommon. I would expect to see less memory bandwidth and probably fewer shaders...but it SHOULD at least be the same architecture, right!?

A good analogy would be if AMD released a "Phenom Mobile 9550" CPU and it was really an Athlon X2 CPU. It just doesn't make sense.


RE: Ignorance is bliss
By boogle on 3/5/2009 5:51:49 AM , Rating: 1
If I release product Z that's faster than product A - does it matter which one has what underneath?

In your analogy, if the Phenom Mobile 9550 was faster than the competition, and/or other CPUs from the same brand with the Athlon X2 name - would you still have a problem with the name? What about the Sempron which was just a slightly cut down Athlon XP? What about the Celeron? What if I had a Celeron that was faster than a Pentium? Should I continue to call the Celeron a Celeron? Would it not be more confusing having a 'budget' name beating a proper name? Surely it would be less confusing to just brand the oddly fast Celeron a 'Pentium' purely to avoid performance confusion.

I often think NV should just rename the 'codenames' for these cores, so that even though it's still G92, it's renamed to something like GT200M and launched as a 'new' core. Then no one would complain. It's only this teeny bit of additional knowledge that has people going mental. Never mind that G92 is still competitive...


RE: Ignorance is bliss
By Starcub on 3/4/2009 1:05:32 PM , Rating: 2
quote:
Don't you all get tired of explaining to your family and friends how a mobile graphics solution is inferior to its similarly named desktop part? I know I do. Using the same architecture for 3 named generations??? Come'on....

With more people deciding to buy laptops instead of desktop computers, they may be trying to differentiate the segments a little more. Gamers should really be buying desktop computers; even most laptop gamers worry about heat and battery life, given the reliability problems gaming laptops (a roughly six-year-old market) have had.

I don't see the tick tock tock strategy as something bad myself; my 8800M GTS is plenty good enough for most of the games I play.


RE: Ignorance is bliss
By afkrotch on 3/4/2009 3:43:20 PM , Rating: 2
GPUs going with a tick-tock-tock-tick strategy has worked for a while now. I think it's less of a problem now than before, with multi-GPU solutions available. Games are becoming more graphically intensive, but not so much that graphics cards aren't able to keep up.

The latest game I've bought was FEAR 2. I'm clocking in at 60 fps (seems like the game's capped at that) at 1920x1200. I'm not using any AA/AF though. Maybe it's just me, but I can't tell the difference with it on or off. Probably cause I'm usually looking at what's right near me and not some wire, fence, tree way off in the distance. Looking at that crap gets you killed.


RE: Ignorance is bliss
By i3arracuda on 3/5/2009 12:17:39 PM , Rating: 2
quote:
I had similar gripes with names like SuperSpeed USB and ExtremeFS, terrible names. Just terrible. What's the next version? Super-Duper speed? Ultra-Mega-ExtremeFS?


What about SuperSpeed USB 2.0 Ultimate Hyper-transporting Special Champion Edition...TURBO.

Hey, it worked for Street Fighter II.


Copyright 2014 DailyTech LLC.