
Marchitecture at its finest

NVIDIA has the fastest single GPU for the desktop in the GTX 285, a 55nm die-shrunk version of its predecessor, the GTX 280. However, ATI has been gaining market share through aggressive pricing and faster ramping of smaller process geometries. This has put price pressure on NVIDIA, especially in the performance-mainstream segment.

NVIDIA's original GT200 chip -- used in the GTX 280 and GTX 260 -- is too big, too costly, and consumes too much power to be used effectively in a mobile solution. NVIDIA has already moved the GTX 285 from the baseline 65nm node to TSMC's 55nm process to address these issues, but the chip is still not suitable for the majority of laptop users. Battery life is too short, the cooling fan is too loud, and the cost is too high.

One solution would have been to move to the 40nm bulk process, as ATI has done. According to our sources, NVIDIA's attempts to produce a die-shrunk 40nm GT200 chip were "disastrous at best". Design problems became evident, since the GT200 was originally designed for the 65nm node; two shrinks in a row without a major redesign was simply too much. Our most recent information from Taiwan is that the first 40nm chips from NVIDIA will arrive in the GeForce 300 series.

Without a power-efficient GT200-based GPU for the mobile or mainstream value markets, NVIDIA is rebranding the 55nm G92b chip yet again to cover these critical segments. The original 65nm G92 chip was used in the GeForce 8800 GT, but you can only do so much with an older design. The chip was respun as the G92b with a 55nm die shrink, and is currently used in the 9800 GTX+. All G92 chips are only DirectX 10 capable, and will not support the full DirectX 10.1 or DirectX 11 feature sets coming with Windows 7.

The problem is that many consumers will pick up a GTX 280M or GTX 260M thinking that it is the same or similar to the GTX 280, when it is actually just a 9800 GTX+.

There is currently no GeForce 100 series for the desktop market, but NVIDIA launched the GeForce GTS 160M and 150M for notebooks at the same time as the GTX 280M and GTX 260M.
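The renaming lineage described above can be condensed into a small lookup. The following is an illustrative sketch, not any real NVIDIA API; the dictionary structure and the `is_rebrand` helper are our own, but the name-to-die mappings are the ones stated in this article:

```python
# Marketing name -> (underlying die, process node, DirectX support), per this article.
GPU_SILICON = {
    "GeForce 8800 GT":   ("G92",   "65nm", "DX10"),
    "GeForce 9800 GTX+": ("G92b",  "55nm", "DX10"),
    "GeForce GTX 280M":  ("G92b",  "55nm", "DX10"),  # not a GT200 part
    "GeForce GTX 260M":  ("G92b",  "55nm", "DX10"),
    "GeForce GTX 280":   ("GT200", "65nm", "DX10"),
    "GeForce GTX 285":   ("GT200", "55nm", "DX10"),
}

def is_rebrand(name_a: str, name_b: str) -> bool:
    """Two marketing names built on the same die family are rebrands.
    G92 and its 55nm respin G92b are treated as one family."""
    die_a = GPU_SILICON[name_a][0].rstrip("b")
    die_b = GPU_SILICON[name_b][0].rstrip("b")
    return die_a == die_b

# The consumer trap the article describes: the "280M" shares nothing
# with the desktop GTX 280 beyond the number.
print(is_rebrand("GeForce GTX 280M", "GeForce 9800 GTX+"))  # True
print(is_rebrand("GeForce GTX 280M", "GeForce GTX 280"))    # False
```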



Stupid Stupid
By Belard on 3/4/2009 9:26:34 AM , Rating: 5
Is nvidia racing to out-do themselves every 6 months with stupid re-branding of the same chip?! Geez, it was bad enough when the 8800 GTS 512 came out.

How about the GF2 MX, which was re-branded as the GF2 MX200 and MX400? (The MX200 was a slower version of the already low-end MX.)

So with another re-branded G92 chip... does it or does it not have the design flaw? Is the GTX for notebooks supposed to make us feel safer using it, or do we take a chance on video death 6~18 months later?

If Nvidia had half a brain, they would have re-branded these older chips as GTX 1x0... it would have made a bit more sense, rather than using 3 generations of names for the same bloody thing.

RE: Stupid Stupid
By inperfectdarkness on 3/4/2009 9:37:12 AM , Rating: 5
going to buy ati in my next laptop. at least i know i can run 10.1 that way.

RE: Stupid Stupid
By boogle on 3/4/09, Rating: 0
RE: Stupid Stupid
By StevoLincolnite on 3/4/2009 1:37:33 PM , Rating: 5
Which games will run acceptably using the DX10.1 features on a standard laptop? Hell, the number of DX10.1 titles is minimal at best, making it even less likely you'll ever actually use the 'features' 10.1 offers over 10.

That's his choice as a consumer. However, there have been a few cases in the past where buying a chip with superior performance but a feature set a generation behind meant the card became useless for newer games, despite having ample processing power to run them.

A few cases where this happened were with the X8xx series of GPUs from ATI, where SM3-only games, like BioShock, wouldn't run on them.
What happened? The community got together and rewrote the game's shader packages (Shader Shock).

Or how about when T&L became mainstream, which left out the bulk of Intel IGPs, the Radeon VE/LE, and the Voodoo 4 and 5 GPUs, even though they were performance-wise capable of running the games?

Or when games started requiring pixel shaders, and the GeForce4 MX440, one of the best-selling GPUs of all time, was incapable of running them, despite being faster than the GeForce FX5200?

Performance can always be increased by reducing the game's quality settings, tweaking drivers, or overclocking, but there is only so much you can do when a feature set simply isn't there. Not everyone has the cash, or even the desire, to upgrade every 12 months as soon as something better comes along.
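The distinction drawn here -- raw performance can be traded away at lower settings, but a missing feature set is a hard wall -- can be sketched as a toy check. Everything below is illustrative (the function, the 0.5 settings factor, and the performance numbers are assumptions, not from any real driver API); only the X800/BioShock shader-model situation comes from the comment above:

```python
def can_run(game_requires_sm: int, gpu_sm: int,
            gpu_perf: float, needed_perf: float) -> bool:
    """A game is playable only if the GPU meets the shader-model floor.
    Performance is negotiable (drop settings); features are not."""
    if gpu_sm < game_requires_sm:
        return False                      # hard gate: no amount of speed helps
    return gpu_perf >= needed_perf * 0.5  # assume low settings halve the requirement

# A fast SM2 part (like the ATI X800) vs. an SM3-only game (like BioShock):
# blocked despite ample horsepower.
print(can_run(game_requires_sm=3, gpu_sm=2, gpu_perf=100.0, needed_perf=60.0))  # False

# A much slower SM3 card squeaks by at reduced settings.
print(can_run(game_requires_sm=3, gpu_sm=3, gpu_perf=40.0, needed_perf=60.0))   # True
```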

RE: Stupid Stupid
By nafhan on 3/5/2009 11:58:10 AM , Rating: 2
Yep, I can't run Rainbow Six Vegas 2 on my older computer because it has an X800 GTO.

RE: Stupid Stupid
By aegisofrime on 3/4/2009 9:51:10 AM , Rating: 2
As much as I like ATI (I'm running a 4870 myself), I really wish they would get their GPGPU efforts up to speed with nVidia's. I'm looking enviously at Badaboom while ATI only has that crappy AVIVO converter, which I've heard doesn't even use the GPU and outputs poor quality. And they don't even have it out for 64-bit Vista!

RE: Stupid Stupid
By inperfectdarkness on 3/4/2009 12:43:57 PM , Rating: 4
i'd like to see the 4780m overclocked, die shrunk to 40mm, given 1gb of gddr5, and offered in an "all in wonder" configuration.

i'd also like a winning lottery ticket...but that won't happen either.

RE: Stupid Stupid
By GodisanAtheist on 3/4/2009 4:51:59 PM , Rating: 2
That's really not that far fetched...

die shrunk to 40 mm

...if we've somehow discovered how to shrink things to a much larger size...

RE: Stupid Stupid
By FaceMaster on 3/4/2009 6:00:58 PM , Rating: 5
Yes, we understand he made a mistake. No need to be so gay about it.

RE: Stupid Stupid
By GodisanAtheist on 3/4/09, Rating: -1
RE: Stupid Stupid
By FaceMaster on 3/5/2009 7:46:41 AM , Rating: 4
I can't help being who I am.

That's just an excuse people use to justify making the same mistakes over and over again. Anybody can learn. It's just that most people are too lazy.

RE: Stupid Stupid
By GodisanAtheist on 3/11/2009 8:40:10 PM , Rating: 1
Anybody can learn

... to not be gay? Cause that's kinda the joke I was going for.

RE: Stupid Stupid
By FaceMaster on 3/17/2009 6:53:21 PM , Rating: 1
... to not be gay? Cause that's kinda the joke I was going for

I don't mind gay people. I just can't stand it when PEOPLE FLAUNT IT. You've now made two posts revolving directly around gayness. Good for you. This isn't a dating site though. So stop with these sex-based comments.

And so should your Mum (/irony)

RE: Stupid Stupid
By Fox5 on 3/4/09, Rating: 0
RE: Stupid Stupid
By therealnickdanger on 3/4/2009 9:55:35 AM , Rating: 3
NVIDIA's problem, I think, is that it believes it is controlling the market. While it is a KEY player, perhaps even the biggest player, it's not enough to shape the market as it sees fit. NVIDIA is rolling the dice, expecting that if it can just get one or two of its proprietary* technologies (like CUDA or PhysX) to become mainstream, then it can sit back and devour royalties.

Too bad for NVIDIA that Microsoft is leading the charge when it comes to graphics APIs. Is DX11 still going to feature "DirectPhysics"? With Intel also moving toward SoC and more graphics/physics enhancements to their CPUs, NVIDIA's dice rolling will eventually come up snake eyes...

RE: Stupid Stupid
By winterspan on 3/4/2009 4:59:16 PM , Rating: 1
Yep, and I don't believe either will succeed.

CUDA won't survive long against Apple's OpenCL push and Microsoft's DirectX 11 "compute shader"...

And while I believe PhysX has been somewhat successful, there is no way the industry will accept an Nvidia-only physics solution over the long term, particularly since Nvidia lost so much enthusiast momentum once ATI got its act together with the excellent RV770 parts.

RE: Stupid Stupid
By SiliconDoc on 3/14/09, Rating: 0
RE: Stupid Stupid
By murphyslabrat on 3/4/2009 9:59:36 AM , Rating: 2
It's not terribly different from what Nvidia has been doing with its mobile GPUs to date. For instance, I'm using a laptop with a GeForce 8800M GTS. In reality, what I have is a 64 SP card with a shader clock of 1250 MHz, making it barely competitive with a desktop 9600 GT.

The 9600M GT, on the other hand, is basically a mildly overclocked, die-shrunk 8600M.

RE: Stupid Stupid
By aguilpa1 on 3/4/2009 10:00:51 AM , Rating: 5
Yup, it's stupid and somewhat criminal, in that through marketing they are attempting to fool new buyers, or the uninitiated, into thinking they are getting a "new" product. I wouldn't mind so much if Nvidia were just honest, called a goose a goose, and quit slapping lipstick on it every 6 months under a different name. (Weird mental image, and perhaps a bad analogy, seeing as geese have no lips.)

RE: Stupid Stupid
By therealnickdanger on 3/4/09, Rating: -1
RE: Stupid Stupid
By Reclaimer77 on 3/4/09, Rating: -1
RE: Stupid Stupid
By kattanna on 3/4/2009 11:36:51 AM , Rating: 5
The problem with that line of thought, though, is that people looking at your 300C from 2008 are not being told that in 2009 it's a new 400C model, when in reality it's the same 300C.

GPU makers have for many years used advancing model numbers to mark new tech levels, e.g. GeForce 5xxx, 6xxx, 7xxx, 8xxx, with numbering within that range indicating higher- and lower-end versions of that generation, such as a 6800 being a higher-end model than a 6600.

Now Nvidia released new GPU tech in the GTX 260/280 parts; the generation is even called the GT200 chip. Then they introduce "new" cards based on older tech and call one the GTS 250, clearly implying it's a lower-end part of the new GT200 generation.

Is it the fault of consumers for not fully researching their purchase so they don't get "conned"? Yes, it is.

But it's also clearly deceptive advertising on Nvidia's part, trying to confuse what its products are.

RE: Stupid Stupid
By Reclaimer77 on 3/4/09, Rating: -1
RE: Stupid Stupid
By ninjaquick on 3/4/09, Rating: -1
RE: Stupid Stupid
By afkrotch on 3/4/2009 2:57:21 PM , Rating: 1
So what exactly do you consider a performance difference? The new card will be physically smaller (if you get the 1 GB card), and it's a much better-yielding GPU, so it'll run cooler and consume less power: under idle/full load it uses roughly 40 W less.

I personally couldn't care less that they named it a GTS 250. I know what it is, you know what it is. Morons won't know, but hell, they wouldn't know what a 9800 GTX+ is either.

RE: Stupid Stupid
By therealnickdanger on 3/4/2009 12:14:26 PM , Rating: 3
I dunno, I think there are a lot of things that "don't make sense" to us on the outside looking in. To assume it's a con game or intentional deception is too pessimistic for my tastes. NVIDIA doesn't want to have three cards on the market (8800, 9800, GT250); it wants its brand to have one (perceived) line at a time.

Look at it this way:

NVIDIA makes a GREAT card, the 8800, then continues (as usual) to improve the manufacturing side of it, and then wants to introduce some better mid- and low-end parts for a new 9000-series lineup. Well, do they bring out the 9600GT and leave the 8800GT on the market? Think of all the ignorant customers who would buy a 9600GT instead of an 8800GT because the number is higher! That would not only hurt NVIDIA's reputation, but also their bottom line (high-end cards provide more margin).

The solution? "Well, we don't have a new architecture for our high-end card yet, so let's rebadge the 8800GT as the 9800GT so that our customers know that 9800 > 9600 instead of 9600 > 8800."

So that carried them through the 9000 series until the GT series hit. OK, so now they have these huge, badass graphics cards at the high end, but nothing in the GT series for the mid or low end. Once again, they have a numbering problem: in ignorant consumers' eyes, 8800/9800 > 260/280. Why not clear up the confusion and rebadge a clear and solid winner as a placeholder in the GT lineup? It helps ensure that consumers get what they expect. Let's face it: they assume a higher number = a better product.

Obviously, NVIDIA knows that you and I know the difference (or lack thereof) between the 8800/9800/GT250. NVIDIA also has shareholders that demand performance. If the GT line isn't successful because they don't have a mainstream part ON PAPER, then it looks bad ON PAPER, and that could put NVIDIA in a difficult situation for their next architecture.

NVIDIA's only error in all this, IMO, is failing to discontinue previous models when the new ones hit.

For the record, I've been using and preferring mostly ATI cards for the past 5 years.

RE: Stupid Stupid
By Mr JB on 3/4/2009 12:27:20 PM , Rating: 2
We live in a world of small margins, so I can't blame Nvidia for trying to get more return on their investment. I am of the opinion, though, that I would hate for it not to go to a good cause. Before anyone points it out, I don't mean the fat cats' pockets.

RE: Stupid Stupid
By kattanna on 3/4/2009 10:42:32 AM , Rating: 3
If my memory is correct, they are also using that same G92 chip in a new desktop product called the GTS 250, leaving people to assume, incorrectly, that it's a more mainstream 260 board.

When in fact all it is is a repackaged 9800 card.

Nvidia is really making me wonder these days.

RE: Stupid Stupid
By afkrotch on 3/4/2009 3:08:53 PM , Rating: 2
It's because AMD was a lot more agile this generation. AMD hit all the markets with their cards, while Nvidia was just sitting at the high end. They simply aren't ready to release a low end yet, so as a band-aid fix they repackaged the 9800GTX+ to fill that void.

I doubt the GTS 250 will last very long once actual low-end GT/GTS/GTX cards come out. I mean, the 9800GTX lasted what, a couple of months before the 9800GTX+ came out?

Nvidia would like nothing more than to be on one architecture, but they just aren't ready to do it yet.

RE: Stupid Stupid
By Motoman on 3/4/2009 11:32:59 AM , Rating: 5
...I think Nvidia thinks the average consumer is dim enough to buy an old product with a new name on it.

...and they're probably right.

RE: Stupid Stupid
By TSS on 3/4/2009 11:41:56 AM , Rating: 2
Now now, they are making life difficult for the consumer, but this rebranding is by no means stupid.

If you couldn't produce a worthy competitor, wouldn't you try to confuse the public as much as possible to still get sales? It doesn't matter if you rebrand old products; it matters whether people still buy them.

They're probably pulling every trick they have to keep their market share afloat until their next GPUs. And on that point, if the next generation fails to beat ATI, I think we'll see Nvidia being bought by Intel.

Personally, I'm on the verge of buying a new PC. I'm kind of done with ATI drivers (I miss the GeForce Ti4600 days... never ever had problems with those drivers), but I'm not convinced by Nvidia's products right now, or their business ethics. So I'll probably wind up with an HD4870.

On a side note, though, let's not forget that actually making a GPU is still complicated as hell. You can't really hold it against them that they kept rebranding as long as they possibly could.

RE: Stupid Stupid
By Bateluer on 3/4/2009 11:51:57 AM , Rating: 2
I was going to buy a 9800M powered notebook, but now I've shifted gears to looking at a Radeon 4850 powered notebook. Given my time table, this could easily be a 40nm 4860 powered notebook in Q2 09.

I've had great experiences with ATI card on my desktop, and my first three laptops all used Radeons as well, the original Mobility Radeon, a MR9000, and an MR9600.

RE: Stupid Stupid
By poundsmack on 3/4/09, Rating: -1
RE: Stupid Stupid
By boogle on 3/5/2009 5:44:59 AM , Rating: 3
I can only assume a lot of people here also have a problem with CPUs being offered at different speeds and price points despite being the exact same core.

Although, oddly, CPUs sometimes do the reverse and change the core a fair amount, yet still brand it under the same name.

Either way, I don't see what the problem is: as long as the product performs as it should, the underlying technology matters little to me. It's not like a game gives a popup: 'Sorry, but while your GPU will run this game fine, it's been given a name we don't like, so this game will now close. Buy ATI.'

Now for the obligatory DailyTech anti-minus-a-million-rating disclaimer: the current underdog (NV/ATI, whoever) rocks. The company (NV/ATI, whoever) with currently good financials is rubbish and doesn't know anything about putting out a good product.

RE: Stupid Stupid
By SiliconDoc on 3/13/09, Rating: -1
Ignorance is bliss
By Schrag4 on 3/4/2009 10:11:51 AM , Rating: 5
Don't you all get tired of explaining to your family and friends how a mobile graphics solution is inferior to its similarly named desktop part? I know I do. Using the same architecture for three named generations??? C'mon....

I had similar gripes with names like SuperSpeed USB and ExtremeFS. Terrible names, just terrible. What's the next version? Super-Duper Speed? Ultra-Mega-ExtremeFS?

RE: Ignorance is bliss
By ExarKun333 on 3/4/2009 11:29:59 AM , Rating: 2
That's a good point. It is hard to explain to the computer-illiterate that their "GTX 280M" chip is really a couple-year-old 8800-series GPU that you can get for around $75 on the desktop. They will say, "It's better than your GTX 260 C216 GPU... look, 280 > 260!!!"

RE: Ignorance is bliss
By Bateluer on 3/4/2009 11:49:54 AM , Rating: 2
But you can respond with "My 260 + 216 is greater than your 280M."

Or you could just run them through a loop of 3DMark06 or Vantage?

RE: Ignorance is bliss
By ZipSpeed on 3/4/2009 12:37:47 PM , Rating: 3
And the scary thing is that Nvidia will also be able to get away with charging a premium on the chip just because it says 280M.

Don't get me wrong, Nvidia still makes some great products but I've lost some respect for them these last couple years.

RE: Ignorance is bliss
By ExarKun333 on 3/4/2009 12:42:45 PM , Rating: 5
You are right on; they will charge a lot for this chip. I don't have an issue with the "M" version being slightly neutered for mobile use, that's not uncommon. I would expect less memory bandwidth and probably fewer shaders... but it SHOULD at least be the same architecture, right!?

A good analogy would be if AMD released a "Phenom Mobile 9550" CPU that was really an Athlon X2. It just doesn't make sense.

RE: Ignorance is bliss
By boogle on 3/5/2009 5:51:49 AM , Rating: 1
If I release product Z that's faster than product A, does it matter which one has what underneath?

In your analogy, if the Phenom Mobile 9550 was faster than the competition, and/or other CPUs from the same brand carrying the Athlon X2 name, would you still have a problem with the name? What about the Sempron, which was just a slightly cut-down Athlon XP? What about the Celeron? What if I had a Celeron that was faster than a Pentium? Should I continue to call the Celeron a Celeron? Would it not be more confusing to have a 'budget' name beating a proper name? Surely it would be less confusing to just brand the oddly fast Celeron a 'Pentium' purely to avoid performance confusion.

I often think NV should just rename the 'codenames' for these cores, so that even though it's still G92 it gets renamed to something like GT200M and launched as a 'new' core. Then no one would complain. It's only this teeny bit of additional knowledge that has people going mental. Never mind that G92 is still competitive...

RE: Ignorance is bliss
By Starcub on 3/4/2009 1:05:32 PM , Rating: 2
Don't you all get tired of explaining to your family and friends how a mobile graphics solution is inferior to its similarly named desktop part? I know I do. Using the same architecture for three named generations??? C'mon....

With more people buying laptops instead of desktops, they may be trying to differentiate the segments a little more. Gamers should really be buying desktops; even most laptop gamers are concerned about heat and battery life, given what has happened with the reliability of gaming laptops (now about a six-year-old market).

I don't see the tick-tock-tock strategy as a bad thing myself; my 8800M GTS is plenty good for most of the games I play.

RE: Ignorance is bliss
By afkrotch on 3/4/2009 3:43:20 PM , Rating: 2
GPUs following a tick-tock-tock-tick strategy has worked for a while now. I think it's less of a problem now than before, with multiple GPU solutions available. Games are becoming more graphically intensive, but not so much that graphics cards can't keep up.

The latest game I've bought is FEAR 2. I'm clocking in at 60 fps (seems like the game is capped at that) at 1920x1200. I'm not using any AA/AF, though. Maybe it's just me, but I can't tell the difference with it on or off. Probably because I'm usually looking at what's right near me and not some wire, fence, or tree way off in the distance. Looking at that crap gets you killed.

RE: Ignorance is bliss
By i3arracuda on 3/5/2009 12:17:39 PM , Rating: 2
I had similar gripes with names like SuperSpeed USB and ExtremeFS, terrible names. Just terrible. What's the next version? Super-Duper speed? Ultra-Mega-ExtremeFS?

What about SuperSpeed USB 2.0 Ultimate Hyper-transporting Special Champion Edition...TURBO.

Hey, it worked for Street Fighter II.

Look at it from Nvidia's shoes...
By roostitup on 3/4/2009 11:53:27 AM , Rating: 2
I don't like what Nvidia has been doing with their renaming either, but what else can they do? They lost a lot of money with their bumping issue and they need to regain their credibility. They probably don't have enough money to design a whole new mobile chip. Granted, this holds the GPU market back from improving, but ATI is also somewhat hurting and hasn't really been pushing the performance limits either. Nvidia has no incentive to improve its GPUs' performance because ATI isn't being as tough a competitor as it could be, though this is slowly changing in ATI's favor.

With this slowing of R&D from Nvidia, ATI has the chance to take the lead, and maybe we will see Nvidia finally pushed off the podium. The G92 is still a good-performing chip that competes well with ATI's offerings, and at least they supposedly fixed the bumping issue. It's deceptive marketing, but it may be all they can do at this moment to correct their poor recent history as efficiently as possible. It makes me wonder whether the deceptive marketing will hurt their credibility as badly as the bumping issue did.

By kilkennycat on 3/4/2009 8:35:36 PM , Rating: 2
They probably don't have enough money to design a whole new mobile chip.

Well, nVidia currently has about $1 billion in the bank and total debt of $26 million, per their last quarterly report, compared with AMD's roughly $2-3 billion in the red (after the foundry spin-off). nVidia is designing a complete family of new-architecture GPUs, but as is said elsewhere, 'when they're done, they're done'. Don't expect any announcements at all until there is working PRODUCTION-YIELD silicon. Smaller-geometry processes bring greater device-modeling uncertainty, tougher power-management problems, and very expensive masking costs, so it is very wise indeed for management not to put any short-focus business pressure on the design cycle.

By Demon-Xanth on 3/4/2009 12:53:02 PM , Rating: 4
When nVidia came out with the 9000 series as a rebadge, I thought "Why are they wasting their time? This seems like a step down, not up.". Then when they came out with the GTX 280/260 I thought "Alright, they came out with a real next generation card."...

...then they announced the GTS 250, and now this. What happened to the company that marched forward relentlessly? The G92 chip went from the 8800GTS, second of the line, to the 9800GTX+, top of the line, to the GTS 250, third of the line, and now the GTX 280M, top of the line again. The only differences have been die shrinks and minor speed bumps.

nVidia, please. Refill the coffee machine in your R&D dept. breakroom.

By MrGaZZaDaG on 3/4/2009 10:47:52 PM , Rating: 3
Doesn't Nvidia do this with every card? Use last year's chip, rebadge it, and sell it for more...

Pretty lame.

Who cares, nVidia will be bankrupt soon anyways
By Mithan on 3/4/09, Rating: 0
By Goty on 3/4/2009 11:09:41 AM , Rating: 2
Riiight, because the high end is where GPU makers make all their money, right? That whole mid/low-range and integrated market gets them nowhere, even though that's where they sell the most GPUs.


Copyright 2016 DailyTech LLC.