
Marchitecture at its finest

NVIDIA has the fastest single GPU for the desktop in the GTX 285, a 55nm die-shrunk version of its predecessor the GTX 280. However, ATI has been able to gain a larger market share due to aggressive pricing and ramping of smaller geometries.  This has led to price pressure on NVIDIA, especially in the performance mainstream segment.

NVIDIA's original GT200 chip -- which is used in the GTX 280 and GTX 260 -- is too big, too costly, and consumes too much power to be used effectively in a mobile solution. NVIDIA has already switched to TSMC's 55nm process from the baseline 65nm node to deal with these issues for the GTX 285, but it is still not suitable for the majority of laptop users. Battery life is too short, the cooling fan is too loud, and the cost is too high.

One solution was to begin manufacturing on the 40nm bulk process like ATI has done. According to our sources, NVIDIA's attempts to produce a die-shrunk 40nm GT200 chip were "disastrous at best". Design problems became evident, since the GT200 was originally designed for the 65nm node. Two shrinks in a row without a major redesign was just too much for NVIDIA, and our most recent information from Taiwan is that the first 40nm chips from NVIDIA will be in the GeForce 300 series.

Without a power efficient GT200 based GPU solution for the mobile or mainstream value markets, NVIDIA is rebranding the 55nm G92b chip yet again to meet these critical segments. The original 65nm G92 chip was used in the GeForce 8800 GT, but you can only do so much with an older design. The chip was respun as the G92b with a 55nm die shrink, and is currently used in the 9800 GTX+. All G92 chips are only DirectX 10 capable, and will not support the full feature set of DirectX 10.1 or DirectX 11 that will come with Windows 7.

The problem is that many consumers will pick up a GTX 280M or GTX 260M thinking that it is the same or similar to the GTX 280, when it is actually just a 9800 GTX+.

There is currently no GeForce 100 series for the desktop market, but NVIDIA launched the GeForce GTS 160M and 150M for notebooks at the same time as the GTX 280M and the GTX 260M.
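The rebrandings described in the article boil down to a simple lookup table. The sketch below is illustrative only, not an official part list; the table contents come from the article's claims, and the helper function name is ours:

```python
# Marketing name -> (underlying chip, what it actually is), per the article.
REBRANDS = {
    "GTX 280M": ("G92b", "rebranded 9800 GTX+ (55nm)"),
    "GTX 260M": ("G92b", "rebranded 9800 GTX+ (55nm)"),
    "9800 GTX+": ("G92b", "55nm respin of the G92"),
    "8800 GT": ("G92", "original 65nm G92 part"),
}

def underlying_chip(marketing_name: str) -> str:
    """Return the silicon behind a marketing name, or 'unknown'."""
    chip, _description = REBRANDS.get(marketing_name, ("unknown", ""))
    return chip

# The article's point in one line: a "GTX 280M" is not GT200 silicon
# like the desktop GTX 280 -- it is the same G92b as the 9800 GTX+.
assert underlying_chip("GTX 280M") == underlying_chip("9800 GTX+")
```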


Stupid Stupid
By Belard on 3/4/2009 9:26:34 AM , Rating: 5
Is Nvidia racing to out-do itself every 6 months with stupid re-branding of the same chip?! Geez, it was bad when the 8800 GTS 512 came out.

How about the GF2-MX, which was re-branded GF2-mx200 and mx400 (The mx200 was a slower version of the already low-end mx).

So with another re-branded G92 chip... does it or does it not have the design flaw? Or is the GTX for notebooks supposed to make us feel safer using it, or are we taking a chance on video death 6~18 months later?

If Nvidia had half a brain, they would have re-branded these older chips as GTX 1x0... it would have made a bit more sense... rather than using 3 generations of names for the same bloody thing.

RE: Stupid Stupid
By inperfectdarkness on 3/4/2009 9:37:12 AM , Rating: 5
going to buy ati in my next laptop. at least i know i can run 10.1 that way.

RE: Stupid Stupid
By boogle on 3/4/09, Rating: 0
RE: Stupid Stupid
By StevoLincolnite on 3/4/2009 1:37:33 PM , Rating: 5
Which games will run acceptably using the DX10.1 features on a standard laptop? Hell, the number of DX10.1 titles is minimal at best making it even less likely you'll ever actually use the 'features' 10.1 offers over 10.

That's his choice as a consumer. However, there have been a few cases in the past where buying a chip with superior performance but a feature set a generation behind meant the card would be useless for newer-generation games, despite having ample processing power to run them.

One case where this happened was the X8xx series of GPUs from ATI, where SM3-only games like Bioshock wouldn't run on them.
What happened? The community got together and re-wrote the game's shader packages (Shader Shock).

Or how about when TnL became mainstream, which left out the bulk of Intel IGPs, the Radeon VE/LE, and the Voodoo 4 and 5 GPUs, even though they were performance-wise capable of running the games?

Or when games started needing pixel shaders and the GeForce4 MX440, one of the best-selling GPUs of all time, was incapable of running such games, despite being faster than the GeForce FX 5200?

Performance can always be increased by reducing the game's quality settings, driver tweaking, or overclocking, but there is only so much you can do when feature sets aren't available. Not everyone has the cash or even the desire to upgrade every 12 months as soon as something better comes along.

RE: Stupid Stupid
By nafhan on 3/5/2009 11:58:10 AM , Rating: 2
Yep, I can't run rainbow six vegas 2 on my older computer because it has an x800 gto.

RE: Stupid Stupid
By aegisofrime on 3/4/2009 9:51:10 AM , Rating: 2
As much as I like ATI (I'm running a 4870 myself), I really wish that ATI would get their GPGPU efforts up to speed with nVidia. I'm looking enviously at Badaboom while ATI only has that crappy AVIVO converter which I heard doesn't even use the GPU and outputs crappy quality. And they don't even have it out for 64-bit Vista!

RE: Stupid Stupid
By inperfectdarkness on 3/4/2009 12:43:57 PM , Rating: 4
i'd like to see the 4780m overclocked, die shrunk to 40mm, given 1gb of gddr5, and offered in an "all in wonder" configuration.

i'd also like a winning lottery ticket...but that won't happen either.

RE: Stupid Stupid
By GodisanAtheist on 3/4/2009 4:51:59 PM , Rating: 2
That's really not that far fetched...

die shrunk to 40 mm

...if we've somehow discovered how to shrink things to a much larger size...

RE: Stupid Stupid
By FaceMaster on 3/4/2009 6:00:58 PM , Rating: 5
Yes, we understand he made a mistake. No need to be so gay about it.

RE: Stupid Stupid
By GodisanAtheist on 3/4/09, Rating: -1
RE: Stupid Stupid
By FaceMaster on 3/5/2009 7:46:41 AM , Rating: 4
I can't help being who I am.

That's just an excuse people use to justify making the same mistakes over and over again. Anybody can learn. It's just that most people are too lazy.

RE: Stupid Stupid
By GodisanAtheist on 3/11/2009 8:40:10 PM , Rating: 1
Anybody can learn

... to not be gay? Cause that's kinda the joke I was going for.

RE: Stupid Stupid
By FaceMaster on 3/17/2009 6:53:21 PM , Rating: 1
... to not be gay? Cause that's kinda the joke I was going for

I don't mind gay people. I just can't stand it when PEOPLE FLAUNT IT. You've now made two posts revolving directly around gayness. Good for you. This isn't a dating site though. So stop with these sex-based comments.

And so should your Mum (/irony)

RE: Stupid Stupid
By Fox5 on 3/4/09, Rating: 0
RE: Stupid Stupid
By therealnickdanger on 3/4/2009 9:55:35 AM , Rating: 3
NVIDIA's problem, I think, is that it believes it is controlling the market. While it is a KEY player, perhaps even the biggest player, it's not enough to shape the market as it sees fit. NVIDIA is rolling the dice, expecting that if it can just get one or two of its proprietary* technologies (like CUDA or PhysX) to become mainstream, then it can sit back and devour royalties.

Too bad for NVIDIA that Microsoft is leading the charge when it comes to graphics APIs. Is DX11 still going to feature "DirectPhysics"? With Intel also moving toward SoC and more graphics/physics enhancements to their CPUs, NVIDIA's dice rolling will eventually come up snake eyes...

RE: Stupid Stupid
By winterspan on 3/4/2009 4:59:16 PM , Rating: 1
Yep, and I don't believe either will succeed.

CUDA won't survive long with Apple's OpenCL push and MS's DirectX11 "compute shader"...

And while I believe PhysX has been somewhat successful, there is no way the industry will accept an Nvidia-only physics solution over the long term, particularly since Nvidia lost so much enthusiast momentum once ATI got its act together with the excellent RV770 parts.

RE: Stupid Stupid
By SiliconDoc on 3/14/09, Rating: 0
RE: Stupid Stupid
By murphyslabrat on 3/4/2009 9:59:36 AM , Rating: 2
It's not terribly different from what Nvidia has been doing with its mobile GPUs to date. For instance, I'm using a laptop with a GeForce 8800M GTS. In reality, what I have is a 64 SP card with a shader clock of 1250 MHz, making it barely competitive with a desktop 9600 GT.

The 9600M GT, on the other hand, is basically a mildly overclocked 8600M with a die-shrink.

RE: Stupid Stupid
By aguilpa1 on 3/4/2009 10:00:51 AM , Rating: 5
Yup, it's stupid and somewhat criminal in that, through marketing, they are attempting to fool new buyers or the uninitiated into thinking they are getting a "new" product. I wouldn't mind so much if Nvidia was just honest, called a goose a goose, and quit slapping lipstick on it every 6 months and giving it a different name. (Weird mental image, perhaps a bad analogy, seeing as geese have no lips.)

RE: Stupid Stupid
By therealnickdanger on 3/4/09, Rating: -1
RE: Stupid Stupid
By Reclaimer77 on 3/4/09, Rating: -1
RE: Stupid Stupid
By kattanna on 3/4/2009 11:36:51 AM , Rating: 5
The problem with that line of thought, though, is that people looking at your 300C from 2008 are not being told that in 2009 it's a new 400C model, when in reality it's the same 300C.

GPU makers have for many years used advancing model numbers to show new tech levels (GeForce 5xxx, 6xxx, 7xxx, 8xxx...), with numbering within that range to show higher- and lower-end versions of that tech generation, e.g. a 6800 being a higher-end model than a 6600.

Now Nvidia released new GPU tech in the GTX 260/280 parts; the tech level is even called the GT200 chip. Then they try to introduce "new" cards based on older tech and call one the GTS 250, clearly implying it's a lower-end part of the new GT200 tech.

Is it the fault of the consumer for not fully researching their purchase so they don't get "conned"? Yes, it is.

But it's also clearly deceptive advertising on Nvidia's part, trying to confuse what its products are.
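The naming convention described above (first digit marks the generation, the remaining digits mark the tier within it) can be sketched as a tiny parser. This is a hypothetical helper of our own, assuming the classic four-digit GeForce scheme:

```python
def parse_geforce_model(model: str) -> tuple[int, int]:
    """Split a classic four-digit GeForce model number into
    (generation, tier), e.g. '6800' -> (6, 800)."""
    digits = "".join(ch for ch in model if ch.isdigit())
    if len(digits) != 4:
        raise ValueError(f"not a four-digit model number: {model!r}")
    return int(digits[0]), int(digits[1:])

# A 6800 and a 6600 share a generation; the 6800 is just a higher tier.
gen_6800, tier_6800 = parse_geforce_model("6800")
gen_6600, tier_6600 = parse_geforce_model("6600")
assert gen_6800 == gen_6600 and tier_6800 > tier_6600
```

This is exactly the expectation a name like "GTS 250" trades on: the number reads as GT200-generation silicon even though the chip underneath is the older G92b.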

RE: Stupid Stupid
By Reclaimer77 on 3/4/09, Rating: -1
RE: Stupid Stupid
By ninjaquick on 3/4/09, Rating: -1
RE: Stupid Stupid
By afkrotch on 3/4/2009 2:57:21 PM , Rating: 1
So what exactly do you consider a performance difference? The new card will be smaller in physical size (if you get the 1 GB card), and it's a much better yield for the GPU, so it'll run cooler and consume less power; under idle/full load it's using roughly 40W less.

I personally couldn't care less that they named it a GTS 250. I know what it is, you know what it is. Morons won't know, but hell. They wouldn't even know what a 9800GTX+ is anyways.

RE: Stupid Stupid
By therealnickdanger on 3/4/2009 12:14:26 PM , Rating: 3
I dunno, I think there are a lot of things that "don't make sense" to us on the outside looking in. To assume it's a con game or intentional deception is too pessimistic for my tastes. NVIDIA doesn't want to have three cards on the market (8800, 9800, GTS 250); it wants its brand to have one (perceived) line at a time.

Look at it this way:

NVIDIA makes a GREAT card, the 8800, then continues (as usual) to improve the manufacturing side of it and then wants to introduce some better mid- and low-end parts for a new 9000-series lineup. Well, do they bring out the 9600GT and leave the 8800GT on the market? Think of all the ignorant customers that would buy a 9600GT instead of an 8800GT because the number is higher! That would not only hurt NVIDIA's reputation, but also their bottom line (high-end cards provide more margin).

The solution? "Well, we don't have a new architecture for our high-end card yet, so lets rebadge the 8800GT as the 9800GT so that our customers know that 9800 > 9600 instead of 9600 > 8800."

So that carried them through the 9000-series until the GT-series hit. OK, so now they have these huge, badass graphics cards on the high end, but they have nothing in the GT-series for the mid or low end. Once again, they have a numbering problem. In the ignorant consumers' eyes, they see 8800/9800 > 260/280. Why not clear up the confusion and rebadge a clear and solid winner to be a placeholder in their GT lineup? It helps ensure that consumers get what they expect. Let's face it, they assume a higher number = a better product.

Obviously, NVIDIA knows that you and I know the difference (or lack thereof) between the 8800/9800/GTS 250. NVIDIA also has shareholders that demand performance. If the GT line isn't successful because they don't have a mainstream part ON PAPER, then it looks bad ON PAPER, and that could put NVIDIA in a difficult situation for their next architecture.

NVIDIA's only error in all this, IMO, is failing to discontinue previous models when the new ones hit.

For the record, I've been using and preferring mostly ATI cards for the past 5 years.

RE: Stupid Stupid
By Mr JB on 3/4/2009 12:27:20 PM , Rating: 2
We live in a world of small margins, so I can't blame Nvidia for trying to get more return on their investment. I am of the mind, though, that I would hate for it not to go to a good cause. Before anyone points it out: I don't mean the fat cats' pockets.

RE: Stupid Stupid
By kattanna on 3/4/2009 10:42:32 AM , Rating: 3
If my memory is correct, they are also using that same G92 chip in a new desktop product called the GTS 250, leaving people to assume, incorrectly, that it's a more mainstream 260 board.

When in fact all it is is a repackaged 9800 card.

Nvidia is really making me wonder nowadays.

RE: Stupid Stupid
By afkrotch on 3/4/2009 3:08:53 PM , Rating: 2
It's because AMD was a lot more agile this architecture. AMD hit all the markets with their cards, while Nvidia was just sitting at the high end. They simply aren't ready to release a low end yet, so as a band-aid fix, they repackaged the 9800GTX+ to fill that void.

I doubt the GTS 250 will last very long once actual low-end GT/GTS/GTX cards come out. I mean, the 9800GTX lasted what, a couple months before the 9800GTX+ came out?

Nvidia would like nothing more than to be on one architecture, but they just aren't ready to do it yet.

RE: Stupid Stupid
By Motoman on 3/4/2009 11:32:59 AM , Rating: 5
...I think Nvidia thinks the average consumer is dim enough to buy an old product with a new name on it.

...and they're probably right.

RE: Stupid Stupid
By TSS on 3/4/2009 11:41:56 AM , Rating: 2
Now now, they are making the life of the consumer difficult, but by no means is this rebranding stupid.

If you couldn't produce a worthy competitor, wouldn't you try to confuse the public as much as possible to still get sales? It doesn't matter if you rebrand old products; it matters whether people still buy them.

They're probably pulling every trick they have to keep their market share afloat until their next GPUs. And on this point, if the next generation fails to beat ATI, I think we'll see Nvidia being bought by Intel.

Personally I'm on the verge of buying a new PC. I'm kind of done with ATI drivers (I miss the GeForce Ti 4600 times... never ever had problems with those drivers), but I'm not convinced by Nvidia's products right now, or their business ethics. So I'll probably wind up with an HD4870.

On a side note though, let's not forget that actually making a GPU is still complicated as hell. You can't really hold it against them that they kept rebranding as long as they possibly could.

RE: Stupid Stupid
By Bateluer on 3/4/2009 11:51:57 AM , Rating: 2
I was going to buy a 9800M powered notebook, but now I've shifted gears to looking at a Radeon 4850 powered notebook. Given my time table, this could easily be a 40nm 4860 powered notebook in Q2 09.

I've had great experiences with ATI card on my desktop, and my first three laptops all used Radeons as well, the original Mobility Radeon, a MR9000, and an MR9600.

RE: Stupid Stupid
By poundsmack on 3/4/09, Rating: -1
RE: Stupid Stupid
By boogle on 3/5/2009 5:44:59 AM , Rating: 3
I can only assume a lot of people here also have a problem with CPUs being offered at different speeds and price points - despite being the exact same core.

Although oddly, CPUs sometimes do the reverse and change the core a fair amount, yet still brand it under the same name.

Either way I don't see what the problem is - as long as the product performs as it should, the underlying technology matters little to me. It's not like a game gives a popup 'Sorry, but while your GPU will run this game fine, its been given a name we don't like, so this game will now close. Buy ATI'.

Now for the obligatory DailyTech anti-minus a million rating: The current underdog (NV/ATI whoever) rocks. The company (NV/ATI whoever) with currently good financials is rubbish and doesn't know anything about putting out a good product.

RE: Stupid Stupid
By SiliconDoc on 3/13/09, Rating: -1

Copyright 2016 DailyTech LLC.