
Get ready for an avalanche of new NVIDIA products

NVIDIA's GeForce 8800 GT might be one of the best performance-per-dollar cards since the Radeon 9800, but things are moving fast at NVIDIA, and there's a lot more on the way from the company between now and next summer.

Over the last quarter, the company moved away from the old "Gx" designation for its core names, instead opting to switch to a more descriptive system.  NVIDIA's new codenames follow:
  • D8M: Eighth generation mainstream, previously named G98
  • D8P: Eighth generation performance, previously named G92
  • D9M: Ninth generation mainstream
  • D9P: Ninth generation performance
  • D9E: Ninth generation enthusiast
GeForce 8800 GT, codenamed G92 and D8P, stole the majority of the headlines last week.  The 112 stream processor sub-titan became NVIDIA's first 65nm processor design. However, NVIDIA's real dark horse was the revision of the GeForce 8800 GTS SSC.

GeForce 8800 GTS SSC, as it’s awkwardly called, is essentially identical to the GeForce 8800 GTS based on the 90nm G80 core.  However, where typical 8800 GTS components enable only 96 of the G80 core's 128 stream processors, the 8800 GTS SSC enables 112 stream processors -- the same number featured on the GeForce 8800 GT.

And yet in December, GeForce 8800 GTS is expected to undergo another revision as the company moves from the 90nm G80 core to the 65nm D8P.  Vendors will introduce 112 stream processor and 128 stream processor revisions on D8P, which further convolutes the corporate guidance put forth just a week ago.

NVIDIA will continue to cannibalize the GeForce 8000 series as it moves to 65nm silicon across the board.  GeForce 8400 will likely be the first to go before the end of the year, as the G86 design is replaced by the 65nm D8M silicon, which was previously called G98.

As 2007 comes to a close, the company will ramp production on ninth-generation components to replace the eighth-generation 65nm parts, D8x.  Sound familiar? It should, as NVIDIA is almost exactly replicating Intel's tick-tock strategy of alternating cycles of new design and die shrink.

Early NVIDIA roadmaps claim D9M, the first ninth-generation NVIDIA component, will replace the GeForce 8500-series lineup.  There's no retail designation for these D9x parts, but it would be a safe bet to say these will be the GeForce 9xxx-series cards.

D9M will add PCIe 2.0 support, DirectX 10.1 and wider memory controllers (up to 128 bits), and will be built on 65nm silicon.  D9P, the likely 8600-series replacement, adds the same features as D9M, but its memory controller width will top out at 256 bits.

D9E, the enthusiast component slated to replace the GeForce 8800-series, will incorporate all of the features of D9P and add a 512-bit memory bus. NVIDIA is holding its cards close on D9E, and has not provided any other guidance or release date.

Comments

RE: Will they ever learn?
By themadmilkman on 11/8/2007 3:53:47 PM , Rating: 4
Because it increases their profits, pure and simple. It costs a lot of money to develop these new chips, and the margin on a newly developed chip drops as soon as a better chip is introduced. Holding back the next chip keeps the margin on the current one high for longer, which maximizes profits.

Really simplified version:

Let's say Intel only has two chips on the market at any one time. A high-end new chip, and a low-end previous chip.

It costs $1 to produce each chip.

It costs $90 to develop a new chip.

Intel sells its new chips for $10, and its old chips for $5.

Over a set period of time, Intel sells 10 new chips and 20 old chips. As a result, Intel has made $80: the new chips bring in 10 * $10 = $100, but after deducting $10 in production and $90 in development they contribute nothing; the old chips bring in 20 * $5 = $100, minus $20 in production, for $80 final profit.

Now Intel has a choice. Release a new chip, with the same development costs, or ride out the current chips. Assuming that there is no competition for the top spot, as is the case now, and sales remain constant, Intel would now make $170. Not releasing a new product increases their profits by $90 -- the development cost saved -- for the time period.

Now, this is EXCEPTIONALLY simplified, but it gets the basic point across.
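The arithmetic in the comment above can be checked with a short sketch. The dollar figures are the commenter's hypothetical numbers, not real Intel costs:

```python
# Hypothetical figures from the example above (not real Intel numbers).
UNIT_COST = 1   # $ to produce each chip
DEV_COST = 90   # $ one-time cost to develop a new chip

def profit(units_sold, price, dev_cost=0, unit_cost=UNIT_COST):
    """Profit for one product over the period: revenue minus production and development."""
    return units_sold * (price - unit_cost) - dev_cost

# Scenario 1: release a new chip, paying the $90 development cost.
release = profit(10, 10, dev_cost=DEV_COST) + profit(20, 5)

# Scenario 2: ride out the current lineup, with no new development spend.
ride_out = profit(10, 10) + profit(20, 5)

print(release)   # 80
print(ride_out)  # 170
```

The gap between the two scenarios is exactly the $90 development cost that riding out the current chips avoids.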

RE: Will they ever learn?
By KristopherKubicki on 11/8/2007 3:56:57 PM , Rating: 5
Someone tied up in the executive level of a motherboard manufacturer told me this once:

Selling computers is like selling produce -- every day it sits there its worth less.

RE: Will they ever learn?
By Lifted on 11/8/2007 5:36:48 PM , Rating: 1
That's a bit too simplified. In your example, Intel makes more money by not investing as much in R&D when they have no need to, but in reality Intel's R&D spending hardly ever goes down; it almost always goes up. As the market grows and as profits grow, so will R&D. Even if they had a 100% monopoly on the market, they would have to spend on R&D or else their profits would start to flatten out, as people wouldn't go through the upgrade cycle as frequently as they do now.

RE: Will they ever learn?
By themadmilkman on 11/9/2007 4:45:32 AM , Rating: 2
I did note that it was exceptionally simplified.

RE: Will they ever learn?
By themadmilkman on 11/9/2007 12:17:33 PM , Rating: 2
Perhaps for a more accurate (exceptionally simplified) example, you could replace the R&D one-time costs with retooling costs. It does cost Intel a lot of money to set up plants capable of producing these chips. R&D would then be included in the production costs of each chip. That would perhaps be more accurate.
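The amortization idea in this comment can also be sketched with the same hypothetical numbers: fold the one-time retooling cost into the per-chip cost by spreading it over the volume expected during the period.

```python
# Hypothetical figures, reusing the simplified example's numbers.
RETOOL_COST = 90      # $ one-time retooling cost (stand-in for fixed R&D)
EXPECTED_VOLUME = 10  # chips expected to sell over the period
BASE_UNIT_COST = 1    # $ marginal production cost per chip

# Spread the fixed cost across every chip expected in the period.
amortized_unit_cost = BASE_UNIT_COST + RETOOL_COST / EXPECTED_VOLUME

# Selling at $10 against a $10 all-in unit cost: the new chip breaks
# even, matching the outcome of the earlier simplified example.
profit_per_chip = 10 - amortized_unit_cost

print(amortized_unit_cost)  # 10.0
print(profit_per_chip)      # 0.0
```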

"Well, there may be a reason why they call them 'Mac' trucks! Windows machines will not be trucks." -- Microsoft CEO Steve Ballmer

Copyright 2016 DailyTech LLC.