
Get ready for an avalanche of new NVIDIA products

NVIDIA's GeForce 8800 GT might be one of the best performance-per-dollar cards since the Radeon 9800, but things are moving fast at NVIDIA and there's a lot more on the way from the company between now and next summer.

Over the last quarter, the company moved away from the old "Gx" designation for its core names, instead opting to switch to a more descriptive system.  NVIDIA's new codenames follow:
  • D8M: Eighth generation mainstream, previously named G98
  • D8P: Eighth generation performance, previously named G92
  • D9M: Ninth generation mainstream
  • D9P: Ninth generation performance
  • D9E: Ninth generation enthusiast
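The new names are self-describing: "D" for the family, one digit for the generation, and one letter for the market segment. As a rough illustration of how the scheme reads (the decoder below is our own shorthand inferred from the list above, not anything NVIDIA publishes):

```python
# Hypothetical decoder for the "D<generation><segment>" codename scheme
# described above. The segment mapping is inferred from the list, not official.
SEGMENTS = {"M": "mainstream", "P": "performance", "E": "enthusiast"}

def decode(codename: str) -> str:
    """Turn a codename like 'D8P' into plain English."""
    generation = int(codename[1])    # '8' or '9'
    segment = SEGMENTS[codename[2]]  # M/P/E -> market segment
    return f"{generation}th generation {segment}"

print(decode("D8P"))  # -> 8th generation performance
print(decode("D9E"))  # -> 9th generation enthusiast
```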
GeForce 8800 GT, codenamed G92 and now D8P, stole the majority of the headlines last week.  The 112 stream processor sub-titan became NVIDIA's first 65nm processor design. However, NVIDIA's real dark horse was the revised GeForce 8800 GTS SSC.

GeForce 8800 GTS SSC, as it's awkwardly called, is essentially identical to the GeForce 8800 GTS based on the 90nm G80 core.  However, where typical 8800 GTS components enable only 96 of the 128 stream processors on the G80 core, the 8800 GTS SSC enables 112 stream processors -- the same number featured on the GeForce 8800 GT.

Yet in December, GeForce 8800 GTS is expected to undergo another revision as the company moves from the 90nm G80 core to the 65nm D8P.  Vendors will introduce 112 stream processor and 128 stream processor revisions on D8P, which further convolutes the corporate guidance put forth just a week ago.

NVIDIA will continue to cannibalize the GeForce 8000 series as it moves to 65nm silicon across the board.  GeForce 8400 will likely be the first to go before the end of the year, as the G86 design is replaced by the 65nm D8M silicon, which was previously called G98.

As 2007 comes to a close, the company will ramp production on ninth-generation components to replace the eighth-generation 65nm parts, D8x.  Sound familiar? It should, as NVIDIA is almost exactly replicating Intel's tick-tock strategy of alternating cycles of design and shrink.

Early NVIDIA roadmaps claim D9M, the first ninth-generation NVIDIA component, will replace the GeForce 8500-series lineup.  There's no retail designation for these D9x parts, but it would be a safe bet to say these will be the GeForce 9xxx-series cards.

D9M will add PCIe 2.0 support, DirectX 10.1 and wider memory controllers (up to 128 bits), and will be built on 65nm silicon.  D9P, the likely 8600-series replacement, adds the same features as D9M, but its memory controller width will top out at 256 bits.

D9E, the enthusiast component slated to replace the GeForce 8800-series, will incorporate all of the features of D9P and add a 512-bit memory bus. NVIDIA is holding its cards close on D9E, and has not provided any other guidance or release date.
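Those bus widths translate directly into peak memory bandwidth, which is why the 512-bit figure is notable. A back-of-the-envelope sketch (the 1800 MHz effective memory clock below is an illustrative assumption, not a quoted spec):

```python
def peak_bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

# Doubling the bus from D9P's 256-bit cap to D9E's 512 bits doubles
# bandwidth at the same memory clock:
print(peak_bandwidth_gbs(256, 1800))  # -> 57.6 GB/s
print(peak_bandwidth_gbs(512, 1800))  # -> 115.2 GB/s
```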


Will they ever learn?
By AvidDailyTechie on 11/8/2007 3:00:17 PM , Rating: 4
What's the deal with nVIDIA, and Intel for that matter, not pushing tech forward just because they're in the lead. I've seen both companies push as hard as they can when they're behind (7950 GX2, P4)... It's like they're complacent with being just a bit 'better'? Why not completely blow the competition out of the water??

I know nVIDIA could do a high-end refresh much sooner than it's scheduled (if at all), probably whenever they wanted...

Same goes for Intel; I know they're not pushing the clocks of their new chips, instead releasing very slight performance increases while taking more profit at the expense of the consumer.

I guess that's business.

RE: Will they ever learn?
By AvidDailyTechie on 11/8/2007 3:01:26 PM , Rating: 2
I like the old names better.

RE: Will they ever learn?
By JarvisTheGray on 11/8/07, Rating: -1
RE: Will they ever learn?
By SavagePotato on 11/8/2007 3:50:22 PM , Rating: 3
Apparently in your world Intel released a totally different Netburst architecture that was awesome, and not in fact a total piece that hit a brick wall in the massive MHz push they designed it for.

Intel was indeed stagnant and it took the Athlon 64 to shake them up and get back on track with the core architecture.

RE: Will they ever learn?
By murphyslabrat on 11/12/2007 10:57:22 AM , Rating: 4
Not sure if I correctly understood where you were going with this, but Intel was far from behind when they launched the P4. And the same goes for Core 2 (aka P5).
A clock-speed-equivalent comparison between the Pentium III and the Athlon Thunderbird has Thunderbird coming out on the upper side of dead even.
We didn't have a difficult time seeing why the AMD Athlon processors–particularly the new Thunderbird chips–are giving Intel fits and starts. The Xi 1100K MTower SP is fast, loaded, and a lot cheaper than comparably equipped Pentium III PCs. And that's a winner in our book...As configured for our testing (details below), the unit costs just $2,895 (direct)–and that includes a 19-inch monitor, a 64MB graphics board, and a totally digital audio subsystem. The two vendors were expected to set the price for a comparably equipped 1.13-GHz PIII at roughly $4,000.

While, apparently, AMD and Intel were dead even in terms of performance, AMD did have a definitive cost advantage -- i.e., AMD had a one-up on Intel.

As to the A64 vs P4? Are you kidding me?!?!?!
That's a comparison between a 3.2GHz P4 vs an AXP, let alone an A64.
As for A64, I hope that this clears up any confusion.

There is a reason why Intel stuck with netburst for so long, because it performed

No, it was because it had very high clockspeeds.
This article is a very detailed analysis, and you can probably skip to the test setup and results straightaway. However, this is a showcase of one of the greatest crimes Intel has perpetrated against the consumer: they laid aside a product that was able to keep a shaky pace with an Athlon 1.4GHz and a Pentium 4 1.8GHz. I happen to have built two PCs with PIII-S's (the 1.266 variant that was discussed here; I bought a pair off of eBay for $30 w/ S&H ^^j), and love them. The only trouble I ever had was with motherboard compatibility. Intel screwed the customer over in one of the most heinous manners imaginable: they made the superior product (cheaper to produce, cheaper to run, more powerful clock-for-clock) artificially expensive to purchase, in addition to -- again, artificially -- requiring a new motherboard. Can it get any worse than that?!?!?!?

and by the end of 2001 my friend's mom had bought a 3.0Ghz P4 and my machine was obsolete.

Damn, I wish I was your friend's mom, as the 3.0GHz Northwood didn't come out till November 2002!

RE: Will they ever learn?
By MetaDFF on 11/8/2007 3:47:01 PM , Rating: 5
Blowing the competition out of the water is good for the press but not necessarily good for the bottom line. Why introduce a new product when you can use the old one to recoup research and development costs?

Chips are increasingly more difficult and expensive to design, so it makes sense for them to profit from it over the longest period of time possible.

RE: Will they ever learn?
By SavagePotato on 11/8/2007 4:52:00 PM , Rating: 2
Intel is still pushing plenty hard at this point in my opinion. Nehalem will be out as early as next summer. They aren't really ramping the clock speeds up that high but nonetheless the performance steadily increases.

Penryn is just starting to become available and marks a pretty solid performance gain over the previous offerings. Nehalem will probably be an even more revolutionary step. When you consider how quickly their tick-tock strategy moves along, it seems plenty fast, at least to me.

As far as video cards go, I am honestly a little relieved to see them slow down a bit. The 8800GTX has been a top enthusiast part for an entire year; that's quite unprecedented. It's gotten to the point where at least these $600 and $700 parts are seeing a solid 2-year-plus lifespan. Myself, I've been using a 7900GTX for close to 2 years now, and I'm finally starting to feel it lacking in games like UT3, but compared to past cards it has held out very well for 2 years. The 8800GTX will probably be even better, considering it is still to date the highest-performing single-card solution.

RE: Will they ever learn?
By retrospooty on 11/8/2007 8:56:50 PM , Rating: 2
What he means is that Penryn was going to debut at 3.6GHz per Intel 6 months ago, and based on many sites' early OC results it easily could (anyone that has tried is hitting 3.6 without increasing voltage or running hot at all), but since AMD has nothing to offer we are getting 3.16 and 3.2GHz parts at the high-end price points. This is exactly what they did in the past before the Athlon came along.

RE: Will they ever learn?
By SlyNine on 11/9/2007 2:21:38 AM , Rating: 2
I think the 9700 Pro had one of the best lifespans ever.

RE: Will they ever learn?
By StevoLincolnite on 11/9/2007 4:49:48 AM , Rating: 2
The Voodoo did; with some modifications to Doom 3 and Quake 4, they could be run on a Voodoo 2 12MB card which was released in 1998. (Quake 4 was released in 2005 -- so 7 years?)

Otherwise the Geforce 256, which was supported in games like Half Life 2 and FarCry without any modifications to the game.
Or perhaps the Geforce 3 Ti200 Which can run Oblivion with the help of OldOblivion?

RE: Will they ever learn?
By 3kliksphilip on 11/9/2007 8:47:37 PM , Rating: 3
I agree completely. It was the first DX9 card out, it whipped the GeForce 4s, and it ran any game out (including the big names, such as Half-Life 2, Doom 3 and Far Cry). Upgrading to DX10 requires Vista, and as most people are happy on XP, the DX9 cards are still effective. Plus the XBOX 360, Wii and PS3 aren't DX10 compatible.

Okay, some of the things I have said are generalised, but you get my drift that the ATI 9700 was a great product which was launched at the right time. I wish I hadn't got an ATI Radeon 9600...

RE: Will they ever learn?
By themadmilkman on 11/8/2007 3:53:47 PM , Rating: 4
Because it increases their profits, pure and simple. It costs a lot of money to develop these new chips, and the profit margin on a newly developed chip would be reduced by the introduction of a better one, so holding back allows them to maximize profits.

Really simplified version:

Let's say Intel only has two chips on the market at any one time. A high-end new chip, and a low-end previous chip.

It costs $1 to produce each chip.

It costs $90 to develop a new chip.

Intel sells its new chips for $10, and its old chips for $5.

Over a set period of time, Intel sells 10 new chips and 20 old chips. As a result, Intel has made $80 (10*$10 = $100 in revenue from new chips; deduct $10 production and $90 development and no money is made on them; 20*$5 = $100 from old chips, minus $20 production, is $80 final profit).

Now Intel has a choice. Release a new chip, with the same development costs, or ride out the current chips. Assuming that there is no competition for the top spot, as is the case now, and sales remain constant, Intel would now make $170. Not releasing a new product increases their profits by $70 for the time period.

Now, this is EXCEPTIONALLY simplified, but it gets the basic point across.
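The two scenarios above can be checked in a few lines (numbers lifted straight from the example; this is the same toy model, not real Intel financials):

```python
# Toy profit model from the post above: $1 unit cost, $90 development
# cost for a new chip, new chips sell for $10, old chips for $5.
UNIT_COST = 1
DEV_COST = 90
NEW_PRICE, NEW_SOLD = 10, 10
OLD_PRICE, OLD_SOLD = 5, 20

def profit(dev_spend: int) -> int:
    """Total profit for the period, given how much was spent on development."""
    new_chip_profit = NEW_SOLD * (NEW_PRICE - UNIT_COST) - dev_spend
    old_chip_profit = OLD_SOLD * (OLD_PRICE - UNIT_COST)
    return new_chip_profit + old_chip_profit

print(profit(DEV_COST))  # release a new chip: $80
print(profit(0))         # ride out the current lineup: $170
```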

RE: Will they ever learn?
By KristopherKubicki on 11/8/2007 3:56:57 PM , Rating: 5
Someone tied up in the executive level of a motherboard manufacturer told me this once:

Selling computers is like selling produce -- every day it sits there its worth less.

RE: Will they ever learn?
By Lifted on 11/8/2007 5:36:48 PM , Rating: 1
That's a bit too simplified. In your example, Intel makes more money by not investing as much in R&D when they have no need to, but in reality Intel's R&D spending hardly (if ever) goes down, always up. As the market grows and as profits grow, so will R&D. Even if they had a 100% monopoly on the market, they would have to spend on R&D or else their profits would start to flatten out as people wouldn't do the upgrade cycle as frequently as they do now.

RE: Will they ever learn?
By themadmilkman on 11/9/2007 4:45:32 AM , Rating: 2
I did note that it was exceptionally simplified.

RE: Will they ever learn?
By themadmilkman on 11/9/2007 12:17:33 PM , Rating: 2
Perhaps for a more accurate (exceptionally simplified) example, you could replace the R&D one-time costs with retooling costs. It does cost Intel a lot of money to set up plants capable of producing these chips. R&D would then be included in the production costs of each chip. That would perhaps be more accurate.

RE: Will they ever learn?
By Nik00117 on 11/8/2007 4:27:33 PM , Rating: 2
The deal is this: they are ahead in the game and are arrogant about that fact. They are the big boys on the block.

However here you got ATI and AMD working their buns off to beat them.

ATI and AMD will overtake NVIDIA and Intel, NVIDIA and Intel will then overtake ATI and AMD, and the cycle will simply repeat.

RE: Will they ever learn?
By DeepBlue1975 on 11/8/2007 6:28:23 PM , Rating: 3
I disagree about Intel.
Intel is now showing what it can actually do, in contrast to what it was doing when AMD could leave it in the shade against the crappy Netburst architecture, whose sole purpose was to crank up MHz like crazy regardless of efficiency and final performance.

Core 2 was a great architecture at launch, Penryn will improve it in an interesting way, and Intel is already talking about 2008's Nehalem. You know, they can't release a new from-the-ground-up architecture every 2 months :D

Nvidia is another story: after releasing something new every 6 months, they went to sleep for a whole year. The 8800 GT is a very nice refresh, though, and I'll switch my aging X800 XL for one of those as soon as I can get one where I live :D

ATI/AMD are mostly excused because of the takeover and AMD's financial position after that... though I really hate that they pushed back Barcelona's launch 3 times in a row, resulting in almost a full year's delay.

I definitely want AMD / ATI back pushing and competing in the high end... In the meantime, my CPU is from Intel and in a few months my GPU will be from Nvidia.

RE: Will they ever learn?
By kilkennycat on 11/8/2007 7:28:57 PM , Rating: 2
D9E is the true next-gen high-end GPU. Not at all a "simple refresh" of existing technology. A completely new design. It will have the functionality of both GPU and GPGPU with the double-precision data paths needed by the GPGPU and all the HD decoding attributes of the new 65nm 8800GT (G92). Expect other goodies like full DX10.1 functionality and maybe PCIe 3.0 functionality also. Mid-2008 would be my guess.

RE: Will they ever learn?
By rudy on 11/8/2007 10:53:16 PM , Rating: 2
Maybe it is not so simple. If you have the better product, how do you make it better? How do you know what level you have to hit to "blow the competition away"? Sometimes when there is no better competition it is difficult to innovate and really push ahead, because you don't know where to go.

Look at online gaming: you see these guys who are not experienced with clans sit in servers for years and always beat their buddies, but never really hit a new level. Then when they are enlightened to really hardcore gaming, they step up their game. It's not like they didn't want to crush their competition; they just did not have any good competition to measure against and learn how to improve, how to react, how to take the next step.

Competition really forces you to perform.

RE: Will they ever learn?
By opterondo on 11/9/2007 1:33:20 AM , Rating: 3
Are you joking?

Intel is pushing harder than AMD their direct competitor.

nVidia is pushing harder than AMD their direct competitor.

I think it is obvious who is "not pushing tech forward".

RE: Will they ever learn?
By mikefarinha on 11/9/2007 1:59:15 PM , Rating: 4
Just because Intel or nVidia is pushing harder than AMD doesn't mean they are pushing their hardest.

AMD is a relatively small company that is a competitor to both Intel and nVidia. They can only push so hard.

More descriptive?
By 16nm on 11/8/2007 2:35:38 PM , Rating: 2
instead opting to switch to a more descriptive system. NVIDIA's new codenames follow:

<!--[if !supportLists]--> <!--[endif]-->D8M: Eighth generation mainstream, previously named G98
<!--[if !supportLists]--> <!--[endif]-->D8P: Eighth generation performance, previously named G92
<!--[if !supportLists]--> <!--[endif]-->D9M: Ninth generation mainstream
<!--[if !supportLists]--> <!--[endif]-->D9P: Ninth generation performance
<!--[if !supportLists]--> <!--[endif]-->D9E: Ninth generation enthusiast

So this is what more descriptive looks like? LOL. Someone needs to edit this out of the article.

RE: More descriptive?
By TomZ on 11/8/2007 2:37:04 PM , Rating: 2
Yeah, codenames that include "supportLists" etc. will surely add more confusion. :o)

RE: More descriptive?
By KristopherKubicki on 11/8/2007 2:45:43 PM , Rating: 2
We're having all sorts of document engine problems this week. We're fixing it -- slowly.

RE: More descriptive?
By ninjit on 11/8/2007 2:39:29 PM , Rating: 1
Seems like your browser isn't displaying the "bullet" properly

And no they didn't change the article since your post, I read it first before there were any comments.

What browser are you using?

RE: More descriptive?
By Master Kenobi on 11/8/2007 2:43:16 PM , Rating: 4
IE7 :P

RE: More descriptive?
By NEOCortex on 11/8/2007 3:07:08 PM , Rating: 2
Aren't the references to D8M in the second-to-last paragraph supposed to be D9M? It seems like things might make a tiny bit more sense that way.

Or maybe it's just me, and this is really a sign I should get out of computer gaming......

RE: More descriptive?
By KristopherKubicki on 11/8/2007 3:13:51 PM , Rating: 1
Yep, sorry about that.

Great!
By Spivonious on 11/8/2007 3:00:17 PM , Rating: 2
I actually like this change. The old way never made much sense to me.

And just to confirm - this is just for codenames, right? I doubt they'd mess up their well-known product naming scheme (6000, 7000, 8000, etc.).

RE: Great!
By Lonyo on 11/8/2007 3:19:01 PM , Rating: 2
Well, they will have the same problems as ATi in 2 generations' time :P So something's going to happen.

RE: Great!
By slashbinslashbash on 11/8/2007 3:47:38 PM , Rating: 2
Yeah, and I'm actually wondering if we will see an Nvidia 9600, 9700, or 9800 since those were such big hits for ATI a few generations ago. I wonder why Nvidia went with the 4-digit numbers anyway. They had a good thing going with the GeForce 2, 3, 4, etc. then jumped to the 5900/5200/etc. family. Yes, the GeForce 4's had 4-digit numbers too, e.g. GeForce 4 Ti 4200, but it was still GeForce 4.

Actually now I wonder if they will just change their branding from GeForce to something else. They've been riding that name pretty long now -- 8 generations of product. Intel went through 4 "numbers" of Pentium families before switching to the Core nomenclature (although there were at least 9 codenames/architectures that I can think of... Pentium, Pentium MMX, Pentium Pro, PII, PIII, PIII Coppermine, PIII Tualatin, P4 Willamette, P4 Northwood, P4 Prescott).

RE: Great!
By murphyslabrat on 11/12/2007 11:17:15 AM , Rating: 2
I think the reason for the nomenclature switch was to avoid association with the Pentium 4.

RE: Great!
By tedrodai on 11/8/2007 3:33:00 PM , Rating: 2
The M, P, and E make more sense than their old numbering system, but where does the D come from? At the moment, it's just sending me the subliminal message that I'm too Dense to understand it.

RE: Great!
By KristopherKubicki on 11/8/2007 3:52:36 PM , Rating: 3
I think the D is for desktop. M would be for mobile.

By SlyNine on 11/8/07, Rating: 0
By HeelyJoe on 11/8/2007 9:21:35 PM , Rating: 2
SLI never really has been, and probably never will be, a cost-effective solution in most cases.

The only time I really see it as being useful is in systems where money really doesn't matter as much.

I agree, though, it would be nice if that weren't the case.

By retrospooty on 11/8/2007 9:45:19 PM , Rating: 3
SLI was never really a great upgrade path. It's always been semi-buggy, and using 2 cards takes a lot of power. Keep in mind that every 6 months or so (except this past year) a new-gen card is out, and it's usually as fast if not faster than previous-gen SLI at a lower total cost.

For example, why would you have bought 2x 6800s when one 7800 came out? Likewise, one 8800GTX is far faster than 2 7900GTXs, so the use is always limited as an upgrade path. It's really an enthusiast thing.

By SlyNine on 11/9/2007 2:12:08 AM , Rating: 2
I guess I had always hoped that you could pick up your $400 card when it comes out, and then when the price drops to $120 you could buy another one as a stopgap to avoid upgrading every generation.

By retrospooty on 11/9/2007 9:55:49 AM , Rating: 3
You can... It is an option; it's just that now you have to deal with the power, heat and space req's of 2 cards, and your single new-gen card will likely be at the same $400 price point and be faster than previous-gen SLI. Still a decent option if you have the system for it.

By SlyNine on 11/10/2007 1:14:26 PM , Rating: 2
The thing is, you might not want or need 2 of those cards when you first buy one. So the plan would be to wait until the card drops in price and then pick the second one up. But these days they cut production before the card has a chance to drop in price.

When I bought the 7800GT, SLI was still somewhat new (at least to me). I didn't expect a 7900GT to come out. The worst thing was the 7800GTs never dropped in price, so it was never worth buying a second one.

As far as buying 2 video cards of a previous generation -- for example, buying 2 7900GTXs when the 8800GTX came out -- that's a very bad investment to say the least. But if you already had a 7900GTX and the prices of them were very good (maybe $200 back when the 8800GTXs were first released), then you might just spend $200 for the upgrade to hold you off.

You may not have the greatest performance on the block but 2 7900GTX's are still very good.

Out with the old, in with the new
By zeroxcape on 11/8/2007 2:43:06 PM , Rating: 2
If I had any of the old G80 video cards like the current GTX or the Ultra, I would probably get rid of them quickly, as they will soon be replaced by the faster and cheaper 65nm chips. How soon will the GTX and Ultra be replaced? Most websites are reporting the earliest at December and the latest at February. The 8800GTS 640MB has already been outdone by the cheaper 8800GT; just imagine what the future cards will be like...

RE: Out with the old, in with the new
By Master Kenobi on 11/8/2007 3:45:13 PM , Rating: 2
I'm going to sit on my GTX until Nehalem when I overhaul my entire rig.

By SavagePotato on 11/8/2007 4:04:11 PM , Rating: 5
That can't be good for its heat envelope.

By yacoub on 11/8/2007 3:42:50 PM , Rating: 2
I thought the 8800 GTS SSC was simply an eVGA model name, not an official NVidia model. Interesting.

RE: oh
By KristopherKubicki on 11/8/2007 3:49:28 PM , Rating: 2
XFX and PNY have models too. More to come I'd expect.

BTW, I saw your post at the other place about RTPE. It's been kind of autonomous since I left in 2006 so I haven't actually been able to figure out what happened to it. I'm working with Anand to figure something out.

RE: oh
By yacoub on 11/8/2007 4:18:34 PM , Rating: 2
hey thanks Kris

RE: oh
By cobalt42 on 11/8/2007 4:24:16 PM , Rating: 2
Yes, other makers will have 112-SP versions, but I think the point is that "SSC" appears to be the specific moniker EVGA is using, not an acronym coming from NVIDIA.

What I heard was that board makers were not to use the 112 SPs as part of their marketing, but could use other overclocking-related marketing terms. EVGA has apparently introduced "SSC" as part of their overclocking series -- if you look at their "GT" page, the range goes nothing/SC/KO/SSC for 600/650/675/700 MHz core clocks.

BFG, for instance, is calling it the "OC 640MB Extreme Edition".

By lifeguard1999 on 11/8/2007 3:23:29 PM , Rating: 5
I can't wait for the NVidia 9800!!! ATI's 9800 was great, so let's see what NVidia can do with the name. :)

RE: 9800!!!
By murphyslabrat on 11/12/2007 11:21:08 AM , Rating: 2
And the 8500 wasn't cool? It was still an incredible price/performance bargain.

By andrewrocks on 11/8/2007 2:29:17 PM , Rating: 2
from what i gather

the replacement for the 8800GTX should be early in 08? quarter 1?

and to appease us for such a long delay... the new 8800GTS SSC will essentially have the performance of the GTX for a much lower price.

RE: so...
By Plazmid19 on 11/8/2007 2:41:34 PM , Rating: 2
Hopefully the retailers will stock sufficient quantities of the new cards.
What a run on the bank the 8800 GT has caused! The 8800 GT will be sold out until at least Jan.
The new cards should come as no surprise, there have been enough rumors already and this is typical for a new turn of silicon.
Early adopters, beware!

Changed my minor.
By pauldovi on 11/8/2007 3:53:33 PM , Rating: 2
From CS to Nvidia Nomenclature.

RE: Changed my minor.
By KoreyJ on 11/8/2007 7:01:23 PM , Rating: 2
How about this title for a thesis?

"The GeForce 8800GT: How a name does not equate to performance"

By AthlonBoy on 11/9/2007 7:42:01 AM , Rating: 2
D9M will add PCIe 2.0 support, DirectX 10.1, wider memory controllers (up to 128-bits)

Uh. What? 128-bit memory controllers?

RE: Huh?
By AthlonBoy on 11/9/2007 7:43:12 AM , Rating: 2
Never mind me. 9M is the n200 part. >_<

'nuff said.
By Master Kenobi on 11/8/2007 2:25:39 PM , Rating: 1
I thought Intel had the market cornered on alphabet soup, but nVidia seems to have one-upped them.

RE: 'nuff said.
By cheetah2k on 11/8/2007 8:15:39 PM , Rating: 2
It's always a pain at the start, but then (just like with AMD and Intel) we all forget and move on with the new naming schemes....

Nothing different this time.

By ninjit on 11/8/2007 2:37:21 PM , Rating: 3
Name changes galore; at this rate I don't think anyone is going to care what 8xxx or 9xxx they call the cards.

But the part# changes do sound like a little improvement, so as long as retail boxes state D8M or D8P, along with clock and memory information, that would be more useful for potential buyers than the card number.

Though I'm sure Joe Average will still look at EVGA's 9950GSPOT-VD edition and assume it's better than the rest (and they would be right to do so), but as the new 8800GT just showed us (and similar cases in ATI's illustrious past) this can be misleading.

By SavagePotato on 11/8/2007 3:20:41 PM , Rating: 2
Maybe they will give sli the codename DP for the D9E which will be very symbolic of the price it will cost no doubt.

Random Rumors
By mac2j on 11/13/2007 2:23:38 AM , Rating: 2
OK, I've been hearing some rumors -- can anyone confirm whether any of these are real?

I've heard the D9P is pretty far along - will debut as the 9800 ~ Feb 2008 with Dx10.1 support.

Also, that Nvidia got such a lead on R&D time with AMD unable to get an 8800 GTS equivalent part out for over a year that D9 will not be getting a die shrink revision like D8 did.

Instead D10 will debut at 45nm Q4 2008 and may, depending on MS, have DX11 support. Also I've been hearing crazy core clock frequencies between 2-3 GHZ. Lastly I've heard Nvidia has been trying to decide what to call D10 commercially cause no one wants to have to shop for something like a "10800 GTS OC" or something like that. Heard they're talking about a new lower incrementing number system (1080 for example... dropping a zero), bringing back the "FX####" naming in some way, or going with something totally new.

You can't scold your customers...
By gochichi on 11/13/2007 4:44:41 AM , Rating: 1
Part of the reason they can't blow themselves out of the water, is b/c customers get ticked off when they buy something pricey only to have it become eclipsed by a cheaper product.

The video card industry is huger than ever but as fragile as ever. Imagine if BMW sold you an M series vehicle today, on the premise that it was SO FAST. Then within a year they upgrade their base model to 500HP and with better fuel economy... yikes right? How many times can they do that and still sell M-series products ever?

This totally happened with the computer industry. They make an effort now so that a crappy computer is crappy right out of the gate, and a high-end model stays mostly out of reach of the $500.00 computers for the next 2 or 3 years at least. You can't outdo a 1950XT with a 2400XT... wrong price brackets. It MUST be that way if we can hope to continue to see innovation for decades to come.

In terms of real "lead", they are investing correctly in terms of technology. Just like a Honda Accord isn't going to come out with a 250HP engine that gets 12MPG... cause it just wouldn't sell, regardless of the price. Same goes for video cards: people want a slick package (at least I do), and lucky for us these slick packages are cheaper to produce (while being vastly more expensive to design). So they're investing in brain power, which in the long term is the only way to move forward. Remember, it's not like AMD/ATI isn't investing in research either. They're coming out with an even smaller process this week.

Could you squeeze another 25% out of this level of technology? Sure, but I think they're headed in the right direction... I want twice as fast as now and below $300.00, and I want it to be a single-slot solution that doesn't require an expensive power supply and is relatively quiet. Think I'm alone? I doubt it, and it's the way moving forward.

If you were a diamond seller, and thought, why sell diamonds for $1000.00 and have competition when I could sell them for $500.00 and corner the market... well, the thing is, that may work fine this year, but next year you're running the same business under one roof as there was last year, and with less than half the profit.

I'm glad NVIDIA and AMD are keeping each other at least a little bit honest. It's getting harder and harder to double performance... they have been out to lunch for a while now, $500-$800 "enthusiast" cards ... WAY out of line ($300- $500 TOPS for top level). They were asking us to have 600W power supplies and maybe special sloppy motherboards that could handle two double width cards... yikes. No no no. Single-slot $200.00... design design design. Don't spend a lot of $ making too many cards at this technology level, make enough of them to keep things viable and move on to the next technology level.

My heavy suspicion is that there isn't high-volume availability of the 8800GT b/c it's a transition product. Every card they sell right now is a card that ATI won't sell... yes, but it's also a card they won't sell next year too.

If you sold a $35k Lexus last year and you sell too many $15k Lexuses this year... you won't sell hardly any $35k ones next year.

I will admit that I would just as soon spend $300.00 now and $300.00 in 6 months for a twice as good card but I don't think that's normal... and it's something that I might say and not actually do. It would be cool to see runaway innovation... but I think it would just cause a crash later on.

9th gen yeah right
By ElFenix on 11/8/07, Rating: -1
RE: 9th gen yeah right
By yacoub on 11/8/2007 3:44:00 PM , Rating: 2
Where do the TNT2, GeForce 4000 series, and GeForce 5000 series fit on your chart? Just curious which numbers you lump them under.

RE: 9th gen yeah right
By therealnickdanger on 11/8/2007 4:46:16 PM , Rating: 2
Here's a review of the "new" GTS SSC. It appears to be beaten by the new 512MB 8800GT in most scenarios except in the highest resolutions and with DX10 games. NVIDIA, what are you thinking?

RE: 9th gen yeah right
By opterondo on 11/9/2007 1:40:48 AM , Rating: 2
"what are you thinking?"

sell old stagnant products to dummies at high prices before we unveil next-gen -- rinse & repeat

The GTS SSC is technically older silicon than the 65nm 112 SP GT card.

RE: 9th gen yeah right
By ElFenix on 11/8/07, Rating: 0
RE: 9th gen yeah right
By omnicronx on 11/8/2007 6:26:51 PM , Rating: 3
Not quite true. Although the TNT2 and GF4 were essentially clock increases, the GF2 was not just added speed.
The GF2 included: a second texture map unit, a high-definition video processor, and even an early version of pixel shaders, though they were not used.

GF3 also added dx8.1 compatibility..

And the GF4, depending on the version you got, was either a modified GF3 or a modified GeForce 256 (GeForce4 MX).

RE: 9th gen yeah right
By TheGreatGrapeApe on 11/8/2007 8:34:41 PM , Rating: 2
GF3 also added dx8.1 compatibility..

Actually the GF3 and GF4 are only DX8.0

RE: 9th gen yeah right
By opterondo on 11/9/2007 1:52:39 AM , Rating: 2

• API support
  ° Complete DirectX® support, including DirectX 8.1
  ° Full OpenGL® 1.3 support

• NVIDIA Unified Driver Architecture (UDA)
• Fully compliant professional OpenGL 1.3 API with NVIDIA extensions, on all Linux and Windows operating systems
• WHQL-certified for Windows XP, Windows Me, Windows 2000, Windows NT, and Windows 98
• Complete Linux XFree86 drivers
• Mac OS 9/X support


RE: 9th gen yeah right
By SlyNine on 11/9/2007 2:30:02 AM , Rating: 2
Man, what I wouldn't give to have one of those beasts. lol

RE: 9th gen yeah right
By ElFenix on 11/9/2007 11:27:46 AM , Rating: 2
i did mention added 'pipes' as being a mark of a half gen product, and the extra texture units would fall under that. the GF2 added little to the 3D featureset over the GF256. (high def video processor is outside of that, and, as you admitted, the early pixel shaders weren't used).

the GF3 was a full generational step (adding full dx8.1 compatibility, a big step to the 3D featureset). that's why i've listed it that way.

i don't count the MX as a version of the main chip, because, as you point out, they're not.

RE: 9th gen yeah right
By KristopherKubicki on 11/8/2007 3:51:03 PM , Rating: 3
I think if you start counting at GeForce 256 the generations line up easier.

RE: 9th gen yeah right
By ElFenix on 11/8/07, Rating: 0
RE: 9th gen yeah right
By cobalt42 on 11/9/2007 3:28:29 PM , Rating: 3
The reason you have to start counting at the GeForce line (other than the fact that by name the TNT wasn't a GeForce) is that the GeForce was the first one to include transform and lighting acceleration. Previous cards were just rasterization accelerators.

RE: 9th gen yeah right
By cleco on 11/8/2007 4:16:35 PM , Rating: 2
So like mentioned before, you missed out on a couple:

Geforce 2
Geforce 7

RE: 9th gen yeah right
By ElFenix on 11/8/2007 4:25:51 PM , Rating: 2
those are half gen steps.

the half gen steps have been characterized by making the features added with the full gen steps playable and useful.

which is why i think this upcoming 9 series will actually be a half gen. it'll make DX10 stuff (more) playable.

RE: 9th gen yeah right
By KristopherKubicki on 11/8/2007 4:28:29 PM , Rating: 2
True, although they will be adding DX10.1 support. I don't really know what's involved with DX10.1, but I suspect it's not too architecturally difficult as ATI is doing it on RV670, which is essentially just a die shrink on R600.

RE: 9th gen yeah right
By ElFenix on 11/8/2007 4:32:19 PM , Rating: 2
maybe it's a 3/4 gen product, then. like 2.75G cellular service.

i could be completely wrong. i'm just going by nvidia's history here.

RE: 9th gen yeah right
By techyguy on 11/9/2007 5:29:16 AM , Rating: 2
We will probably have Shader FX 5 and some shadow acceleration. That is a lot more than what separates the 2007 generation from 2006 model cars.

What else can NVIDIA do? They are waiting for DirectX 11. Microsoft is waiting for games to catch up to DirectX 10. And developers wanted a Vista OS that was actually easier to program for.

Maybe physics processing that actually works would be helpful. But isn't that the road quad cores are taking?

RE: 9th gen yeah right
By Ard on 11/8/2007 5:33:19 PM , Rating: 5
I think they mean 9th generation as in the GeForce line.

1st - GeForce 256
2nd - GeForce 2
3rd - GeForce 3 (new)
4th - GeForce 4 (refresh)
5th - GeForce FX (new)
6th - GeForce 6x00 (new)
7th - GeForce 7x00 (new)
8th - GeForce 8x00 (new)
9th - D9M/P/E

The 8.5th gen would be G92/G98, which are the refresh parts. NVIDIA's latest codenames (G70, G80) also seemed to correspond... with the exception of G92/G98, obviously. Even if we use your list of generations, D9 would still be 9th generation... though you're missing the GF7 series.

RE: 9th gen yeah right
By retrospooty on 11/8/2007 9:02:11 PM , Rating: 2
I think you got it right. =)

RE: 9th gen yeah right
By ElFenix on 11/9/2007 11:15:30 AM , Rating: 2
the GF 7 series is still a warmed over GF 6 series.

RE: 9th gen yeah right
By Ard on 11/14/2007 12:19:14 AM , Rating: 2
Well, I don't know if I'd go that far. G70 has some serious improvements over NV40. The only real reason why G70 is sometimes thought of as a refresh part is because NV40 didn't have an actual refresh (all NV45 did was integrate the PCIe bridge into the die).

RE: 9th gen yeah right
By ElFenix on 11/9/2007 11:28:45 AM , Rating: 2
and yes, if they mean 9th gen of 'geforce' branding, then yes, that's how it is.
