


Get ready for an avalanche of new NVIDIA products

NVIDIA's GeForce 8800 GT might be one of the best performance-per-dollar cards since the Radeon 9800, but things are moving fast at NVIDIA, and there's a lot more on the way from the company between now and next summer.

Over the last quarter, the company moved away from its old "Gxx" core codenames, opting instead for a more descriptive system.  NVIDIA's new codenames follow:
  • D8M: Eighth generation mainstream, previously named G98
  • D8P: Eighth generation performance, previously named G92
  • D9M: Ninth generation mainstream
  • D9P: Ninth generation performance
  • D9E: Ninth generation enthusiast
GeForce 8800 GT, codenamed G92 under the old scheme and D8P under the new one, stole the majority of the headlines last week.  The 112 stream processor sub-titan became NVIDIA's first 65nm GPU design. However, NVIDIA's real dark horse was the revised GeForce 8800 GTS SSC.

GeForce 8800 GTS SSC, as it's awkwardly called, is essentially identical to the GeForce 8800 GTS based on the 90nm G80 core.  However, where a typical 8800 GTS enables only 96 of the 128 stream processors on the G80 core, the 8800 GTS SSC enables 112 stream processors -- the same number featured on the GeForce 8800 GT.

And yet in December, GeForce 8800 GTS is expected to undergo another revision as the company moves from the 90nm G80 core to the 65nm D8P.  Vendors will introduce 112 stream processor and 128 stream processor revisions on D8P, further convoluting the corporate guidance put forth just a week ago.

NVIDIA will continue to cannibalize the GeForce 8000 series as it moves to 65nm silicon across the board.  GeForce 8400 will likely be the first to go before the end of the year, as the G86 design is replaced by the 65nm D8M silicon, which was previously called G98.

As 2007 comes to a close, the company will ramp production of ninth-generation components to replace the eighth-generation 65nm parts, D8x.  Sound familiar? It should: NVIDIA is almost exactly replicating Intel's tick-tock strategy of alternating cycles of new design and die shrink.

Early NVIDIA roadmaps claim D9M, the first ninth-generation NVIDIA component, will replace the GeForce 8500-series lineup.  There's no retail designation for these D9x parts yet, but it would be a safe bet to say these will be the GeForce 9xxx-series cards.

D9M will add PCIe 2.0 support, DirectX 10.1 and wider memory controllers (up to 128 bits), and will be built on 65nm silicon.  D9P, the likely 8600-series replacement, adds the same features as D9M, but its memory controller width will top out at 256 bits.

D9E, the enthusiast component slated to replace the GeForce 8800-series, will incorporate all of the features of D9P and add a 512-bit memory bus. NVIDIA is holding its cards close on D9E, and has not provided any other guidance or release date.
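The roadmap details above can be condensed into a small data structure. This is just an illustrative sketch: the series replacements and bus widths come from the roadmap claims in the article, while the dictionary layout and the assumption that D9P and D9E inherit D9M's common features (PCIe 2.0, DirectX 10.1, 65nm) are our own reading.

```python
# Summary of NVIDIA's D9x roadmap as described above (hypothetical layout).
D9X_ROADMAP = {
    "D9M": {"replaces": "GeForce 8500 series", "max_memory_bus_bits": 128},
    "D9P": {"replaces": "GeForce 8600 series", "max_memory_bus_bits": 256},
    "D9E": {"replaces": "GeForce 8800 series", "max_memory_bus_bits": 512},
}
# Features the roadmap attributes to D9M and carries up the stack.
COMMON_FEATURES = ("PCIe 2.0", "DirectX 10.1", "65nm process")

for name, info in D9X_ROADMAP.items():
    print(f"{name}: replaces {info['replaces']}, "
          f"up to {info['max_memory_bus_bits']}-bit memory bus")
```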





Will they ever learn?
By AvidDailyTechie on 11/8/2007 3:00:17 PM , Rating: 4
What's the deal with nVIDIA, and Intel for that matter, not pushing tech forward just because they're in the lead? I've seen both companies push as hard as they can when they're behind (7950 GX2, P4)... It's like they're complacent with being just a bit 'better'? Why not completely blow the competition out of the water?

I know nVIDIA could do a high-end refresh much sooner than it's scheduled (if at all), probably whenever they wanted...

Same goes for Intel. I know they're not pushing the clocks of their new chips; instead they're releasing very slight performance increases while taking more profit at the expense of the consumer.

I guess that's business.




RE: Will they ever learn?
By AvidDailyTechie on 11/8/2007 3:01:26 PM , Rating: 2
I like the old names better.


RE: Will they ever learn?
By JarvisTheGray on 11/8/07, Rating: -1
RE: Will they ever learn?
By SavagePotato on 11/8/2007 3:50:22 PM , Rating: 3
Apparently in your world Intel released a totally different Netburst architecture that was awesome, and not in fact a total piece that hit a brick wall in the massive MHz push it was designed for.

Intel was indeed stagnant, and it took the Athlon 64 to shake them up and get back on track with the Core architecture.


RE: Will they ever learn?
By murphyslabrat on 11/12/2007 10:57:22 AM , Rating: 4
quote:
Not sure if I correctly understood where you were going with this, but Intel was far from behind when they launched P4. And Core 2 (aka P5)

http://www.xbitlabs.com/articles/cpu/display/thund...
A clock-speed-equivalent comparison between the Pentium III and the Athlon Thunderbird. Thunderbird comes out on the upper side of dead even.
http://findarticles.com/p/articles/mi_zdpcm/is_200...
quote:
We didn't have a difficult time seeing why the AMD Athlon processors–particularly the new Thunderbird chips–are giving Intel fits and starts. The Xi 1100K MTower SP is fast, loaded, and a lot cheaper than comparably equipped Pentium III PCs. And that's a winner in our book...As configured for our testing (details below), the unit costs just $2,895 (direct)–and that includes a 19-inch monitor, a 64MB graphics board, and a totally digital audio subsystem. The two vendors were expected to set the price for a comparably equipped 1.13-GHz PIII at roughly $4,000.

While, apparently, AMD and Intel were dead even in terms of performance, AMD did have a definitive cost advantage, i.e. AMD had a one-up on Intel.

As to the A64 vs. P4? Are you kidding me?!
http://www.digit-life.com/articles2/roundupmobo/p4...
That's a comparison between a 3.2GHz P4 and an Athlon XP, let alone an A64.
http://www.hardcoreware.net/reviews/review-328-1.h...
http://www.hexus.net/content/item.php?item=5692&pa...
As for A64, I hope that this clears up any confusion.

quote:
There is a reason why Intel stuck with netburst for so long, because it performed

No, it was because it had very high clock speeds.
http://www.thg.ru/cpu/20010919/index.html
This article is a very detailed analysis, and you can probably skip to the test setup and results straightaway. However, it showcases one of the greatest crimes Intel has perpetrated against the consumer: they laid aside a product that was able to keep a shaky pace with an Athlon 1.4GHz and a Pentium 4 1.8GHz. I happen to have built two PCs with PIII-S's (the 1.266GHz variant that was discussed here; I bought a pair off eBay for $30 with S&H), and love them. The only trouble I ever had was with motherboard compatibility. Intel screwed the customer over in one of the most heinous manners imaginable: they made the superior product (cheaper to produce, cheaper to run, more powerful clock-for-clock) artificially expensive to purchase, in addition to -- again, artificially -- requiring a new motherboard. Can it get any worse than that?!

quote:
and by the end of 2001 my friend's mom had bought a 3.0Ghz P4 and my machine was obsolete.

Damn, I wish I was your friend's mom, as the 3.0GHz Northwood didn't come out till November 2002!


RE: Will they ever learn?
By MetaDFF on 11/8/2007 3:47:01 PM , Rating: 5
Blowing the competition out of the water is good for the press but not necessarily good for the bottom line. Why introduce a new product when you can use the old one to recoup research and development costs?

Chips are increasingly more difficult and expensive to design, so it makes sense for them to profit from it over the longest period of time possible.


RE: Will they ever learn?
By SavagePotato on 11/8/2007 4:52:00 PM , Rating: 2
Intel is still pushing plenty hard at this point, in my opinion. Nehalem will be out as early as next summer. They aren't really ramping the clock speeds up that high, but nonetheless the performance steadily increases.

Penryn is just starting to become available and marks a pretty solid performance gain over the previous offerings. Nehalem will probably be an even more revolutionary step. When you consider how quickly their tick-tock strategy moves along, it seems plenty fast, at least to me.

As far as video cards go, I am honestly a little relieved to see them slow down a bit. The 8800 GTX has been a top enthusiast part for an entire year; that's quite unprecedented. It's gotten to the point where at least these $600 and $700 parts are seeing a solid two-year-plus lifespan. I've been using a 7900 GTX for close to two years now, and I'm finally starting to feel it lacking in games like UT3, but compared to past cards it has held out very well for two years. The 8800 GTX will probably be even better, considering it is still, to date, the highest performing single-card solution.


RE: Will they ever learn?
By retrospooty on 11/8/2007 8:56:50 PM , Rating: 2
What he means is that Penryn was going to debut at 3.6GHz per Intel six months ago, and based on many sites' early OC results it easily could (anyone that has tried is hitting 3.6 without increasing voltage or running hot at all), but since AMD has nothing to offer we are getting 3.16 and 3.2GHz parts at the high-end price points. This is exactly what they did in the past before the Athlon came along.


RE: Will they ever learn?
By SlyNine on 11/9/2007 2:21:38 AM , Rating: 2
I think the 9700 Pro had one of the best life spans ever.


RE: Will they ever learn?
By StevoLincolnite on 11/9/2007 4:49:48 AM , Rating: 2
The Voodoo did: with some modifications, Doom 3 and Quake 4 could be run on a Voodoo 2 12MB card, which was released in 1998. (Quake 4 was released in 2005 - so 7 years?)

Otherwise the GeForce 256, which was supported in games like Half-Life 2 and Far Cry without any modifications to the game.
Or perhaps the GeForce 3 Ti 200, which can run Oblivion with the help of OldOblivion?


RE: Will they ever learn?
By 3kliksphilip on 11/9/2007 8:47:37 PM , Rating: 3
I agree completely. It was the first DX9 card out, it whipped the GeForce 4s, and it ran any game out (including the big names, such as Half-Life 2, Doom 3 and Far Cry). Upgrading to DX10 requires Vista, and as most people are happy on XP, the DX9 cards are still effective. Plus the Xbox 360, Wii and PS3 aren't DX10 compatible.

Okay, some of the things I have said are generalised, but you get my drift that the ATI 9700 was a great product which was launched at the right time. I wish I hadn't got an ATI Radeon 9600...


RE: Will they ever learn?
By themadmilkman on 11/8/2007 3:53:47 PM , Rating: 4
Because it increases their profits, pure and simple. It costs a lot of money to develop these new chips, and the profit margin on a newly developed chip would be reduced by the introduction of a better one, so holding back allows them to maximize profits.

Really simplified version:

Let's say Intel only has two chips on the market at any one time. A high-end new chip, and a low-end previous chip.

It costs $1 to produce each chip.

It costs $90 to develop a new chip.

Intel sells its new chips for $10, and its old chips for $5.

Over a set period of time, Intel sells 10 new chips and 20 old chips. As a result, Intel has made $80 (10 × $10 = $100, deduct $10 production and $90 development for no profit on new chips; 20 × $5 = $100 from old chips, minus $20 production, for $80 final profit).

Now Intel has a choice: release a new chip, with the same development costs, or ride out the current chips. Assuming that there is no competition for the top spot, as is the case now, and sales remain constant, Intel would now make $170. Not releasing a new product increases their profits by $90 for the time period.

Now, this is EXCEPTIONALLY simplified, but it gets the basic point across.
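The arithmetic above can be sketched in a few lines of code. All figures are the hypothetical ones from the example (the function name and structure are just an illustration):

```python
# Simplified profit model from the example above (hypothetical figures).
PRODUCTION_COST = 1   # cost to produce each chip
DEV_COST = 90         # one-time cost to develop a new chip
NEW_PRICE, OLD_PRICE = 10, 5
NEW_SOLD, OLD_SOLD = 10, 20

def profit(release_new_chip: bool) -> int:
    new = NEW_SOLD * (NEW_PRICE - PRODUCTION_COST)   # margin on new chips
    old = OLD_SOLD * (OLD_PRICE - PRODUCTION_COST)   # margin on old chips
    # Developing a replacement chip costs $90 up front.
    return new + old - (DEV_COST if release_new_chip else 0)

print(profit(release_new_chip=True))   # 80: develop and release a new chip
print(profit(release_new_chip=False))  # 170: ride out the current lineup
```

Running both scenarios reproduces the example's numbers: $80 when a new chip is developed, $170 when Intel coasts on the existing lineup.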


By KristopherKubicki (blog) on 11/8/2007 3:56:57 PM , Rating: 5
Someone tied up in the executive level of a motherboard manufacturer told me this once:

Selling computers is like selling produce -- every day it sits there, it's worth less.


RE: Will they ever learn?
By Lifted on 11/8/2007 5:36:48 PM , Rating: 1
That's a bit too simplified. In your example, Intel makes more money by not investing as much in R&D when they have no need to, but in reality Intel's R&D spending hardly (if ever) goes down, always up. As the market grows and as profits grow, so will R&D. Even if they had a 100% monopoly on the market, they would have to spend on R&D, or else their profits would start to flatten out as people wouldn't upgrade as frequently as they do now.


RE: Will they ever learn?
By themadmilkman on 11/9/2007 4:45:32 AM , Rating: 2
I did note that it was exceptionally simplified.


RE: Will they ever learn?
By themadmilkman on 11/9/2007 12:17:33 PM , Rating: 2
Perhaps for a more accurate (though still exceptionally simplified) example, you could replace the one-time R&D costs with retooling costs. It does cost Intel a lot of money to set up plants capable of producing these chips. R&D would then be included in the production costs of each chip.


RE: Will they ever learn?
By Nik00117 on 11/8/2007 4:27:33 PM , Rating: 2
The deal is this: they are ahead in the game and are arrogant about that fact. They are the big boys on the block.

However here you got ATI and AMD working their buns off to beat them.

ATI and AMD will overtake NVIDIA and Intel, NVIDIA and Intel will then overtake ATI and AMD, and the cycle will simply repeat.


RE: Will they ever learn?
By DeepBlue1975 on 11/8/2007 6:28:23 PM , Rating: 3
I disagree about Intel.
Intel is now showing what it can actually do, in contrast to what they were doing when AMD could leave them in the shade with the crappy Netburst architecture, whose sole purpose was to crank up MHz like crazy regardless of efficiency and final performance.

Core 2 was a great architecture at launch. Penryn will improve Core 2 in an interesting way, and Intel is already talking about 2008's Nehalem. You know, they can't release a new from-the-ground-up architecture every 2 months :D

Nvidia is another story: from releasing something new every 6 months, they went to sleep for a whole year. The 8800 GT is a very nice refresh, though, and I'll switch my aging X800 XL for one of those as soon as I can get one where I live :D

ATI/AMD are mostly excused because of the takeover and AMD's financial position after it... though I really hate that they pushed back Barcelona's launch three times in a row, resulting in an almost full-year delay.

I definitely want AMD / ATI back pushing and competing in the high end... In the meantime, my CPU is from Intel and in a few months my GPU will be from Nvidia.


RE: Will they ever learn?
By kilkennycat on 11/8/2007 7:28:57 PM , Rating: 2
D9E is the true next-gen high-end GPU. Not at all a "simple refresh" of existing technology; a completely new design. It will have the functionality of both a GPU and a GPGPU, with the double-precision data paths needed for GPGPU work and all the HD decoding attributes of the new 65nm 8800 GT (G92). Expect other goodies like full DX10.1 functionality and maybe PCIe 3.0 functionality as well. Mid-2008 would be my guess.


RE: Will they ever learn?
By rudy on 11/8/2007 10:53:16 PM , Rating: 2
Maybe it is not so simple. If you have the better product, how do you make it better? How do you know what level you have to hit to "blow the competition away"? Sometimes when there is no better competition, it is difficult to innovate and really push ahead, because you don't know where to go.

Look at online gaming: you see these guys who are not experienced with clans sit in servers for years and always beat their buddies, but never really hit a new level. Then, when they are introduced to really hardcore gaming, they step up their game. It's not like they didn't want to crush their competition; they just did not have any good competition to measure against and learn how to improve, how to react, how to take the next step.

Competition really forces you to perform.


RE: Will they ever learn?
By opterondo on 11/9/2007 1:33:20 AM , Rating: 3
Are you joking?

Intel is pushing harder than AMD, its direct competitor.

nVidia is pushing harder than AMD, its direct competitor.

I think it is obvious who is "not pushing tech forward".


RE: Will they ever learn?
By mikefarinha on 11/9/2007 1:59:15 PM , Rating: 4
Just because Intel or nVidia is pushing harder than AMD doesn't mean they are pushing their hardest.

AMD is a relatively small company that competes with both Intel and nVidia. They can only push so hard.


"People Don't Respect Confidentiality in This Industry" -- Sony Computer Entertainment of America President and CEO Jack Tretton














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki