
NVIDIA's D9M makes its first appearance on corporate roadmaps

NVIDIA's newest mid-range graphics processor, codenamed D9M, will make its official debut as the GeForce 9600 GT.

Corporate guidance from NVIDIA states that initial GeForce 9600 GT shipments will come stock with a 650 MHz core clock and a 1625 MHz unified shader clock.  Unlike the G84 core found on the GeForce 8600 GT, D9M will feature a 256-bit memory bus interface.  Coupled with a 900 MHz memory clock, NVIDIA calculates the memory bandwidth at 57.6 GB/s. 

The texture fill rate is estimated at 20.8 billion texels per second.  The company would not indicate how many shaders or stream processors reside on the D9M core. 
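
NVIDIA's figures are easy to sanity check. Below is a minimal sketch of both calculations in Python; note that the texture-unit count is an inference from the quoted fill rate and core clock, not something NVIDIA has confirmed:

    # Sanity check of NVIDIA's quoted D9M numbers. Assumes GDDR3
    # (two transfers per memory clock); the texture-unit count is
    # inferred, not confirmed by NVIDIA.

    BUS_WIDTH_BITS = 256      # D9M memory interface width
    MEM_CLOCK_HZ = 900e6      # quoted memory clock
    DDR_MULTIPLIER = 2        # GDDR3 transfers data twice per clock

    bandwidth = MEM_CLOCK_HZ * DDR_MULTIPLIER * BUS_WIDTH_BITS / 8
    print(f"Memory bandwidth: {bandwidth / 1e9:.1f} GB/s")  # 57.6 GB/s

    CORE_CLOCK_HZ = 650e6     # quoted core clock
    TEXEL_FILL_RATE = 20.8e9  # quoted texture fill rate (texels/s)

    # Working backward: fill rate / core clock = texels per clock,
    # which implies 32 texture units on the core.
    print(f"Implied texture units: {TEXEL_FILL_RATE / CORE_CLOCK_HZ:.0f}")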

Late last year, NVIDIA confirmed the D9 family will use TSMC's 65nm process node.  The company introduced its first 65nm processor shrink, the G92, in November 2007.

Other details of the D9M family have already surfaced.  ChileHardware published slides yesterday claiming the GeForce 9600 requires a 400W power supply capable of delivering 26A on the 12V rail.  Unlike previous mid-range GeForce cards, the D9M will require a 6-pin supplementary power connector.
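
For reference, those slide numbers imply just over 300W on the 12V rail alone, the bulk of the 400W budget; a trivial check, assuming the 26A figure refers to combined 12V capacity:

    # 12V rail wattage implied by the ChileHardware slide
    # (assuming 26 A is the combined 12 V capacity).
    RAIL_VOLTS = 12
    RAIL_AMPS = 26
    print(f"12V rail: {RAIL_VOLTS * RAIL_AMPS} W of the 400 W total")  # 312 W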

NVIDIA publicly confirmed other details of D9M: DirectX 10.1 support, Shader Model 4.0, OpenGL 2.1 and PCIe 2.0 support just to name a few. 

Further documentation from NVIDIA claims the 9600 GT will also support the Quantum Effects physics processing engine. 

Like all NVIDIA processors, the GeForce 9600 is also HDCP compatible, though final support still depends on vendor implementation. 

NVIDIA declined to comment on the expected price of the GeForce 9600.   A representative for NVIDIA would only say that the performance increase from the GeForce 8600 to the GeForce 9600 is "almost double."


Comments



finally
By nosfe on 1/3/2008 12:43:09 PM , Rating: -1
Finally a mainstream card with a 256-bit memory bus




RE: finally
By ChronoReverse on 1/3/2008 12:49:43 PM , Rating: 5
Did you completely miss the 3850, a mid-range card with a 256-bit memory interface?


RE: finally
By mholler on 1/3/2008 12:54:55 PM , Rating: 5
Took the words right out of my keyboard.


RE: finally
By Noya on 1/3/08, Rating: -1
RE: finally
By Duwelon on 1/3/2008 1:20:41 PM , Rating: 2
You're an AMDist.


RE: finally
By gtrinku on 1/3/2008 4:55:08 PM , Rating: 5
Wouldn't that be an antiAMDite?


RE: finally
By Visual on 1/4/2008 4:52:03 AM , Rating: 2
AMDfob


RE: finally
By ChronoReverse on 1/3/2008 2:01:04 PM , Rating: 4
Well, I purely look at the merits of each card, and the 3850 just so happens to be a worthy card even if most of its brethren aren't. Drivers in single-card mode work fine too.

Frankly, AMD's drivers have been almost on par with XP since the launch of Vista. NVIDIA really screwed Vista over by giving the impression that Vista is significantly slower in games than XP. At least nowadays both sides have drivers on par with XP, but the damage has already been done.


RE: finally
By beepandbop on 1/4/2008 11:59:25 AM , Rating: 2
That's really ignorant. AMD is lagging in the processor area, but for once AMD/ATi put out midrange/high-performing cards that compete at the midrange level--which is where most of the market is. So you could have labeled yourself a complete moron and saved us the time of reading that worthless comment.


RE: finally
By Kamgusta on 1/3/2008 4:07:16 PM , Rating: 2
I find it quite funny to call the 3850 "mid-range."

It shares the same architecture as the 3870 (which I think should be called "very-high-end"); it only has lower clocks.

Also, what should we call the 29x0s? "Mid-low-range"? The 26x0s? "Low-range"? And the 24x0s? "Very-low-range"?

The 3850 and 3870 are high-end parts. If DAAMIT engages in a price war with NVIDIA and wants to sell them at discount prices, that doesn't change the situation.

Otherwise, I should also call an ATI X1950XT "very-low-end" 'cause I can pick one up for 119 bucks @ newegg...

p.s. http://ati.amd.com/products/hdseries.html
Performance - ATI Radeon™ HD 2900
Mainstream - ATI Radeon™ HD 2600
Value - ATI Radeon™ HD 2400


RE: finally
By Lightning III on 1/3/2008 4:24:26 PM , Rating: 1
If it starts with a 2, it's called previous generation, stu boy.


RE: finally
By Kamgusta on 1/3/2008 5:06:54 PM , Rating: 3
Didn't you notice the HD3400s/HD3600s are still to be launched? And the "previous generation" HD2900 PRO was launched just in October 2007 (2 months ago)?


RE: finally
By Shining Arcanine on 1/5/2008 11:36:06 AM , Rating: 2
That is 3 months ago.


RE: finally
By ImSpartacus on 1/3/2008 4:42:51 PM , Rating: 5
The 3870 and 3850 are not high end at all. Look at the price: they are competing at the 8800GT's low end. They basically replaced most of the 2x00 series.

Comparing the 2900XT to the 3870 is like comparing the 8800GT to the 8800GTX: you get a hair more performance for a huge price jump.

And the 3850 pretty much covers the rest of AMD's mid range.

Don't get me wrong, I'm no AMD fanboy; I would get an 8800GT if they were cheaper (and in stock), but deals like a 512MB 3850 selling cheaper than the 256MB 8800GT ($200 < $215) can't be passed up.


RE: finally
By Belard on 1/4/2008 12:07:09 AM , Rating: 2
Well... they are HIGH end, just as the $300 8800GT ended up being a HIGH-END part since it's comparable to the 8800GTX... and there isn't anything much faster, other than the $700 Ultra, which isn't worth it.

Looking at another major site with a graph, the HD3870 slides in right under the 8800GT. So the TOP 5 cards are:
1 - 8800Ultra ($700~800 - avg $700)
2 - 8800GTX ($470~600 - avg $500)
3 - 8800GTS-512 ($330~$380) (not the same as the original GTS 640/320)
4 - 8800GT ($265~320 - avg $300)
5 - HD 3870 ($240~280 - avg $250)

The performance delta from the 3870 up to the 8800 Ultra is not that much.

The 9600 doesn't sound so hot... they put it at DOUBLE the 8600? Hmm... the 8800GT/3870 are already more than twice as fast as the 8600GTS! Now if the 9600 sells for under $125 then it may be worthwhile. Note: the $180 3850 is almost twice as fast as the 8600GTS.

Oh well... who knows...


RE: finally
By suryad on 1/4/08, Rating: 0
RE: finally
By ImSpartacus on 1/6/2008 9:16:10 AM , Rating: 2
Any purchase can be rationalized as "worth it" if you have the funding and the "need", but for the majority of us who have neither, it is a bad decision.


RE: finally
By ImSpartacus on 1/6/2008 9:43:33 AM , Rating: 2
That may be true for a single-card setup; however, AMD's current strategy is to eliminate the ultra high end (GTX/Ultra, etc.).

AMD's high end is CrossFire 3870s. Why do you think the 38*0s are so damn good at scaling across multiple graphics cards AND have extremely low power requirements?

It just makes sense that AMD will begin to endorse lots of multiple-card setups (quad CrossFire?).

I personally think it's a novel idea, but right now I will stick with a single-card setup (that's why I was eyeing an 8800GT).

So here is a more relevant list that accounts for the manufacturers' intentions:

1. SLI 8800 Ultra
2. SLI 8800 GTX
3. SLI 8800 GTS (512mb) / 8800 Ultra
4. SLI 8800 GT / CF 3870
5. 8800 GTX
6. 8800 GTS (512 mb) / CF 3850
7. 8800 GT
8. 3870
9. 3850

That doesn't count quad CF or 3-way SLI. It is mostly made up of my conjectures, your previous list, and my previous reading on the net (COUGH*ANANDTECH*). I'm likely wrong somewhere, but it gives you an idea.

Good reading:
http://www.anandtech.com/showdoc.aspx?i=3151&p=1 (38*0)
http://www.anandtech.com/showdoc.aspx?i=3140&p=1 (8800 GT)


RE: finally
By Alpha4 on 1/3/2008 9:07:02 PM , Rating: 1
*Agreed*


RE: finally
By otispunkmeyer on 1/4/2008 6:07:10 AM , Rating: 2
Yeah, but those... I dunno, they don't seem too mainstream to me; they're still fairly high end.

Plus the 3850 is gonna have an 8800GS to contend with now: 192-bit memory, 96 shaders...

I think what he meant was, in the class of cards where the 8600 and 2600 sit, there is no 256-bit card. And when the top products are using 384-bit and 512-bit, you would have thought there really should have been a bit more progress in that segment. And finally there is.


RE: finally
By AggressorPrime on 1/3/2008 1:18:34 PM , Rating: 2
You are so right. Memory bandwidth is extremely important now that we have games like Crysis that force the resolution down. More memory bandwidth allows for greater resolutions. I just hope they push 512MB versions as well, since you need 512MB for the XHD resolutions.


RE: finally
By HaZaRd2K6 on 1/3/2008 1:26:36 PM , Rating: 5
Why you think you'll be able to play Crysis at XHD resolutions with settings cranked on a lower-mid-range card is beyond me.

It's just not going to happen that way. If you buy a mid-range card, the manufacturers assume you have a mid-range system with a mid-range monitor and don't plan on running Crysis on a 24" screen.


RE: finally
By finelemon on 1/3/08, Rating: 0
RE: finally
By retrospooty on 1/3/2008 3:51:53 PM , Rating: 2
That doesn't change the fact that you can't run Crysis at high settings on this card, or on the next-gen mid range either for that matter, even in a high-end system.


RE: finally
By finelemon on 1/3/2008 8:33:42 PM , Rating: 3
Why would you expect to be able to? No, there is no one in the pipeline who is in charge of making sure games will run well on current hardware. The game makers will always push for that little bit more than current hardware can do and hardware will always move along at its own pace regardless of what games would 'like'.

If you want to have someone to care about making sure that a game plays well on your hardware NOW then get a console.


RE: finally
By 1078feba on 1/4/2008 9:17:35 AM , Rating: 3
Provocative argument.

Maybe I'm displaying a bit too much ignorance here, but I really want to know.

If the highest-end rigs around, with QX9650s OC'd to 3.6, dual Ultras on H2O, and an NF780i at 1600 FSB, can't run Crysis on a 26-30 inch monitor with all high settings at more than 20-30 FPS, how the hell did Crytek actually develop the game? What did they run it on? A Cray? I mean, how did they actually know that it would look spectacular if they were only able to run it at 800x600?


RE: finally
By suryad on 1/4/08, Rating: 0
RE: finally
By finelemon on 1/5/2008 1:38:12 AM , Rating: 2
They use PCs. They are no different from the gamers themselves. They would alternate between a lower resolution when they want a high frame rate, or put up with a low frame rate to see it in high-res. E.g., no one in the world has seen Crysis running at 2048x1024 at 100FPS.


RE: finally
By retrospooty on 1/5/2008 1:42:43 PM , Rating: 3
"If the highest end rigs around, with QX9650's OC'd to 3.6 and dual ultras on H2O and an NF780i at 1600 FSB can't run Crysis on a 26-30 inch monitor with all high settings at more than 20-30 FPS, how the hell did Crytek actually develop the game?"

The answer is that it was not tested on a platform that can run it at playable speeds and highest settings, since such a platform does not yet exist. It's a good thing to add higher settings for future systems; I wish more games did that.

With all that said, if you ask me, Crysis is not well optimized at all. Q4 and UT3 run very well at the highest settings with 4xAA enabled. Crysis certainly looks better than those games, but not THAT much better. It looks maybe 50% better and runs 300% slower. The juice isn't worth the squeeze.


RE: finally
By Lightning III on 1/3/08, Rating: -1
RE: finally
By Rockjock51 on 1/3/08, Rating: 0
RE: finally
By roadrun777 on 1/3/2008 6:21:20 PM , Rating: 2
I am just curious about the green weenies...
Where do you buy those?!?


RE: finally
By onwisconsin on 1/3/2008 8:43:11 PM , Rating: 3
Whatever it is, you can find it on eBay ;)


RE: finally
By Haltech on 1/3/2008 7:23:26 PM , Rating: 3
Is it just me, or can anyone actually understand his last paragraph?


RE: finally
By PLaYaHaTeD on 1/3/2008 7:33:53 PM , Rating: 2
Lol, I literally couldn't control my laughter after I read your post, then read the last paragraph. So funny.


RE: finally
By ImSpartacus on 1/6/2008 9:46:32 AM , Rating: 2
I kinda feel bad for that guy. That's some terrible grammar.


RE: finally
By roadrun777 on 1/3/2008 7:34:54 PM , Rating: 3
Um, he is saying that he is angry about the heat issues and the lack of performance, like the rest of us. Then he says that he tried to go green (green weenies?), got angry with the performance, and gave the video card (or the machine, in this case) back to whomever he got it from.

I personally have two 8800GT cards in SLI, and I duct-taped a broom stick to my computer so I can use it as a hair dryer in the morning; that way I am saving energy by not running two hair dryers at the same time.


RE: finally
By ShadowZERO on 1/4/2008 2:05:51 PM , Rating: 2
What happen? Someone set up us the bomb. All your base are belong to us!!! You have no chance to survive, make your time. ha ha ha!


"This is from the DailyTech.com. It's a science website." -- Rush Limbaugh
