


NVIDIA's D9M makes its first appearance on corporate roadmaps

NVIDIA's newest mid-range processor, codenamed D9M, will make its official debut as the GeForce 9600 GT.

Corporate guidance from NVIDIA indicates that initial GeForce 9600 GT shipments will come stock with a 650 MHz core clock and a 1625 MHz unified shader clock.  Unlike the G84 core found on the GeForce 8600 GT, D9M will feature a 256-bit memory bus interface.  Coupled with a 900 MHz memory clock, NVIDIA calculates the memory bandwidth at 57.6 GB/s. 

The texture fill rate is estimated at 20.8 billion texels per second.  The company would not indicate how many shaders or stream processors reside on the D9M core. 
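NVIDIA's quoted figures can be sanity-checked with standard back-of-the-envelope formulas. A minimal sketch follows; note that the 32 texture units passed below are an assumption inferred from the quoted fill rate, since NVIDIA did not disclose the unit counts:

```python
def memory_bandwidth_gbs(mem_clock_mhz, bus_width_bits, ddr_multiplier=2):
    """Peak memory bandwidth in GB/s: effective clock x bus width in bytes."""
    return mem_clock_mhz * ddr_multiplier * (bus_width_bits / 8) / 1000

def texture_fill_rate_gtexels(core_clock_mhz, texture_units):
    """Peak texture fill rate in billions of texels per second."""
    return core_clock_mhz * texture_units / 1000

# 900 MHz GDDR (1800 MHz effective) on a 256-bit bus:
print(memory_bandwidth_gbs(900, 256))       # 57.6, matching NVIDIA's figure

# 650 MHz core; 32 texture units is a hypothetical count that
# would reproduce the quoted 20.8 billion texels/s:
print(texture_fill_rate_gtexels(650, 32))   # 20.8
```

Both results line up with the article's 57.6 GB/s and 20.8 billion figures, which is why a 32-texture-unit configuration was widely speculated at the time.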

Late last year, NVIDIA confirmed the D9 family will use TSMC's 65nm process node.  The company introduced its first 65nm processor shrink in November 2007: the G92.

Other details of the D9M family have already surfaced.  ChileHardware published slides yesterday claiming the GeForce 9600 requires a 400W power supply with 26A available on the 12V rail.  Unlike previous mid-range GeForce cards, the D9M will require a 6-pin supplementary power connector.

NVIDIA publicly confirmed other details of D9M: DirectX 10.1 support, Shader Model 4.0, OpenGL 2.1 and PCIe 2.0 support, to name a few. 

Further documentation from NVIDIA claims the 9600 GT will also support the Quantum Effects physics processing engine. 

Like all NVIDIA processors, the GeForce 9600 is also HDCP compatible, though final support still depends on vendor implementation. 

NVIDIA declined to comment on the expected price of the GeForce 9600.  A representative for NVIDIA would say only that the performance increase from the GeForce 8600 to the GeForce 9600 is "almost double."


Comments



RE: finally
By retrospooty on 1/3/2008 3:51:53 PM , Rating: 2
That doesn't change the fact that you can't run Crysis at high settings on this card, or the next-gen mid-range either for that matter, even on a high-end system.


RE: finally
By finelemon on 1/3/2008 8:33:42 PM , Rating: 3
Why would you expect to be able to? There is no one in the pipeline whose job is making sure games run well on current hardware. Game makers will always push for that little bit more than current hardware can do, and hardware will always move along at its own pace regardless of what games would 'like'.

If you want to have someone to care about making sure that a game plays well on your hardware NOW then get a console.


RE: finally
By 1078feba on 1/4/2008 9:17:35 AM , Rating: 3
Provocative argument.

Maybe I'm displaying a bit too much ignorance here, but I really want to know.

If the highest end rigs around, with QX9650s OC'd to 3.6 and dual Ultras on H2O and an NF780i at 1600 FSB, can't run Crysis on a 26-30 inch monitor with all high settings at more than 20-30 FPS, how the hell did Crytek actually develop the game? What did they run it on? A Cray? I mean, how did they actually know that it would look spectacular if they were only able to run it at 800x600?


RE: finally
By suryad on 1/4/08, Rating: 0
RE: finally
By finelemon on 1/5/2008 1:38:12 AM , Rating: 2
They use PCs. They are no different from the gamers themselves. They would alternate between a lower resolution when they want a high frame rate, or put up with a low frame rate to see it in high-res. E.g. no one in the world has seen Crysis running at 2048x1024 at 100 FPS.


RE: finally
By retrospooty on 1/5/2008 1:42:43 PM , Rating: 3
"If the highest end rigs around, with QX9650's OC'd to 3.6 and dual ultras on H2O and an NF780i at 1600 FSB can't run Crysis on a 26-30 inch monitor with all high settings at more than 20-30 FPS, how the hell did Crytek actually develop the game?"

The answer is that it was not tested on a platform that can run it at playable speeds and highest settings, since this platform does not yet exist. It's a good thing to add higher settings for future systems. I wish more games did that.

With all that said, if you ask me, Crysis is not well optimized at all. Q4 and UT3 run very well at the highest settings with 4xAA enabled. Crysis certainly looks better than those games, but not THAT much better. It looks maybe 50% better and runs 300% slower. The juice isn't worth the squeeze.



Copyright 2014 DailyTech LLC.