


NVIDIA's D9M makes its first appearance on corporate roadmaps

NVIDIA's newest mid-range processor, codenamed D9M, will make its official debut as the GeForce 9600 GT.

Corporate guidance from NVIDIA indicates the initial GeForce 9600 GT shipments will come stock with a 650 MHz core clock and a 1625 MHz unified shader clock.  Unlike the G84 core found on the GeForce 8600 GT, D9M will feature a 256-bit memory bus interface.  Coupled with a 900 MHz memory clock, NVIDIA puts the memory bandwidth at 57.6 GB/s. 
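
A quick way to verify that figure: memory bandwidth follows from the memory clock, the transfers per clock, and the bus width.  The short Python sketch below assumes GDDR3 transferring data twice per clock, which the guidance does not state explicitly.

# Sanity check of NVIDIA's quoted 57.6 GB/s figure
# (assumes GDDR3 running at double data rate -- an assumption, not from the guidance)
memory_clock_mhz = 900     # base memory clock
ddr_multiplier = 2         # two transfers per clock for GDDR3
bus_width_bits = 256       # D9M memory interface

bandwidth_gb_s = memory_clock_mhz * 1e6 * ddr_multiplier * (bus_width_bits / 8) / 1e9
print(f"Memory bandwidth: {bandwidth_gb_s:.1f} GB/s")  # prints 57.6 GB/s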

The texture fill rate is estimated at 20.8 billion texels per second.  The company would not indicate how many stream processors (shaders) reside on the D9M core. 
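
Although NVIDIA is not disclosing unit counts, the fill-rate figure hints at one, under the common back-of-the-envelope assumption that texture fill rate equals core clock multiplied by texture units.  The implied count below is an inference, not a confirmed specification.

# Implied texture-unit count, assuming fill rate = core clock x texture units
# (a rule of thumb; NVIDIA has not confirmed the unit count for D9M)
core_clock_hz = 650e6
fill_rate_texels_s = 20.8e9

implied_texture_units = fill_rate_texels_s / core_clock_hz
print(f"Implied texture units: {implied_texture_units:.0f}")  # prints 32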

Late last year, NVIDIA confirmed the D9 family will use TSMC's 65nm process node.  The company introduced its first 65nm processor shrink in November 2007: the G92.

Other details of the D9M family have already surfaced.  ChileHardware published slides yesterday claiming the GeForce 9600 requires a 400W power supply capable of delivering 26A on the 12V rail.  Unlike previous mid-range GeForce cards, the D9M will require a 6-pin supplementary power connector.

NVIDIA publicly confirmed other details of D9M: DirectX 10.1, Shader Model 4.0, OpenGL 2.1 and PCIe 2.0 support, just to name a few. 

Further documentation from NVIDIA claims the 9600 GT will also support the Quantum Effects physics processing engine. 

Like all NVIDIA processors, the GeForce 9600 is also HDCP compatible, though final support still depends on vendor implementation. 

NVIDIA declined to comment on the expected price of the GeForce 9600.  However, a company representative did state that the performance increase between the GeForce 9600 and the GeForce 8600 is "almost double."


Comments



RE: Wha T Evarrrr
By mindless1 on 1/4/2008 10:41:48 PM , Rating: 2
Yes, there is an "in between": an occasional gamer who, because of that, doesn't feel like paying for a higher-mid to high-end card can still game at 30 FPS.

Can they do it on the most demanding new game with high levels of eyecandy? Usually not, but that doesn't mean they can't enjoy many last-generation games. You don't HAVE to crank up eyecandy and play above 1280x1024 to enjoy something, and my old 1280x monitor plays plenty of games staying above 30 FPS "almost" always with a reused 7600GT video card. For that matter, back when Half Life 2 came out, I tried it on an overclocked nForce2 IGP chipset. It ran in DX7 mode and I could see the difference, but I decided to try playing the game on that system, and it was fun enough that I played through the whole game again.

Unfortunately, this is where many benchmarks fail: advising customers how the low-mid to higher-midrange cards stack up, not at the exact same eyecandy and resolution as the higher-end cards, but rather with what concessions would need to be made to game at that 30 FPS minimum mark. Actually, a better index would be whether the frame rate doesn't drop below 20 FPS more than 5% of the time and stays above 40 FPS on average.
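
Expressed as code, that index might look something like the rough Python sketch below; the 20 FPS, 5%, and 40 FPS thresholds are the ones suggested above, and the sample frame rates are made up.

# Rough sketch of the suggested index: playable if the frame rate dips below
# 20 FPS no more than 5% of the time and the average stays above 40 FPS.
def playable(frame_rates, min_fps=20, max_dip_fraction=0.05, avg_fps=40):
    dips = sum(1 for fps in frame_rates if fps < min_fps) / len(frame_rates)
    average = sum(frame_rates) / len(frame_rates)
    return dips <= max_dip_fraction and average > avg_fps

print(playable([45, 52, 38, 61, 44, 47, 55]))  # True for this made-up sample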

If you want to have it all, just pay more for a different card. It's that simple. nVidia and ATI are going to market such cards to gamers even if they don't meet your standard of gaming because, truth be told, they can play "some" games, and there would be little point to them without gaming since integrated video will do practically all non-gaming, 2D tasks outside of some HD video decoding.


"What would I do? I'd shut it down and give the money back to the shareholders." -- Michael Dell, after being asked what to do with Apple Computer in 1997
