
NVIDIA's D9M makes its first appearance on corporate roadmaps

NVIDIA's newest mid-range processor, codenamed D9M, will make its official debut as the GeForce 9600 GT.

Corporate guidance from NVIDIA indicates that initial GeForce 9600 GT shipments will come stock with a 650 MHz core clock and a 1625 MHz unified shader clock.  Unlike the G84 core found on the GeForce 8600 GT, D9M will feature a 256-bit memory bus interface.  Coupled with a 900 MHz memory clock, NVIDIA calculates the memory bandwidth at 57.6 GB/s.
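
That figure follows from the bus width and an effective double-data-rate memory clock.  A minimal sketch of the arithmetic in Python, assuming GDDR3-style DDR signalling (NVIDIA's guidance does not name the memory type, so the doubling is an assumption):

    # Memory bandwidth = effective transfer rate x bytes per transfer
    bus_width_bits = 256
    memory_clock_mhz = 900
    effective_rate_mts = memory_clock_mhz * 2     # assumes DDR: two transfers per clock
    bytes_per_transfer = bus_width_bits // 8      # 32 bytes across the 256-bit bus
    bandwidth_gb_s = effective_rate_mts * bytes_per_transfer / 1000
    print(bandwidth_gb_s)                         # 57.6, matching NVIDIA's quoted GB/s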

The texture fill rate is estimated at 20.8 billion texels per second.  The company would not indicate how many shaders or stream processors reside on the D9M core.
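
Texture fill rate is normally the core clock multiplied by the number of texture units, so the published figure hints at a unit count even though NVIDIA would not confirm one.  A rough back-calculation, offered only as an inference from the quoted numbers:

    # Texture fill rate = core clock x texture units, so work backwards (inference only)
    fill_rate_gtexels = 20.8      # billion texels per second, per NVIDIA's estimate
    core_clock_ghz = 0.650
    implied_texture_units = fill_rate_gtexels / core_clock_ghz
    print(implied_texture_units)  # 32.0 -- would suggest 32 texture units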

Late last year, NVIDIA confirmed that the D9 family will use TSMC's 65nm process node.  The company introduced its first 65nm processor shrink, the G92, in November 2007.

Other details of the D9M family have already surfaced.  ChileHardware published slides yesterday claiming the GeForce 9600 requires a 400W power supply capable of delivering 26A on the 12V rail.  Unlike previous mid-range GeForce cards, the D9M will require a 6-pin supplementary power connector.
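
The 26A requirement works out to roughly 312W of capacity on the 12V rail alone, which is consistent with the overall 400W recommendation.  A quick check of that arithmetic:

    # 12V-rail capacity implied by the slides' 26A requirement
    rail_voltage = 12   # volts
    rail_current = 26   # amps, per the ChileHardware slides
    print(rail_voltage * rail_current)  # 312 W available on the 12V rail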

NVIDIA publicly confirmed other details of D9M: DirectX 10.1, Shader Model 4.0, OpenGL 2.1, and PCIe 2.0 support, just to name a few.

Further documentation from NVIDIA claims the 9600 GT will also support the Quantum Effects physics processing engine. 

Like all NVIDIA processors, the GeForce 9600 is also HDCP compatible, though final support still depends on vendor implementation. 

NVIDIA declined to comment on the expected price of the GeForce 9600.  A representative for NVIDIA would only say that the performance increase from GeForce 8600 to GeForce 9600 is "almost double."


Comments



RE: Wha T Evarrrr
By roadrun777 on 1/3/2008 2:51:29 PM , Rating: -1
quote:
You sir, have certainly made an invalid point. First of all... 30 fps at WHAT resolution and WITH OR WITHOUT AA/AF and at what QUALITY? There are some mid-range cards that can play on the lowest settings and get more than 30fps on demanding games. It really comes down to resolution, graphics quality and if you use anisotropic filtering or antialiasing and such. If you want to play Crysis on a 24" screen at basically full HD resolution on a very high quality then you are going to have to dish out the bucks. Same reason you have to dish out the bucks if you want to watch your movies in 1080p instead of 720p or 480p. Go out and get yourself an 8800GTX or wait for the upcoming 9 series and then get an enthusiast card. You'll get your 30fps.


You sir are a baboon with a blandly colored rear end!
To say that I am somehow an enthusiast and should have to pay excessive prices just to play a game at 30fps at native panel resolutions is really ridiculous. You sound like a marketing fanboy. If I buy card "A" from company "A" and it can't play any game at standard panel resolutions that have been around for 5 years now (my 20" panel I have owned for 5 years), without any AA settings, then what am I to think? You would have me think that it's a privilege to be able to play a game at native panel resolutions, and one that I should bleed cash for, and consume large amounts of electricity in the process. I only see the fact that I have had my panel for 5 years and have been unable to play any game at acceptable frame rates, because of the lack of innovation from said companies. So go ahead and keep regurgitating marketing BS, I will not listen! If it can't keep every game I play from dipping below 30fps at native panel resolutions then I consider it false advertising to say it's a gaming card. It is NOT a privilege to play games at higher resolutions! It is a marketing ploy to make more money with stagnant technology. So go show your blandly colored arse somewhere else.


RE: Wha T Evarrrr
By munky on 1/3/2008 4:06:01 PM , Rating: 2
So you expect to have a mainstream card that plays Crysis at full settings with no less than 30fps? If you wait for about 3-4 years, you might get your wish. Games keep increasing in visual complexity every year, and when you have a game that renders over a million polygons per frame, along with complex shaders, it's gonna kill any mainstream card, regardless of what resolution you use.


RE: Wha T Evarrrr
By johnadams on 1/4/2008 5:29:23 AM , Rating: 1
Hahaha watching you two go at each other is entertaining!


RE: Wha T Evarrrr
By johnadams on 1/4/2008 5:29:46 AM , Rating: 2
I meant "reading".

