
NVIDIA's D9M makes its first appearance on corporate roadmaps

NVIDIA's newest mid-range processor, codenamed D9M, will make its official debut as the GeForce 9600 GT.

Corporate guidance from NVIDIA indicates that initial GeForce 9600 GT shipments will come stock with a 650 MHz core clock and a 1625 MHz unified shader clock.  Unlike the G84 core found in the GeForce 8600 GT, D9M will feature a 256-bit memory bus interface.  Coupled with a 900 MHz memory clock, NVIDIA calculates the memory bandwidth at 57.6 GB/s. 
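For context, the quoted 57.6 GB/s follows directly from the bus width and memory clock, assuming double-data-rate (GDDR3) memory, which NVIDIA's arithmetic implies; a minimal sketch:

```python
# Memory bandwidth sketch for the GeForce 9600 GT figures quoted above.
# Assumes GDDR3 (two transfers per clock); not an official NVIDIA formula.
bus_width_bits = 256
memory_clock_mhz = 900
ddr_multiplier = 2  # double data rate: two transfers per clock

bytes_per_transfer = bus_width_bits / 8                      # 32 bytes per transfer
transfers_per_sec = memory_clock_mhz * 1e6 * ddr_multiplier  # 1.8 billion/s effective
bandwidth_gbs = bytes_per_transfer * transfers_per_sec / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")  # 57.6 GB/s
```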

The texture fill rate is estimated at 20.8 billion texels per second.  The company would not indicate how many shaders or stream processors reside on the D9M core. 
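Although NVIDIA would not disclose unit counts, texture fill rate is conventionally core clock multiplied by the number of texture units, so a unit count can be back-calculated from the quoted figures; the implied count of 32 in the sketch below is our inference, not an NVIDIA disclosure:

```python
# Back-calculating the texture unit count from the quoted fill rate.
# fill_rate = core_clock * texture_units is the conventional formula;
# the resulting count is an inference, not an NVIDIA figure.
core_clock_hz = 650e6
fill_rate_texels_per_sec = 20.8e9

implied_texture_units = fill_rate_texels_per_sec / core_clock_hz
print(implied_texture_units)  # 32.0
```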

Late last year, NVIDIA confirmed the D9 family will use TSMC's 65nm process node.  The company introduced its first 65nm die shrink in November 2007: the G92.

Other details of the D9M family have already surfaced.  ChileHardware published slides yesterday claiming the GeForce 9600 GT requires a 400W power supply capable of delivering 26A on the 12V rail.  Unlike previous mid-range GeForce cards, the D9M will require a 6-pin supplementary power connector.
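To put the 26A requirement in context, power available on the 12V rail is simply current times voltage; a quick sketch of the headroom that requirement implies (for the whole system, not the card alone):

```python
# Power headroom on the 12V rail at the required current.
rail_voltage = 12.0   # volts
rail_current = 26.0   # amps, per the ChileHardware slides
rail_power_w = rail_voltage * rail_current
print(f"{rail_power_w:.0f} W")  # 312 W available on the 12V rail
```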

NVIDIA publicly confirmed other details of D9M: DirectX 10.1, Shader Model 4.0, OpenGL 2.1, and PCIe 2.0 support, just to name a few. 

Further documentation from NVIDIA claims the 9600 GT will also support the Quantum Effects physics processing engine. 

Like all of NVIDIA's current processors, the GeForce 9600 GT is also HDCP compatible, though final support still depends on vendor implementation. 

NVIDIA declined to comment on the expected price of the GeForce 9600 GT.  A company representative would say only that the performance increase from the GeForce 8600 to the GeForce 9600 is "almost double."


Comments

RE: Wha T Evarrrr
By Volrath06660 on 1/5/2008 10:05:48 AM, Rating: 2
And that is a completely logical standpoint. The only caveat I would add is that for a GPU to stay viable for three years, it has to have been purchased from the high end of the GPU spectrum. If you buy some mid-level card, its life expectancy is greatly diminished by its initial lack of horsepower. I have always picked up high-end cards when I build, and with the exception of my first GPU, a GeForce FX 5200, they are all still viable, including that old 9800 Pro. I can still run Warhammer 40K: DOW on mid-range settings at 1280x1024 and am still pulling over 30 fps according to Fraps. So high-end cards are worth it if you buy them when they first come out.

And while I do pick up high-end cards, I personally see no reason to get SLI, Crossfire, Quad SLI, etc. systems, not only because they do not scale well from the get-go, but also because they lock you into a purchase for longer than you would really want. I will be sticking with single GPUs until they can get good performance scaling.


RE: Wha T Evarrrr
By Oobu on 1/5/2008 7:31:48 PM, Rating: 2
I'd have to agree. I tried SLI twice and I just see no point, except maybe for bragging rights. Just buy ONE good high-end card and call it a day. I think SLI, Crossfire, and anything else like them are just gimmicks to get your hard-earned cash.


"This is from the DailyTech.com. It's a science website." -- Rush Limbaugh

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki