NVIDIA Details GeForce 9600 GT
January 3, 2008 11:51 AM
NVIDIA's D9M makes its first appearance on corporate roadmaps
NVIDIA's newest mid-range processor, the D9M, will make its official debut as the GeForce 9600 GT.
Corporate guidance from NVIDIA indicates initial GeForce 9600 GT shipments will come stock with a 650 MHz core clock and a 1625 MHz unified shader clock. Unlike the G84 core found on the GeForce 8600 GT, D9M will feature a 256-bit memory bus interface. Coupled with a 900 MHz memory clock, NVIDIA calculates the memory bandwidth at 57.6 GB/s.
The texture fill rate is estimated at 20.8 billion texels per second. The company would not indicate how many shaders or stream processors reside on the D9M core.
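For readers checking the math: the bandwidth figure follows directly from the bus width and memory clock, and the fill-rate figure implies a particular texture unit count. The quick Python sketch below reproduces both numbers; the DDR multiplier and the 32-texture-unit count are assumptions on our part, since NVIDIA has not disclosed unit counts.

```python
# Back-of-the-envelope sketch reproducing NVIDIA's quoted figures.
# Assumptions (not NVIDIA disclosures): the memory transfers twice per
# clock (DDR), and the fill-rate figure implies 32 texture units.

BUS_WIDTH_BITS = 256   # D9M memory interface width
MEM_CLOCK_MHZ = 900    # memory clock from NVIDIA's guidance
DDR_MULTIPLIER = 2     # two transfers per clock (assumed GDDR3)

bandwidth_gb_s = (BUS_WIDTH_BITS / 8) * (MEM_CLOCK_MHZ * 1e6) * DDR_MULTIPLIER / 1e9
print(f"Memory bandwidth: {bandwidth_gb_s:.1f} GB/s")   # -> 57.6 GB/s

CORE_CLOCK_MHZ = 650   # stock core clock
TEXTURE_UNITS = 32     # hypothetical; back-solved from the 20.8 figure

fill_rate_gt_s = (CORE_CLOCK_MHZ * 1e6) * TEXTURE_UNITS / 1e9
print(f"Texture fill rate: {fill_rate_gt_s:.1f} GTexels/s")  # -> 20.8
```

If the 20.8 GTexel/s estimate is accurate, 32 texture units is the count consistent with a 650 MHz core clock, though NVIDIA has confirmed neither number.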
Late last year, NVIDIA confirmed the D9 family will use TSMC's 65nm process node. The company introduced its first 65nm processor shrink, the G92, in November 2007.
Other details of the D9M family have already surfaced. Slides published yesterday claim the GeForce 9600 requires a 400W power supply with 26A available on the 12V rail. Unlike previous mid-range GeForce cards, the D9M will require a 6-pin supplementary power connector.
NVIDIA publicly confirmed other details of D9M: DirectX 10.1, Shader Model 4.0, OpenGL 2.1, and PCIe 2.0 support, just to name a few.
Further documentation from NVIDIA claims the 9600 GT will also support the Quantum Effects physics processing engine.
Like all current NVIDIA processors, the GeForce 9600 is HDCP capable, though final support still depends on vendor implementation.
NVIDIA declined to comment on the expected price of the GeForce 9600. A representative for NVIDIA would only say that the performance increase from the GeForce 8600 to the GeForce 9600 is "almost double."
1/5/2008 1:42:43 PM
"If the highest end rigs around, with QX9650's OC'd to 3.6 and dual ultras on H2O and an NF780i at 1600 FSB can't run Crysis on a 26-30 inch monitor with all high settings at more than 20-30 FPS, how the hell did Crytek actually develop the game?"
The answer is that it was not tested on a platform that can run it at playable speeds and the highest settings, since that platform does not yet exist. It's a good thing to add higher settings for future systems. I wish more games did that.
With all that said, if you ask me, Crysis is not well optimized at all. Q4 and UT3 run very well at the highest settings with 4xAA enabled. Crysis certainly looks better than those games, but not THAT much better. It looks about 50% better and runs 300% slower. The juice isn't worth the squeeze.
"You can bet that Sony built a long-term business plan about being successful in Japan and that business plan is crumbling." -- Peter Moore, 24 hours before his Microsoft resignation
"Prepare to be Punished": Microsoft is Killing OneDrive With Cuts, Blames Users
November 3, 2015, 8:23 PM
Apple's New "Magic" Peripheral Line Packs High Tech, High Prices
October 13, 2015, 9:39 PM
Samsung Adds 2 TB 850 EVO, PRO SSDs for $800, $1000
July 7, 2015, 4:23 PM
Seagate Senior Researcher: Heat Can Kill Data on Stored SSDs
May 13, 2015, 2:49 PM
How to Recover Most Apps After Your NVIDIA Driver Crashes in Windows 10
March 30, 2015, 12:54 PM
Tinkerer Gets Old School Mac Plus Running on the Modern Web
March 24, 2015, 6:41 PM
Latest Blog Posts
Sceptre Airs 27", 120 Hz. 1080p Monitor/HDTV w/ 5 ms Response Time for $220
Dec 3, 2014, 10:32 PM
Costco Gives Employees Thanksgiving Off; Wal-Mart Leads "Black Thursday" Charge
Oct 29, 2014, 9:57 PM
"Bear Selfies" Fad Could Turn Deadly, Warn Nevada Wildlife Officials
Oct 28, 2014, 12:00 PM
The Surface Mini That Was Never Released Gets "Hands On" Treatment
Sep 26, 2014, 8:22 AM
ISIS Imposes Ban on Teaching Evolution in Iraq
Sep 17, 2014, 5:22 PM
More Blog Posts
Copyright 2016 DailyTech LLC. -
Terms, Conditions & Privacy Information