


NVIDIA's D9M makes its first appearance on corporate roadmaps

NVIDIA's newest mid-range processor, codenamed D9M, will make its official debut as the GeForce 9600 GT.

Corporate guidance from NVIDIA indicates that initial GeForce 9600 GT shipments will come stock with a 650 MHz core clock and a 1625 MHz unified shader clock.  Unlike the G84 core found on the GeForce 8600 GT, D9M will feature a 256-bit memory bus interface.  Coupled with a 900 MHz memory clock, NVIDIA calculates the memory bandwidth at 57.6 GB/s. 
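That figure is easy to sanity-check: bandwidth is bus width times effective memory transfer rate. The sketch below is our own back-of-envelope check, assuming GDDR3-style double data rate (two transfers per clock) on the quoted 900 MHz base clock; the variable names are illustrative, not NVIDIA's.

    # Sanity-check NVIDIA's 57.6 GB/s figure from the quoted specs.
    # Assumes GDDR3-style double data rate (2 transfers per clock);
    # 900 MHz is taken as the base memory clock.
    bus_width_bits = 256
    memory_clock_hz = 900e6
    transfers_per_clock = 2  # DDR assumption

    bandwidth_gb_s = (bus_width_bits / 8) * memory_clock_hz * transfers_per_clock / 1e9
    print(f"{bandwidth_gb_s:.1f} GB/s")  # -> 57.6 GB/s, matching NVIDIA's number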

The texture fill rate is estimated at 20.8 billion texels per second.  The company would not indicate how many shaders or stream processors reside on the D9M core. 
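Texture fill rate is simply the core clock multiplied by the number of texture units, so the quoted figure hints at a unit count even though NVIDIA has not disclosed one. The calculation below is our inference under that assumption, not an official specification.

    # Texture fill rate = texture units x core clock, so the quoted
    # 20.8 GTexel/s figure implies a unit count. NVIDIA has not
    # disclosed this number; the result is an inference, not a spec.
    core_clock_hz = 650e6
    fill_rate_texels_s = 20.8e9

    implied_texture_units = fill_rate_texels_s / core_clock_hz
    print(implied_texture_units)  # -> 32.0 texture units (inferred)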

Late last year, NVIDIA confirmed the D9 family will use TSMC's 65nm process node.  The company introduced its first 65nm processor shrink, the G92, in November 2007.

Other details of the D9M family have already surfaced.  ChileHardware published slides yesterday claiming the GeForce 9600 requires a 400W power supply capable of delivering 26A on the 12V rail.  Unlike previous mid-range GeForce cards, the D9M will require a 6-pin supplementary power connector.

NVIDIA publicly confirmed other details of D9M: DirectX 10.1 support, Shader Model 4.0, OpenGL 2.1, and PCIe 2.0 support, to name a few. 

Further documentation from NVIDIA claims the 9600 GT will also support the Quantum Effects physics processing engine. 

Like all NVIDIA processors, the GeForce 9600 is also HDCP compatible, though final support still depends on vendor implementation. 

NVIDIA declined to comment on the expected price of the GeForce 9600.  A representative for NVIDIA would say only that the performance increase between the GeForce 9600 and GeForce 8600 is "almost double."


Comments



RE: Wha T Evarrrr
By ZiggyDeath on 1/3/2008 3:31:59 PM , Rating: 5
I take offense to the "real gamer" comment, especially since I am a computer gamer. When Counterstrike was still in its beta I would rack up over 16 hours a day during the summer. When WoW came out I was in an end-game raiding guild for over two years, racking up over 20 hours of raiding a week, not to mention the regular grind. And while I have diverged from strictly PC gaming into the console dominion, there has been a trend for years in the cat and mouse played between video cards and games.

When Doom was released, it could be played at the lowest settings by many cards that were already years old; the sacrifice was image quality. But even high-end cards had difficulty with the highest settings until the next generation came to market.

Now we're seeing this again with Crysis. I plunked down the money for a 7900GT when it first hit the market. That card is just reaching its second birthday and it can barely handle Crysis at Medium settings at 12x10. That was considered the lowest of the high-end cards (before the GS hit the market) that Nvidia had to offer. What game ate that card alive? Oblivion. It was impossible to turn everything to the max even at a paltry 12x10. Did I complain? No. Was I disappointed? Yes. But I had been gaming for years and I knew that there are always sacrifices to be made. That, or hardware isn't keeping pace with software.

Now you're complaining about not being able to run the native rez of a 20-inch flat panel. Well, sorry; how about I complain about not being able to run the default resolution of a 20-inch CRT, 1600x1200, back 10 years ago? Where's the justice in that? "WAH WAH, my games can't play at 16x12 on a video card over 7 years old." No sir, that makes no sense.

While I would agree that it's purpose-defeating to run a game at anything but the native resolution of an LCD panel due to the drop in image quality, the fact is you bought into new monitor technology and you should be aware of its limitations.

Also, if you had done any research at all, you'd find that even the 8800GTX/Ultra in triple SLI still gets wrecked by Crysis, so I don't see why you're crying about a new mainstream card failing. Sorry, but the jump from the G80 to the G90 isn't the same as the FX-to-6x00 series jump; the last time there was a leap like that was from the 6800 to the 7600.

Marketing fanboy, heh, that's the first time I've been called that. But in any case, what you're asking for is like saying "I want my N64 to be able to play GC games."


RE: Wha T Evarrrr
By mdogs444 on 1/3/2008 3:35:10 PM , Rating: 2
quote:
When Counterstrike was still in its beta I would rack up over 16 hours a day during the summer. When WoW came out I was in an end-game raiding guild for over two years, racking up over 20 hours of raiding a week, not to mention the regular grind.

Try not to take offense; I'm just trying to be comical... but I honestly have no idea what the heck you just said, outside of the fact that you have way too much free time.


RE: Wha T Evarrrr
By ZiggyDeath on 1/3/2008 4:12:51 PM , Rating: 2
I actually do have too much free time!


RE: Wha T Evarrrr
By Crystalysis on 1/10/2008 2:17:52 PM , Rating: 2
Well, it's either video games or nuclear underwater basket weaving lessons. The only two activities that are known to exist.

Me? I like both. :)


RE: Wha T Evarrrr
By FITCamaro on 1/3/2008 8:44:49 PM , Rating: 2
quote:
When Counterstrike was still in its beta I would rack up over 16 hours a day during the summer. When WoW came out I was in an end-game raiding guild for over two years, racking up over 20 hours of raiding a week, not to mention the regular grind.


This is not something to be proud of and brag about.


RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By Volrath06660 on 1/5/2008 10:08:14 AM , Rating: 2
Agreed sir.


"Young lady, in this house we obey the laws of thermodynamics!" -- Homer Simpson

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki