


NVIDIA's D9M makes its first appearance on corporate roadmaps

NVIDIA's newest mid-range processor, codenamed D9M, will make its official debut as the GeForce 9600 GT.

Corporate guidance from NVIDIA states that initial GeForce 9600 GT shipments will come stock with a 650 MHz core clock and a 1625 MHz unified shader clock. Unlike the G84 core found on the GeForce 8600 GT, D9M will feature a 256-bit memory bus interface. Coupled with a 900 MHz memory clock, NVIDIA calculates the memory bandwidth at 57.6 GB/s.

The texture fill rate is estimated at 20.8 billion texels per second. The company would not indicate how many texture units or stream processors reside on the D9M core.
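For reference, both quoted figures follow directly from the published clocks. Below is a minimal sketch of the arithmetic, assuming GDDR3's double data rate and a 32-texture-unit count that is inferred from the quoted fill rate rather than confirmed by NVIDIA:

# Back-of-the-envelope check of NVIDIA's quoted figures (assumptions noted inline).

bus_width_bits = 256      # D9M memory interface width
mem_clock_mhz = 900       # base memory clock; GDDR3 transfers data twice per clock
core_clock_mhz = 650      # stock core clock

# 256 bits / 8 = 32 bytes per transfer; 900 MHz x 2 = 1800 MT/s effective
bandwidth_gb_s = (bus_width_bits / 8) * (mem_clock_mhz * 2) / 1000
print(f"Memory bandwidth: {bandwidth_gb_s:.1f} GB/s")           # 57.6 GB/s

assumed_texture_units = 32  # hypothetical: implied by 20.8 / 0.65, not confirmed by NVIDIA
fill_rate_gtexels = (core_clock_mhz / 1000) * assumed_texture_units
print(f"Texture fill rate: {fill_rate_gtexels:.1f} Gtexels/s")  # 20.8 Gtexels/s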

Late last year, NVIDIA confirmed the D9 family will use TSMC's 65nm process node. The company introduced its first 65nm processor shrink, the G92, in November 2007.

Other details of the D9M family have already surfaced. ChileHardware published slides yesterday claiming the GeForce 9600 requires a 400W power supply capable of delivering 26A on the 12V rail. Unlike previous mid-range GeForce cards, the D9M will require a 6-pin supplementary power connector.

NVIDIA has publicly confirmed other details of D9M: DirectX 10.1, Shader Model 4.0, OpenGL 2.1 and PCIe 2.0 support, just to name a few.

Further documentation from NVIDIA claims the 9600 GT will also support the Quantum Effects physics processing engine. 

Like all NVIDIA processors, the GeForce 9600 is also HDCP compatible, though final support still depends on vendor implementation. 

NVIDIA declined to comment on the expected price of the GeForce 9600. A company representative did say that the performance increase from the GeForce 8600 to the GeForce 9600 is "almost double."


Comments

Wha T Evarrrr
By roadrun777 on 1/3/2008 1:31:34 PM , Rating: -1
All these arguments and discussions miss the point entirely!!!
If you can't give me a card that can do a minimum of 30fps (MINIMUM!!!) and never dip below that, then what good is it?
I don't want more slide-show crud from a company that is only tweaking six-year-old technology to squeeze a little more out of it and squeeze our pocketbooks once again.
I haven't upgraded in 5 years because of this. You only have two categories: people who DO NOT play games and people who DO play games. There is no in between, despite marketing execs trying so hard to create imaginary tiers so they can make even more money.




RE: Wha T Evarrrr
By marsbound2024 on 1/3/08, Rating: 0
RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By munky on 1/3/2008 4:06:01 PM , Rating: 2
So you expect to have a mainstream card that plays Crysis at full settings with no less than 30fps? If you wait for about 3-4 years, you might get your wish. Games keep increasing in visual complexity every year, and when you have a game that renders over a million polygons per frame, along with complex shaders, it's gonna kill any mainstream card, regardless of what resolution you use.


RE: Wha T Evarrrr
By johnadams on 1/4/2008 5:29:23 AM , Rating: 1
Hahaha watching you two go at each other is entertaining!


RE: Wha T Evarrrr
By johnadams on 1/4/2008 5:29:46 AM , Rating: 2
I meant "reading".


RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By ZiggyDeath on 1/3/2008 2:47:23 PM , Rating: 2
Go buy a console, you'll get your "30fps" in all games. Your vague and over-encompassing views would dictate that a video card should be able to run all games at a minimum of 30fps without specifying resolution (other than 10x7), AA, AS, HDR, and other processing-intensive eye candy. Therefore the only thing that could possibly get close to what you're asking for is a console.


RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By ZiggyDeath on 1/3/2008 3:31:59 PM , Rating: 5
I take offense at the "real gamer" comment, especially since I am a computer gamer. When Counter-Strike was still in its beta I would rack up over 16 hours a day during the summer. When WoW came out I was in an end-game raiding guild for over 2 years, racking up over 20 hours of raiding a week, not to mention the regular grind. And while I have diverged from strictly PC gaming into the console domain, there has been a cat-and-mouse trend between video cards and games for years.

When Doom was released, it could be played at the lowest settings on many cards that were already years old; the sacrifice was image quality. But even high-end cards had difficulty with the highest settings until the next generation came to market.

Now we're seeing this again with Crysis. I plunked down the money for a 7900GT when it had just hit the market. That card is just reaching its second birthday and it can barely handle Crysis at Medium settings at 12x10. It was considered the lowest of the high-end cards (before the GS hit the market) that NVIDIA had to offer. What game ate that card alive? Oblivion. It was impossible to turn everything up to the max even at a paltry 12x10. Did I complain? No. Was I disappointed? Yes. But I had been gaming for years and I knew that there are always sacrifices to be made. That, or hardware isn't keeping pace with software.

Now you're complaining about not being able to do the native rez of a 20-inch flat panel. Well, sorry, how about I complain about not being able to do the default 1600x1200 resolution of a 20-inch CRT back 10 years ago? Where's the justice in that? WAH WAH, my games can't play at 16x12 on a video card over 7 years old. No sir, that makes no sense.

While I would agree that it's self-defeating to run a game at anything but the native resolution of an LCD panel due to the drop in image quality, the fact is you bought into new monitor technology and you should be aware of its limitations.

Also, if you had done any research at all, you'd find that the 8800GTX/Ultra in triple SLI still gets wrecked by Crysis. So I don't see why you're crying about a new mainstream card failing. Sorry, but the jump from the G80 to the G90 isn't the same as the FX-to-6x00 series jump; the last time there was such a leap was from the 6800 to the 7600.

Marketing fanboy, heh, that's the first time I've been called that. But in any case, what you're asking for is essentially saying "I want my N64 to be able to play GC games."


RE: Wha T Evarrrr
By mdogs444 on 1/3/2008 3:35:10 PM , Rating: 2
quote:
When Counter-Strike was still in its beta I would rack up over 16 hours a day during the summer. When WoW came out I was in an end-game raiding guild for over 2 years, racking up over 20 hours of raiding a week, not to mention the regular grind.

Try not to take offense, I'm just trying to be comical... but I honestly have no idea what the heck you just said, other than that you have way too much free time.


RE: Wha T Evarrrr
By ZiggyDeath on 1/3/2008 4:12:51 PM , Rating: 2
I actually do have too much free time!


RE: Wha T Evarrrr
By Crystalysis on 1/10/2008 2:17:52 PM , Rating: 2
Well, it's either video games or nuclear underwater basket weaving lessons. The only two activities that are known to exist.

Me? I like both. :)


RE: Wha T Evarrrr
By FITCamaro on 1/3/2008 8:44:49 PM , Rating: 2
quote:
When Counter-Strike was still in its beta I would rack up over 16 hours a day during the summer. When WoW came out I was in an end-game raiding guild for over 2 years, racking up over 20 hours of raiding a week, not to mention the regular grind.


This is not something to be proud of and brag about.


RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By Volrath06660 on 1/5/2008 10:08:14 AM , Rating: 2
Agreed sir.


RE: Wha T Evarrrr
By Villains on 1/3/2008 2:59:11 PM , Rating: 2
You sound like some pissed-off senior citizen who can't afford to upgrade and wants to play Crysis on a 6600. You make zero sense.


RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By Holly on 1/3/2008 4:30:55 PM , Rating: 2
And Jan Werich said:
"The worst crash in life is a crash with an idiot."


RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By lambofgode3x on 1/3/2008 4:34:58 PM , Rating: 2
You are an absolute idiot; just reading the stuff you write makes my head hurt. You have a slow, old computer and you expect to play recent games at native resolutions? Only if your resolution is literally 3 by 2; other than that, it's not going to happen. I have a 2900 Pro video card, and even though it flies through games, it still has trouble in some. Here's a hint for you: HARDWARE CANNOT KEEP UP WITH SOFTWARE. There is no modern GPU that can run Crysis smoothly at the moment. It happens every time.


RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By Volrath06660 on 1/3/08, Rating: 0
RE: Wha T Evarrrr
By roadrun777 on 1/4/08, Rating: -1
RE: Wha T Evarrrr
By madoka on 1/3/2008 7:07:09 PM , Rating: 2
Tomorrow roadrun777 goes to the car forums and demands his 1998 Geo Metro Sport be able to do 0-60 in 3.5 seconds, or else it was false advertising and we're all suckers for buying new cars with stagnant technology.


RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By mindless1 on 1/4/2008 10:41:48 PM , Rating: 2
Yes, there is an "in between": an occasional gamer who, for that very reason, doesn't feel like paying for a higher-mid to high-end card can still game at 30 FPS.

Can they do it on the most demanding new game with high levels of eye candy? Usually not, but that doesn't mean they can't enjoy many last-generation games. You don't HAVE to crank up the eye candy and play above 1280x1024 to enjoy something, and my old 1280x monitor plays plenty of games, staying above 30 FPS "almost" always, with a reused 7600GT video card. For that matter, back when Half-Life 2 came out I tried it on an overclocked nForce2 IGP chipset. It ran in DX7 mode and I could see the difference, but I decided to try playing the game on that system and it was fun enough that I played through the game again.

Unfortunately, this is where many benchmarks fail: advising customers how the low-mid to upper-midrange cards stack up, not at the exact same eye candy and resolution as the higher-end cards, but rather in terms of what concessions would need to be made to game at that 30 FPS minimum mark. Actually, a better index would be whether the card doesn't drop below 20 FPS more than 5% of the time and stays above 40 FPS on average.
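A minimal sketch of that suggested index, assuming per-frame timings (in milliseconds) captured by a tool such as Fraps; the 20 FPS floor, 5% tolerance, and 40 FPS average come from the comment above, and the function name is purely illustrative:

def passes_minimum_experience(frame_times_ms, floor_fps=20, avg_fps=40, tolerance=0.05):
    # Convert per-frame times (ms) into instantaneous FPS samples.
    fps_samples = [1000.0 / t for t in frame_times_ms if t > 0]
    below_floor = sum(1 for fps in fps_samples if fps < floor_fps)
    average = sum(fps_samples) / len(fps_samples)
    # Pass if no more than 5% of frames dip below the floor and the average stays above 40 FPS.
    return below_floor / len(fps_samples) <= tolerance and average > avg_fps

# Example: a run hovering around 45 FPS with a handful of dips still passes.
print(passes_minimum_experience([22] * 95 + [60] * 5))  # True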

If you want to have it all, just pay more for a different card. It's that simple. NVIDIA and ATI are going to market such cards to gamers even if they don't meet your standard of gaming because, truth be told, they can play "some" games, and there would be little point to them without gaming, since integrated video will handle practically all non-gaming 2D tasks outside of some HD video decoding.


RE: Wha T Evarrrr
By MBlueD on 1/9/2008 5:21:47 AM , Rating: 1
While it does seem that you're a troll on a mission, I think it would be fun to participate :)

quote:
My point is that there is no "average" gamer. Either you get fluid motion in a game or you do not.

Another poster mentioned a valid point: "fluid motion" by itself is an ambiguous term. I can't get fluid motion running Crysis on my 1680x1050 22" LCD on high settings. That is true. But I can get very fluid motion running Pro Evolution Soccer 2008 with everything on high. Pro Evo also runs with fluid motion on the same LCD on a medium-range card. So if I only care to play Pro Evolution, a medium-range card is probably going to serve me well, especially considering the price difference relative to a high-end card. See? Here's a use for midrange cards.
Now to the definition of "average gamer". A gamer who only cares to play Pro Evo, for example, can be described as an average gamer. I, for example, want to play every (good) RPG or FPS that comes out. I don't consider myself an "average" gamer, so when I buy a card, I don't consider midrange unless the price of high-end cards is beyond my reach. Average gamers are defined by how much they care about gaming and the quality of their gaming experience. There probably are people willing to play Crysis at all-low settings at a resolution lower than their LCD's native resolution; these are "average" gamers (or enthusiast gamers without the budget).

quote:
It basically means you will get a slide show of the game you thought you were going to play

If you want to buy a card to play a particular game, it is your responsibility to do some research on what CAN run that game. If you go and buy the first box that says "High performance gaming" and expect it to play your game, it's your problem. As much as I do hate marketing terms, it's not their responsibility if you "misinterpret" their claims; I'm pretty sure they can find an old game where their card will run at very high fps. They will call that "high performance gaming" and prove their claim true. Even if they claimed you could play Crysis at high frame rates, once challenged, they would run it at (e.g.) 640x480 with all settings at their lowest.

My point is not to defend marketing claims - it's just that it's our responsibility as consumers to do our homework.

quote:
My solution is to implement "minimum experience ratings", which were suggested many years ago as a way to gauge a card's performance. I still think that is the way to go.

That'd be great!

quote:
And I do think cars are stagnant, I mean combustion engines? come on... The technology is so old that the paper the patent was written on turned to dust already if that helps your small monkey mind.

Innovation is not something that comes when one wants it. When innovation comes, it comes. Also, please note that car manufacturers (and graphics card manufacturers) are not research companies; they are businesses, and thus profit is at the top of their list. You can't demand innovation from someone, unless you are rich enough to fund an R&D operation to develop an alternative technology for something you are tired of, and even then you run a significant risk of spending ALL your money and still coming out with nothing useful.
A technology does not only need to pay for itself; it also needs to pay for the failed attempts. Again, these companies are there to make a profit for themselves first and foremost.

I do think graphics cards have changed greatly in the last five years. Please take into consideration that the absence of "visible" innovations doesn't mean no innovations happened at all. You make it sound like you really expected a jump to quantum processing technology for GPUs, and perhaps anti-gravity for air and land craft, all in 5 years!


RE: Wha T Evarrrr
By IGoodwin on 1/10/2008 5:42:54 PM , Rating: 1
To determine that something is false, you have to have a statement and conditions against which to measure it. That kind of linguistic gymnastics is better left to advertisements, or lawyers.

Certainly, to use the car analogy, a 1998 Geo Metro Sport can do 0-60 in under 3.5 seconds! If I found a cliff high enough, it could do the quarter mile pretty quickly too.

Conversely, placing four heavy people in the car would slow it down, making any performance claim false unless the correct conditions are applied and the appropriate professional driver employed.

Therefore, claiming a minimum specification, like 30fps, is pointless, as it cannot be guaranteed by the performance of the graphics card alone. The processor, motherboard, memory, and the display all alter the dynamic, not to mention the operating system, drivers, and the game itself.

The graphics card makers do state their performance in terms that cannot be affected by external factors, and in that way they are not producing false advertising, whereas claiming 30 fps in all games would be false.


RE: Wha T Evarrrr
By Lightnix on 1/3/2008 10:20:49 PM , Rating: 2
Wow, I read this and signed up for an account.

This post is ridiculous.

You will be able to play most games, including Crysis at minimum settings, at most panel resolutions with this card... but that's not my point. There is such a thing as a casual gamer: a person willing to turn down the eye candy for an hour or two of gameplay every now and then just for fun. They don't have to turn all settings up to maximum; it's just a bit of a jolly. Maybe somebody who was told by their friends that Portal was an awesome game, so they thought they'd try it.

Some people even have these funny old monitors called "CRTs", on which reducing the resolution is barely noticeable. If you want to play at bigger, badder resolutions with all the eye candy turned up because you think a 5-year-old LCD monitor is the best way to do it, then you're gonna have to shell out a bit for a higher-end card.

As for the console resolution thing I saw elsewhere in this comment section, the two consoles with comparably current-generation hardware (PS3 and Xbox 360) can render games at up to 1080p, or correctly stated, 1920x1080. They can also do 720p (1280x720), and even bog-standard TV resolution (which differs depending on whether you're American or anyone else). Before anyone says it, yes, the 360 can do 1080p when games are coded for it, and yes, the newer models do in fact have real HDMI ports.

By the way, a card that is roughly 5 years old, for example the Radeon 9800... Well the GeForce 6600 GT was about twice as fast as that. The 7600 GT more than twice the speed of that. The 8600 GT was a bit of a disappointment and only matched the 7600 GT, and the 9600 GT has specs that make it roughly twice as powerful as the 8600 GTS.

That's all.


RE: Wha T Evarrrr
By Oobu on 1/4/2008 2:41:37 AM , Rating: 2
I don't agree with everything roadrun777 said, but I do agree that companies shouldn't be able to get away with selling cards like the ATI 9200SE or the NVIDIA 5500 at places like Wal-Mart in boxes that quite clearly say "High Performance Gaming" on the side. I guess it's all relative to the gamer's opinion, but in my opinion, those cards are in no way "high performance gaming" cards and SHOULD NOT be sold off as though they were. Many people don't know the difference; they read the box, buy it, and then go home to be horribly disappointed. You can't expect everyone to know the differences between the cards, or the technical details.

Whether you guys want to agree with him or not, he makes some valid points... even though some of his complaints are pretty ridiculous.


RE: Wha T Evarrrr
By roadrun777 on 1/4/08, Rating: 0
RE: Wha T Evarrrr
By rdeegvainl on 1/4/08, Rating: 0
RE: Wha T Evarrrr
By Volrath06660 on 1/4/08, Rating: 0
RE: Wha T Evarrrr
By Oobu on 1/5/2008 7:28:56 PM , Rating: 2
I was just using those models as an example; I don't know what they sell anymore. The last time I looked, which has been a while mind you, I remember seeing those 5500s, 9500s, and 9200s. Every single one of them said "High Performance Gaming", and you probably couldn't play any modern games on any of those (or whatever they have now) with decent settings. Even putting your games on the lowest resolution with the quality on very low, they would probably still run like a slideshow. Yet they still put "High Performance Gaming" on the box. Sad. Not too long ago, within the last two years, I saw a PCI card (not AGP, not PCI Express, a plain PCI video card) that said "High Performance Gaming". How many games can even run properly on that? Come on, you may not like what this guy is saying, but he's got a point. I don't completely agree with him, but he's got the right idea.


RE: Wha T Evarrrr
By robinthakur on 1/4/2008 6:06:29 AM , Rating: 2
While Roadrunner is indeed a bitter old troll and should not be encouraged, what I find more amazing is that people buy Crysis, most of them fully knowing that no system, regardless of cost, can run it at high res as it was intended to be played. I also find it odd that the developer released the game before most people had the capability to play it, presumably to claw back some development costs, as the profits to be made in the PC games industry are ever decreasing for all but online subscription games.

Most developers do indeed develop for future hardware, but how far in the future is the question!! Do they really think that in two years' time, or whenever a midrange NVIDIA card will play this game respectably at medium settings, the publisher will still be marketing Crysis as heavily, or that it won't have been superseded by something new?

There can be few worse feelings, as you try out your brand-new tri-SLI quad-core system, than playing Crysis at this point in time. I'm all for conspicuous consumption, but when you probably have to spend £2500 to play ONE game respectably, something is seriously not right with their economic model, and you can understand why console gaming is more pervasive. More than anything, for us customers, this model is most certainly not in our best interests. The fact that this graphical tour de force is an otherwise unremarkable game is also hard to overlook.

I think that what he is basically saying is that, on medium settings or whatever, all games within maybe 3 years should run at a certain level. That would at least give buyers some confidence in "investing" in the graphics card market, which otherwise is just crazily fast-paced at nearly every level apart from the low end, which has always remained fairly stable IMO :)


RE: Wha T Evarrrr
By Rockjock51 on 1/4/2008 11:03:33 AM , Rating: 2
Fairly sure this guy is just posting to make people mad at this point.


RE: Wha T Evarrrr
By Volrath06660 on 1/5/2008 10:05:48 AM , Rating: 2
And that is a completely logical standpoint. The only caveat I would add is that for a GPU to be viable for those three years, it has to have been purchased from the high end of the GPU spectrum. If you buy some mid-level card, the life expectancy of said card is greatly diminished due to its initial lack of horsepower. I have always picked up high-end cards when I build, and with the exception of my first GPU, a GeForce FX 5200, they are all still viable, including that old 9800 Pro. I can still run Warhammer 40K: DoW on mid-range settings at 1280x1024 and am still pulling down over 30 fps according to Fraps. So high-end cards are worth it if you buy them when they first come out.

And while I do pick up high-end cards, I personally see no reason to get SLI, CrossFire, quad-SLI, etc. systems, partly because they do not scale well from the get-go, but also because they lock you into a purchase for longer than you really would want. I will be sticking with single GPUs until they can get good performance scaling.


RE: Wha T Evarrrr
By Oobu on 1/5/2008 7:31:48 PM , Rating: 2
I'd have to agree. I tried SLI twice and I just see no point, except maybe bragging rights. Just buy ONE good high-end card and call it a day. I think SLI, CrossFire, and anything else like them are just gimmicks to get your hard-earned cash.


RE: Wha T Evarrrr
By morton on 1/4/08, Rating: 0
"And boy have we patented it!" -- Steve Jobs, Macworld 2007
