
NVIDIA's D9M makes its first appearance on corporate roadmaps

NVIDIA's newest mid-range processor, codenamed D9M, will make its official debut as the GeForce 9600 GT.

Corporate guidance from NVIDIA states that initial GeForce 9600 GT shipments will come stock with a 650 MHz core clock and a 1625 MHz unified shader clock.  Unlike the G84 core found on the GeForce 8600 GT, D9M will feature a 256-bit memory bus interface.  Coupled with a 900 MHz memory clock, NVIDIA calculates the memory bandwidth at 57.6 GB/s. 
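The 57.6 GB/s figure follows directly from the bus width and the effective (double data rate) memory clock; a quick sketch of the arithmetic:

```python
# Memory bandwidth = bus width (in bytes) x effective memory clock.
# GDDR memory transfers data on both clock edges, so 900 MHz -> 1800 MT/s.
bus_width_bits = 256
effective_clock_mts = 900 * 2  # million transfers per second

bandwidth_gbs = (bus_width_bits / 8) * effective_clock_mts / 1000
print(bandwidth_gbs)  # 57.6
```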

The texture fill rate is estimated at 20.8 billion texels per second.  The company would not indicate how many shaders or stream processors reside on the D9M core. 
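NVIDIA gave no unit counts, but texture fill rate is conventionally core clock times texture units, so the quoted figure implies a count; the arithmetic below is my own inference, not an NVIDIA disclosure:

```python
# Implied texture unit count, assuming the usual convention:
# fill rate (Mtexels/s) = core clock (MHz) x texture units.
core_clock_mhz = 650
fill_rate_mtexels = 20.8 * 1000  # 20.8 billion/s, per NVIDIA's guidance

implied_tmus = fill_rate_mtexels / core_clock_mhz
print(implied_tmus)  # 32.0
```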

Late last year, NVIDIA confirmed the D9 family will use TSMC's 65nm process node.  The company introduced its first 65nm processor shrink in November 2007: the G92.

Other details of the D9M family have already surfaced.  ChileHardware published slides yesterday claiming the GeForce 9600 requires a 400W power supply capable of delivering 26A on the 12V rail.  Unlike previous mid-range GeForce cards, the D9M will require a 6-pin supplementary power connector.

NVIDIA publicly confirmed other details of D9M: DirectX 10.1 support, Shader Model 4.0, OpenGL 2.1 and PCIe 2.0 support just to name a few. 

Further documentation from NVIDIA claims the 9600 GT will also support the Quantum Effects physics processing engine. 

Like all NVIDIA processors, the GeForce 9600 is also HDCP compatible, though final support still depends on vendor implementation. 

NVIDIA declined to comment on the expected price of the GeForce 9600.  A representative for NVIDIA would only say that the performance increase from GeForce 8600 to GeForce 9600 is "almost double."


so what about the rumoured 8800gs?
By RamarC on 1/3/2008 1:40:04 PM , Rating: 3
like this 9600gt, the 8800gs was supposed to slot between the 8600gts and 8800gt.

By RamarC on 1/3/2008 1:47:44 PM , Rating: 2
btw, i'm asking because if the 9600gt debuts in < 60 days with comparable performance, then it seems to me that the 8800gs won't see the light of day.

RE: so what about the rumoured 8800gs?
By ImSpartacus on 1/3/2008 4:49:51 PM , Rating: 2
I've been praying for an 8800gs since the original 8800 lineup was released (gtx, 640 gts, 320 gts, NOTHING, gts, gt, etc). If the 8800gt's prices came down it might fill the gap, but that's not happening.

I think the 9600gt will probably MSRP a little south of the $200 mark to compete with the 3850 (I think the 256mb 8800gt was supposed to sell at $200 but is more like $215-220). However, it will surely rise.

RE: so what about the rumoured 8800gs?
By RamarC on 1/3/2008 8:29:52 PM , Rating: 2
now i'm really confused. hardocp has pics of a supposed 9800 gx2 which seems to be a g92 based 8800gtx x2.

9600gt, 8800gs, 9800gx2... something's screwy, or maybe it's me 8^).

RE: so what about the rumoured 8800gs?
By JackBeQuick on 1/4/2008 9:46:33 AM , Rating: 2
Looks pretty photoshopped to me. G92 supply is already drying up; I think the likelihood of an 8800 GS is almost nil. But then what would Kyle and Fudo write about ....

By ImSpartacus on 1/6/2008 9:53:07 AM , Rating: 2
Yeah, I understand an 8800 GS isn't going to happen, but it was a dream. the 8800 GT pretty much does it all anyway (aside from low end which is covered by this new 9600 GT).

By initialised on 1/10/2008 4:35:14 AM , Rating: 2
nVidia's got a bloody cheek; the Radeon 9600 was a cracking mid-range card a few years back.

By Lord 666 on 1/3/2008 9:34:45 PM , Rating: 1
The word on the street is the 8800GS will be an AGP part specifically aimed at grandmas with old gear but wanting to use Aero ;)

I wonder
By SpaceRanger on 1/3/2008 12:45:58 PM , Rating: 3
If the performance going from an 8600GT to a 9600GT is almost doubled, then this card will be a pleasant addition to their family, as long as it is priced right.

What I'd like to know is how the 9600GT will stack up against the G92 (8800 GT)

RE: I wonder
By TheDiceman on 1/3/2008 12:57:07 PM , Rating: 1
I am sure the MSRP will be reasonable, the bigger question is how high will retailers raise that price should the demand be high.

RE: I wonder
By HaZaRd2K6 on 1/3/2008 1:22:05 PM , Rating: 3
That all depends on if it competes with G92. If it does, then factory-overclocked D9Ms will likely be a lot more popular than stock G92s, in turn driving the prices of factory-overclocked D9Ms up, and stock-clocked G92s down.

The real thing to watch for here is the price. Assuming G92 stays around its current ~$275 price point and D9M comes in anywhere below that (with the performance difference much less than the price difference), Nvidia may be obsoleting its own lineup again.

RE: I wonder
By CvP on 1/3/2008 1:46:07 PM , Rating: 5
9600GT will be very close to 8800GT at lower resolutions.
It will fall further behind at higher resolutions.

They both have more or less the same specs (8800GT, 9600GT):

Memory bandwidth: 57.6 GB/s (both)
Memory bus: 256-bit (both)
Core speed: 600 MHz / 650 MHz
Memory speed: 900 MHz (1800 effective, both)
DirectX / Shader Model: DX 10 / DX 10.1, both SM 4.0

However, the 9600GT's texture fill rate is low (20.8 billion/s) compared to the 8800GT's (33.6 billion/s).

Now, it all depends on the stream processor count, but I doubt the 9600GT can even get close to the 8800GT at higher res.

Street Date?
By Cygni on 1/3/2008 12:45:40 PM , Rating: 2
Does anyone have any info on the street date (and/or price range info) of the D9x series? Are we looking at a mid/late 08 product?

RE: Street Date?
By retrospooty on 1/3/2008 12:48:02 PM , Rating: 2
Most sources are saying February, but it's not confirmed by NV yet. Price will likely be in the $150 range, since the 8800GT is faster and supposed to go as low as $200.

RE: Street Date?
By AmbroseAthan on 1/3/2008 1:14:54 PM , Rating: 2
I thought the D9E was slated for sometime in February/March, while the D9M would be a month or two following D9E.

RE: Street Date?
By HaZaRd2K6 on 1/3/2008 1:24:00 PM , Rating: 2
That would make sense considering Nvidia and AMD both usually release their high-end products before their mid-range and low-end products. I'd like to see some performance benchmarks first, though (and not 3DMark!).

RE: Street Date?
By kamel5547 on 1/3/2008 12:52:25 PM , Rating: 1
Site below says a March-ish production date; I would expect a release date of April/May based on that. Seems fairly plausible.

so where does this leave all of us 8800gt owners?
By vexingv on 1/3/08, Rating: 0
By Polynikes on 1/3/2008 2:07:52 PM , Rating: 2
I won a card game and pocketed $300 at Thanksgiving (from my girlfriend's family, no less) and was seriously considering buying the 8800GT, but I had a feeling there would be some even better stuff coming in the 1st half of '08, so I decided to make that the base of my system refresh fund, which I have been adding bits of money to. New video card, new quad core. Can't wait.

By bohhad on 1/3/2008 2:42:40 PM , Rating: 2
i wouldn't say you got shafted, more like you're just a victim of the vicious development cycle graphics cards go through. it happens once or twice every year: the geforce 8s left geforce 7 owners feeling screwed, the 7s left the 6xxx series owners feeling screwed... you could wait forever and never buy anything because the latest and greatest is only a couple months away. all tech is like this; graphics cards are just way worse. i just buy what i can afford and live with it, knowing that something twice as fast at half the cost is on its way

By PitViper007 on 1/3/2008 2:54:04 PM , Rating: 2
And that's why I only buy a new card every couple of series. Case in point, I am now running a 6800. I'm at the point now where I'm ready to upgrade, and the 9600GT looks like it might be the one for me. Or if it takes too long coming out, I'll bite on the 8800GT.

By Lightning III on 1/3/2008 4:30:21 PM , Rating: 2

Shader 4.0 or 4.1 support?
By wingless on 1/3/2008 1:37:51 PM , Rating: 3
I thought the DX10.1 shader model was Shader 4.1. WTF?

RE: Shader 4.0 or 4.1 support?
By mars777 on 1/6/2008 4:03:44 AM , Rating: 2
Why would a card have to support Shader 4.1 to be DX 10.1 compliant? AFAIK Shader 4.1 is only a spec allowed by DX 10.1, and since shaders don't influence cap bits (which are what was removed in DX 10.1) there is no need to use Shader 4.1 if you don't want to. Why do n+1 loop iterations for something that can be done with n loops... basically Shader 4.1 is a looser spec than Shader 4.0 (not a stricter one).

I think DX 10.1 allows for Shader 4.1, but does not enforce it.

Radeon 9600?
By jonmcc33 on 1/3/2008 11:45:36 PM , Rating: 3
Um, please nVIDIA, come up with a new naming convention. If memory serves, ATi's 9X00 series were the ones that slapped nVIDIA in the face. Do they REALLY want to be using those numbers still?

By RU482 on 1/4/2008 9:37:08 AM , Rating: 3
Even though there are only two competitors, it's like there are 3 or 4 since they keep competing with themselves.

By mindless1 on 1/4/2008 10:59:34 PM , Rating: 2
Up to twice as fast sounds really good, but historically this tends to mean the largest performance gains are in the newer DX features, not in the raw performance needed to keep good framerates when you turn down the eye candy enough to make a new game playable on what will probably end up as a ~$150 part (dropping to $75 after rebate over 6-12 months).

It's good news that average midrange performance goes up a bit every generation, but I think its being 2X as fast as the 8600GT has a little to do with the 8600GT being a smaller performance increase over the generation it replaced than preceding generations were (with the exception of the worse FX5xxx series).

Even with this good news, common sense tells us it isn't going to be a replacement for a higher-end card. nVidia isn't often going to kill its high end like it did with the 8800GT; they just couldn't put off getting this tech into the market until it had depreciated more, and as always they have more parts coming.

Gamers are SO stingy though: they have a multi-hundred-dollar system and are (supposedly, unless they're pirating the games) paying $30-plus per game, then will wait weeks or spend hours trying to get a product with only a few dozen dollars' difference in price. If paying $50 more makes that much difference, should you really be gaming? It's not that I don't like good value, and even getting something for nothing is a novel quest, but if you gamed a dozen hours you had the spare time; you could have just worked at McDonald's/etc for that period and bought a higher-tiered card instead.

And the 9800:
By Clauzii on 1/7/2008 6:00:24 PM , Rating: 2

Funny how nVidia uses old ATI names now.

By rupaniii on 1/8/2008 10:41:11 AM , Rating: 2
So their whizbang new chip has all of those features that supposedly weren't important when they were in AMD's new chips.
Go figure!!

By andhesaid on 1/3/08, Rating: 0
By Crazygeek on 1/3/08, Rating: 0
By nosfe on 1/3/08, Rating: -1
RE: finally
By ChronoReverse on 1/3/2008 12:49:43 PM , Rating: 5
Did you completely miss the 3850 mid-range card that has a 256bit memory interface?

RE: finally
By mholler on 1/3/2008 12:54:55 PM , Rating: 5
Took the words right out of my keyboard.

RE: finally
By Noya on 1/3/08, Rating: -1
RE: finally
By Duwelon on 1/3/2008 1:20:41 PM , Rating: 2
You're an AMDist.

RE: finally
By gtrinku on 1/3/2008 4:55:08 PM , Rating: 5
Wouldn't that be an antiAMDite?

RE: finally
By Visual on 1/4/2008 4:52:03 AM , Rating: 2

RE: finally
By ChronoReverse on 1/3/2008 2:01:04 PM , Rating: 4
Well, I purely look at the merits of each card, and the 3850 just so happens to be a worthy card even if most of its brethren aren't. Drivers in single-card mode work fine too.

Frankly, AMD's drivers have been solid since the launch of Vista, with speed almost on par with XP. Nvidia really screwed Vista over by giving the impression that Vista is significantly slower in games than XP. At least nowadays both sides have drivers on par with XP, but the damage has already been done.

RE: finally
By beepandbop on 1/4/2008 11:59:25 AM , Rating: 2
That's really ignorant. AMD is lagging in the processor area, but for once AMD/ATi put out midrange/high-performing cards that compete at the midrange level--which is where most of the market is. So you could have labeled yourself a complete moron and saved us the time of reading that worthless comment.

RE: finally
By Kamgusta on 1/3/2008 4:07:16 PM , Rating: 2
I find it quite funny to call the 3850 "mid-range".

It shares the same architecture as the 3870 (which I think should be called "very-high-end"), just with lower clocks.

Also, how should we call 29x0s? "mid-low-range"? 26x0s? "low-range"? And 24x0s? "very-low-range"?

3850 and 3870 are high-end parts. If DAAMIT engaged a price war with NVIDIA and wants to sell them at discount prices, this doesn't change the situation.

Otherwise, I should also call an ATI X1950XT "very-low-end" 'cause I can pick one for 119 bucks @ newegg...

Performance - ATI Radeon™ HD 2900
Mainstream - ATI Radeon™ HD 2600
Value - ATI Radeon™ HD 2400

RE: finally
By Lightning III on 1/3/2008 4:24:26 PM , Rating: 1
if it starts with a 2 it's called previous generation, stu boy

RE: finally
By Kamgusta on 1/3/2008 5:06:54 PM , Rating: 3
Didn't you notice the HD3400s/HD3600s are still to be launched? And that the "previous generation" HD2900 PRO was launched just in October 2007 (2 months ago)?

RE: finally
By Shining Arcanine on 1/5/2008 11:36:06 AM , Rating: 2
That is 3 months ago.

RE: finally
By ImSpartacus on 1/3/2008 4:42:51 PM , Rating: 5
The 3870 and 3850 are not high end at all. Look at the price: they are competing at the 8800gt's low end. They basically replaced most of the 2x00 series.

Comparing the 2900xt to the 3870 is like comparing the 8800gt to the 8800gtx: you get a hair more performance for a huge price jump.

And the 3850 pretty much covers the rest of AMD's mid range.

Don't get me wrong, I'm no AMD fanboy; I would get an 8800gt if they were cheaper (and in stock), but deals like a 512mb 3850 selling cheaper than the 256mb 8800gt ($200 < $215) can't be passed up.

RE: finally
By Belard on 1/4/2008 12:07:09 AM , Rating: 2
Well... they are HIGH end, just as the $300 8800GT ended up being a HIGH-END part since it's comparable to the 8800GTX... and there isn't anything much faster, other than the $700 Ultra, which isn't worth it.

Looking at another major site with a graph, the HD3870 slides in right under the 8800GT. So the TOP 5 cards are:
1 - 8800Ultra ($700~800 - avg $700)
2 - 8800GTX ($470~600 - avg $500)
3 - 8800GTS-512 ($330~$380) (not the same as the original GTS 640/320)
4 - 8800GT ($265~320 - avg $300)
5 - HD 3870 ($240~280 - avg $250)

The performance delta of the 3870~8800Ultra is not that much.

The 9600 doesn't sound so hot... they put it at DOUBLE the 8600? Hmm... the 8800GT/3870 are already more than twice as fast as the 8600GTS! Now if the 9600 sells for under $125 then it may be worthwhile. Note: the $180 3850 is almost twice as fast as the 8600GTS.

Oh well... who knows...

RE: finally
By suryad on 1/4/08, Rating: 0
RE: finally
By ImSpartacus on 1/6/2008 9:16:10 AM , Rating: 2
Any purchase can be rationalized as "worth it" if you have the funding and the "need", but for the majority of us who have neither, it is a bad decision.

RE: finally
By ImSpartacus on 1/6/2008 9:43:33 AM , Rating: 2
Now that may be a single card set up, however AMD's current strategy is to eliminate the ultra high end (gtx/ultra, etc).

AMD's high end is cross-fire 3870's. Why do you think the 38*0's are so damn good at scaling multiple graphics cards AND they have extremely low power requirements?

It just makes sense that AMD will begin to endorse lots of multiple card setups (quad x-fire?).

I personally think it's a novel idea, but right now I will stick with a single card setup (that's why I was eying an 8800gt).

I would set up a more relevant list based on the manufacturers' positioning:

1. SLI 8800 Ultra
2. SLI 8800 GTX
3. SLI 8800 GTS (512mb) / 8800 Ultra
4. SLI 8800 GT / CF 3870
5. 8800 GTX
6. 8800 GTS (512 mb) / CF 3850
7. 8800 GT
8. 3870
9. 3850

That doesn't count quad CF or 3-way SLI. It is mostly made up of my conjectures, your previous list, and my previous reading on the net (COUGH*ANANDTECH*). I'm likely wrong somewhere, but it gives you an idea.

Good reading: 38*0 8800 GT

RE: finally
By Alpha4 on 1/3/2008 9:07:02 PM , Rating: 1

RE: finally
By otispunkmeyer on 1/4/2008 6:07:10 AM , Rating: 2
yeah, but those... I dunno, they don't seem too mainstream to me; they're still fairly high end.

plus the 3850 is gonna have an 8800GS to contend with now: 192-bit memory, 96 shaders...

i think what he meant was: in that class of cards where the 8600 and 2600 sit, there is no 256-bit card. and when the top products are using 384-bit and 512-bit, you would have thought there really should have been a bit more progress in that segment. and finally there is.

RE: finally
By AggressorPrime on 1/3/2008 1:18:34 PM , Rating: 2
You are so right. Memory bandwidth is extremely important now that we have games like Crysis that force you to limit the resolution; memory bandwidth allows for greater resolutions. I just hope they push 512MB versions as well, since you also need 512MB for the XHD resolutions.
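As a rough sanity check on the 512MB claim (my own back-of-envelope, not from the article), just the color, depth, and front buffers at a 2560x1600 "XHD" resolution with 4x AA already eat well over half of a 256MB card before a single texture is loaded:

```python
# Illustrative framebuffer estimate for 2560x1600 with 4x MSAA.
# (Real drivers allocate differently; this is only a ballpark.)
width, height = 2560, 1600
bytes_per_pixel = 4   # 32-bit color, or 24-bit depth + 8-bit stencil
aa_samples = 4

color = width * height * bytes_per_pixel * aa_samples   # multisampled color
depth = width * height * bytes_per_pixel * aa_samples   # multisampled depth
front = width * height * bytes_per_pixel                # resolved front buffer

total_mb = (color + depth + front) / (1024 ** 2)
print(round(total_mb))  # 141
```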

RE: finally
By HaZaRd2K6 on 1/3/2008 1:26:36 PM , Rating: 5
Why you think you'll be able to play Crysis at XHD resolutions with settings cranked on a lower-mid-range card is beyond me.

It's just not going to happen that way. If you buy a mid-range card, the manufacturers assume you have a mid-range system with a mid-range monitor and don't plan on running Crysis on a 24" screen.

RE: finally
By finelemon on 1/3/08, Rating: 0
RE: finally
By retrospooty on 1/3/2008 3:51:53 PM , Rating: 2
that doesn't change the fact that you can't run crysis at high settings on this card, or the next-gen mid range either for that matter, even on a high-end system.

RE: finally
By finelemon on 1/3/2008 8:33:42 PM , Rating: 3
Why would you expect to be able to? No, there is no one in the pipeline who is in charge of making sure games will run well on current hardware. The game makers will always push for that little bit more than current hardware can do and hardware will always move along at its own pace regardless of what games would 'like'.

If you want to have someone to care about making sure that a game plays well on your hardware NOW then get a console.

RE: finally
By 1078feba on 1/4/2008 9:17:35 AM , Rating: 3
Provocative argument.

Maybe I'm displaying a bit too much ignorance here, but I really want to know.

If the highest-end rigs around, with QX9650's OC'd to 3.6 and dual Ultras on H2O and an NF780i at 1600 FSB, can't run Crysis on a 26-30 inch monitor with all high settings at more than 20-30 FPS, how the hell did Crytek actually develop the game? What did they run it on? A Cray? I mean, how did they actually know that it would look spectacular if they were only able to run it at 800x600?

RE: finally
By suryad on 1/4/08, Rating: 0
RE: finally
By finelemon on 1/5/2008 1:38:12 AM , Rating: 2
They use PCs. They are no different from the gamers themselves. They would alternate between a lower resolution when they wanted a high frame rate, or put up with a low frame rate to see it in high-res. E.g. no one in the world has seen Crysis running at 2048x1024 at 100FPS.

RE: finally
By retrospooty on 1/5/2008 1:42:43 PM , Rating: 3
"If the highest end rigs around, with QX9650's OC'd to 3.6 and dual ultras on H2O and an NF780i at 1600 FSB can't run Crysis on a 26-30 inch monitor with all high settings at more than 20-30 FPS, how the hell did Crytek actually develop the game?"

The answer is that it was not tested on a platform that can run it at playable speeds at the highest settings, since that platform does not yet exist. It's a good thing to add higher settings for future systems; I wish more games did that.

With all that said, if you ask me, Crysis is not well optimized at all. Q4 and UT3 run very well at the highest settings with 4xAA enabled. Crysis certainly looks better than those games, but not THAT much better. It looks maybe 50% better and runs 300% slower. The juice isn't worth the squeeze.

RE: finally
By Lightning III on 1/3/08, Rating: -1
RE: finally
By Rockjock51 on 1/3/08, Rating: 0
RE: finally
By roadrun777 on 1/3/2008 6:21:20 PM , Rating: 2
I am just curious about the green weenies...
Where do you buy those?!?

RE: finally
By onwisconsin on 1/3/2008 8:43:11 PM , Rating: 3
Whatever it is, you can find it on eBay ;)

RE: finally
By Haltech on 1/3/2008 7:23:26 PM , Rating: 3
Is it me, or can anyone actually understand his last paragraph?

RE: finally
By PLaYaHaTeD on 1/3/2008 7:33:53 PM , Rating: 2
Lol, i literally couldn't control my laughter after i read your post, then read the last paragraph. So funny.

RE: finally
By ImSpartacus on 1/6/2008 9:46:32 AM , Rating: 2
I kinda feel bad for that guy. That's some terrible grammar.

RE: finally
By roadrun777 on 1/3/2008 7:34:54 PM , Rating: 3
Um, he is saying that he is angry about the heat issues and the lack of performance like the rest of us. Then he says that he tried to go green (green weenies?) and got angry with the performance and gave the video card (or machine in this case) back to whomever he got it from.

I personally have a 2 8800GT cards in SLI and I duct taped a broom stick to my computer so I can use it as a hair dryer in the morning, that way I am saving energy by not running two hair dryers at the same time.

RE: finally
By ShadowZERO on 1/4/2008 2:05:51 PM , Rating: 2
What happen? Someone set up us the bomb. All your base are belong to us!!! You have no chance to survive, make your time. ha ha ha!

Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By marsbound2024 on 1/3/08, Rating: 0
RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By munky on 1/3/2008 4:06:01 PM , Rating: 2
So you expect to have a mainstream card that plays Crysis at full settings with no less than 30fps? If you wait for about 3-4 years, you might get your wish. Games keep increasing in visual complexity every year, and when you have a game that renders over a million polygons per frame, along with complex shaders, it's gonna kill any mainstream card, regardless of what resolution you use.

RE: Wha T Evarrrr
By johnadams on 1/4/2008 5:29:23 AM , Rating: 1
Hahaha watching you two go at each other is entertaining!

RE: Wha T Evarrrr
By johnadams on 1/4/2008 5:29:46 AM , Rating: 2
I meant "reading".

RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By ZiggyDeath on 1/3/2008 2:47:23 PM , Rating: 2
Go buy a console; you'll get your "30fps" in all games. Your vague and over-encompassing views would dictate that a video card should be able to run all games at a minimum of 30fps without specifying resolution (other than 10x7), AA, AS, HDR, and other processing-intensive eye candy. Therefore the only thing that could possibly get close to what you're asking for is a console.

RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By ZiggyDeath on 1/3/2008 3:31:59 PM , Rating: 5
I take offense to the "real gamer" comment, especially since I am a computer gamer. When Counterstrike was still in its beta I would rack up over 16 hours a day during the summer. When WoW came out I was in an end-game raiding guild for over 2 years, racking up over 20 hours of raiding a week, not to mention the regular grind. And while I have diverged from strictly PC gaming into the console dominion, there has been a trend for years in the cat and mouse played between video cards and games.

When Doom was released it could be played at the lowest settings by many cards, cards which were already years old: the sacrifice was image quality. But even high-end cards had difficulty with the highest settings until the next generation came to market.

Now we're seeing this again with Crysis. I plunked down the money for a 7900GT when it had just hit the market. That card is just reaching its second birthday and it can barely handle Crysis at Medium settings at 12x10. That was considered the lowest of the high-end cards (before the GS hit the market) that Nvidia had to offer. What game ate that card alive? Oblivion. It was impossible to turn everything up to the max even at a paltry 12x10. Did I complain? No. Was I disappointed? Yes. But I had been gaming for years and I knew that there are always sacrifices to be made. That, or hardware isn't keeping pace with software.

Now you're complaining about not being able to do the native rez of a 20-inch flat panel. Well sorry, how about I complain about not being able to do the default resolution of a 20-inch CRT, 1600x1200, back 10 years ago? Where's the justice in that? WAH WAH, my games can't play at 16x12 on a video card over 7 years old. No sir, that makes no sense.

While I would agree that it's purpose-defeating to run a game at anything but the native resolution of an LCD panel due to the drop in image quality, the fact is you bought into new monitor technology and you should be aware of its limitations.

Also, if you had done any research at all, you'd find that even 8800GTX/Ultras in triple SLI still get wrecked by Crysis, so I don't see why you're crying about a new mainstream card failing. Sorry, but the jump from the G80 to the G90 isn't the same as the FX to 6x00 series jump; the last leap like that was from the 6800 to the 7600.

Marketing fanboy, heh, that's the first time I've been called that. But in any case, what you're asking for is literally saying "I want my N64 to be able to play GC games"

RE: Wha T Evarrrr
By mdogs444 on 1/3/2008 3:35:10 PM , Rating: 2
When Counterstrike was still in it's beta I would rack up over 16 hours a day during the summer. When WoW came out I was in a end game raiding guild for over 2 years, racking up over 20 hours of raiding a week not to mention the regular grind.

Try not to take offense, I'm just trying to be comical... but I honestly have no idea what the heck you just said, outside of the fact that you have way too much free time.

RE: Wha T Evarrrr
By ZiggyDeath on 1/3/2008 4:12:51 PM , Rating: 2
I actually do have too much free time!

RE: Wha T Evarrrr
By Crystalysis on 1/10/2008 2:17:52 PM , Rating: 2
Well, it's either video games or nuclear underwater basket weaving lessons. The only two activities that are known to exist.

Me? I like both. :)

RE: Wha T Evarrrr
By FITCamaro on 1/3/2008 8:44:49 PM , Rating: 2
. When Counterstrike was still in it's beta I would rack up over 16 hours a day during the summer. When WoW came out I was in a end game raiding guild for over 2 years, racking up over 20 hours of raiding a week not to mention the regular grind.

This is not something to be proud of and brag about.

RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By Volrath06660 on 1/5/2008 10:08:14 AM , Rating: 2
Agreed sir.

RE: Wha T Evarrrr
By Villains on 1/3/2008 2:59:11 PM , Rating: 2
You sound like some pissed-off senior citizen who can't afford to upgrade and wants to play Crysis on your 6600. You make zero sense.

RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By Holly on 1/3/2008 4:30:55 PM , Rating: 2
And Jan Werich said:
"The worst crash in the life is the crash with an idiot."

RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By lambofgode3x on 1/3/2008 4:34:58 PM , Rating: 2
you are an absolute idiot. just reading the crap that you write makes me want to kill myself. how can someone be so damn dumb? you have a balls-slow, old-ass computer and you expect to play recent games at native resolutions?? only if your resolution is literally 3 by 2. other than that, it's not going to happen. i have a 2900 pro video card and even though it flies through games, it does have trouble in some games. here's a hint for you, you lame-brain jackass: HARDWARE CANNOT KEEP UP WITH SOFTWARE. there is no modern gpu that can run crysis at the moment. it happens every time. so now i have an idea for you. since you obviously have the mentality of a retarded 2 year old, why don't you call your internet service provider and tell them to cancel your account. you moron

RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By Volrath06660 on 1/3/08, Rating: 0
RE: Wha T Evarrrr
By roadrun777 on 1/4/08, Rating: -1
RE: Wha T Evarrrr
By madoka on 1/3/2008 7:07:09 PM , Rating: 2
Tomorrow roadrun777 goes to car forums and demands his 1998 Geo Metro Sport be able to do 0-60 in 3.5 or else there was false advertising and we're all suckers for buying new cars with stagnant technology.

RE: Wha T Evarrrr
By roadrun777 on 1/3/08, Rating: -1
RE: Wha T Evarrrr
By mindless1 on 1/4/2008 10:41:48 PM , Rating: 2
Yes, there is an "in between": an occasional gamer who, because of this, doesn't feel like paying for a higher-mid to high-end card but can still game at 30FPS.

Can they do it on the most demanding new game with high levels of eyecandy? Usually not, but that doesn't mean they can't enjoy many last generation games. You don't HAVE to crank up eyecandy and play above 1280x1024 to enjoy something, and my old 1280x monitor plays plenty of games staying above 30FPS "almost" always with a reused 7600GT video card. For that matter, back when Half Life 2 came out I tried it on an overclocked nForce2 IGP chipset. It ran in DX7 mode and I could see the difference, but I decided to try playing the game on that system and it was fun enough I played through the game again.

Unfortunately that is where many benchmarks fail: in advising customers how the low-mid to higher-midrange cards stack up, not at the exact same eye candy and resolution as the higher-end cards, but rather in what concessions would need to be made to game at that 30FPS minimum. Actually, a better index would be whether it doesn't drop below 20FPS more than 5% of the time and stays above 40FPS on average.

If you want to have it all, just pay more for a different card. It's that simple. nVidia and ATI are going to market such cards to gamers even if they don't meet your standard of gaming because, truth be told, they can play "some" games, and there would be little point to them without gaming, since integrated video will do practically all non-gaming 2D tasks outside of some HD video decoding.
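The minimum-framerate index proposed above is easy to state precisely; this sketch is purely my own illustration of the poster's criterion, with made-up frame data:

```python
# A card "passes" a game under the proposed index when frames below
# 20 FPS occur at most 5% of the time and the average stays above 40 FPS.
def passes_index(fps_samples, floor=20.0, floor_share=0.05, avg_target=40.0):
    below = sum(1 for f in fps_samples if f < floor)
    avg = sum(fps_samples) / len(fps_samples)
    return below / len(fps_samples) <= floor_share and avg > avg_target

print(passes_index([60, 55, 48, 42, 35, 58, 61, 44, 50, 47]))  # True
```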

RE: Wha T Evarrrr
By MBlueD on 1/9/2008 5:21:47 AM , Rating: 1
While it does seem that you're a troll on a mission, I think it would be fun to participate :)

My point is that there is no "average" gamer. Either you get fluid motion in a game or you do not.

Another poster mentioned a valid point: 'fluid motion' by itself is an ambiguous term. I can't get fluid motion running Crysis on my 1680x1050 22" LCD on high settings. That is true. But I can get very fluid motion running Pro Evolution Soccer 2008 with everything on high, and Pro Evo also runs with fluid motion on the same LCD on a medium-range card. So if I only care to play Pro Evo, a medium-range card is probably going to serve me well, especially considering the price difference relative to a high-end card. See? Here's a use for midrange cards.
Now to the definition of 'average gamer': a gamer who only cares to play Pro Evo, for example, can be described as an average gamer. I, for example, want to play every (good) RPG or FPS that comes out, so I don't consider myself an 'average' gamer; when I buy a card, I don't consider midrange unless the price of high-end cards is beyond my reach. Average gamers are defined by how much they care about gaming and the quality of their gaming experience. There probably are people willing to play Crysis at all-low settings at a resolution lower than their LCD's native resolution: these are 'average' gamers (or enthusiast gamers without the budget).

It basically means you will get a slide show of the game you thought you were going to play

If you want to buy a card to play a particular game, it is your responsibility to do some research on what CAN run that game. If you go and buy the first box that says 'High performance gaming' and expect it to play your game, it's your problem. As much as I do hate marketing terms, it's not their responsibility if you 'mis-interpret' their claims; I'm pretty sure they can find an old game where their card will run at very high fps. They will call that 'high performance gaming' and prove their claim true. Even if they claimed you could play Crysis at high frame rates, once challenged, they would run it at (e.g.) 640x480 with all settings at their lowest.

My point is not to defend marketing claims - it's just that it's our responsibility as consumers to do our homework.

My solution is to implement "minimum experience ratings", which were suggested many years ago as a way to gauge a card's performance. I still think that is the way to go.

That'd be great!

And I do think cars are stagnant. I mean, combustion engines? Come on... The technology is so old that the paper the patent was written on has turned to dust already, if that helps your small monkey mind.

Innovation is not something that comes when one wants it. When innovation comes, it comes. Also, please note that car manufacturers (and graphics card manufacturers) are not research companies - they are businesses, and thus business (profit) is at the top of their list. You can't demand innovation from someone - unless you are rich enough to fund an R&D operation to develop an alternative to a technology you are tired of, and even then you run a significant risk of spending ALL your money and still coming out with nothing useful.
A technology does not only need to pay for itself - it also needs to pay for failed attempts. Again, these companies are there to make a profit for themselves first and foremost.

I do think Graphic Cards changed greatly in the last five years. Please take into consideration that the absence of 'visible' innovations doesn't mean no innovations happened at all. You make it sound like you really expected a jump to quantum processing technology for the GPUs, and perhaps anti-gravity for air and land craft, all in 5 years!

RE: Wha T Evarrrr
By IGoodwin on 1/10/2008 5:42:54 PM , Rating: 1
To determine that something is false, you have to have a statement, and conditions against which to measure it. This kind of linguistic gymnastics is something better left to advertisements, or lawyers.

Certainly, to use the car analogy, a 1998 Geo Metro Sport can do 0-60 in under 3.5 seconds! If I found a cliff high enough, it could do the quarter mile pretty quickly too.

Conversely, placing four heavy people in the car would slow it, making any performance claim false unless the correct conditions are applied and the appropriate professional driver employed.

Therefore, claiming a minimum specification, like 30 fps, is pointless, as it cannot be guaranteed by the performance of the graphics card alone. The processor, motherboard, memory, and display all alter the dynamic, not to mention the operating system, drivers, and the game itself.

The graphics card makers do state their performance in terms that cannot be affected by external factors, and in that way they are not producing false advertising; whereas claiming 30 fps in all games would be false.

RE: Wha T Evarrrr
By Lightnix on 1/3/2008 10:20:49 PM , Rating: 2
Wow, I read this and signed up for an account.

This post is ridiculous.

You will be able to play most games, including Crysis at minimum settings, at most panel resolutions with this card... but that's not my point. There is such a thing as a casual gamer: a person willing to turn down the eye candy for an hour or two of gameplay every now and then, just for fun. They don't have to turn every setting up to maximum; it's just a bit of a jolly. Maybe somebody who was told Portal was an awesome game by their friends, so they thought they'd try it.

Some people even have these funny old monitors called 'CRTs', on which reducing the resolution is barely noticeable. If you want to play at bigger, badder resolutions with all the eye candy turned up, then you're gonna have to shell out a bit for a higher-end card.

As for the console resolution thing I saw somewhere else in this comment section, the two consoles with comparably current-generation hardware (PS3 and Xbox 360) can render games at up to 1080p, or correctly stated, 1920x1080. They can also do 720p, 1280x720, and even bog-standard TV resolution (which differs depending on whether you're an American or an anyone else). Before anyone says it: yes, the 360 can do 1080p when games are coded for it, and yes, the newer models do in fact have real HDMI ports.

By the way, take a card that is roughly 5 years old, for example the Radeon 9800. The GeForce 6600 GT was about twice as fast as that, and the 7600 GT more than twice the speed of the 6600 GT. The 8600 GT was a bit of a disappointment and only matched the 7600 GT, but the 9600 GT has specs that make it roughly twice as powerful as the 8600 GTS.

That's all.

RE: Wha T Evarrrr
By Oobu on 1/4/2008 2:41:37 AM , Rating: 2
I don't agree with everything roadrun777 said, but I do agree that companies shouldn't be able to get away with selling cards like the ATI 9200SE or the nVidia 5500 at places like Wal-Mart in boxes that quite clearly say "High Performance Gaming" on the side. I guess it's all relative to the gamer's opinion, but in my opinion those cards are in no way "high performance gaming" cards and SHOULD NOT be sold off as though they were. Many people don't know the difference: they read the box, buy it, and then go home to be horribly disappointed. You can't expect everyone to know the differences between the cards, or the technical details.

Whether you guys want to agree with him or not, he makes some valid points... even though some of his complaints are pretty ridiculous.

RE: Wha T Evarrrr
By roadrun777 on 1/4/08, Rating: 0
RE: Wha T Evarrrr
By rdeegvainl on 1/4/08, Rating: 0
RE: Wha T Evarrrr
By Volrath06660 on 1/4/08, Rating: 0
RE: Wha T Evarrrr
By Oobu on 1/5/2008 7:28:56 PM , Rating: 2
I was just using those models as an example; I don't know what they sell anymore. The last time I looked, which has been a while mind you, I remember seeing those 5500s, 9500s, and 9200s. Every single one of them said "High Performance Gaming", and you probably couldn't play any modern games on any of those (or whatever they have now) with decent settings. Even putting your games at the lowest resolution with the quality on very low, they would probably still run like a slideshow. Yet they still put "High Performance Gaming" on the box. Sad. Within the last two years I saw a PCI video card (not AGP, not PCI Express) that said "High Performance Gaming". How many games can even run properly on that? Come on, you may not like what this guy is saying, but he's got a point. I don't completely agree with him, but he's got the right idea.

RE: Wha T Evarrrr
By robinthakur on 1/4/2008 6:06:29 AM , Rating: 2
While roadrun777 is indeed a bitter old troll, and should not be encouraged, what I find more amazing is that people buy Crysis, most of them fully knowing that no system, regardless of cost, can run it in high res as it was intended to be played. I also find it odd that the developer released the game before most people had the capability to play it, presumably to claw back some development costs, as the profits to be made in the PC games industry are ever decreasing for all but online subscription games.

Most developers do indeed develop for future hardware, but how far in the future is the question!! Do they really think that in two years' time, or whenever a midrange Nvidia card can play this game respectably at medium settings, the publisher will still be marketing Crysis as heavily, or that it won't have been superseded by something new?

There can be few worse feelings than trying out your brand new tri-SLI quad-core system by playing Crysis at this point in time. I'm all for conspicuous consumption, but when you probably have to spend £2500 to play ONE game respectably, something is seriously not right with their economic model, and you can understand why console gaming is more pervasive. More than anything, for us the customers, this model is most certainly not in your best interests. The fact that this graphical tour de force is an otherwise unremarkable game isn't easy to overlook either.

I think that what he is basically saying is that on medium settings or whatever, all games within maybe 3 years should run at a certain level. That would at least give buyers some confidence in 'investing' in the graphics card market which otherwise is just crazily fast paced at nearly every level apart from the low end which has always remained fairly stable IMO :)

RE: Wha T Evarrrr
By Rockjock51 on 1/4/2008 11:03:33 AM , Rating: 2
Fairly sure this guy is just posting to make people mad at this point.

RE: Wha T Evarrrr
By Volrath06660 on 1/5/2008 10:05:48 AM , Rating: 2
And that is a completely logical standpoint. The only caveat I would add is that for a GPU to be viable for three years, it has to have been purchased from the high end of the GPU spectrum. If you buy some mid-level card, its life expectancy is greatly diminished due to its initial lack of horsepower. I have always picked up high-end cards when I build, and with the exception of my first GPU, a GeForce FX 5200, they are all still viable, including that old 9800 Pro. I can still run Warhammer 40K: DOW on mid-range settings at a resolution of 1280x1024 and am still pulling down over 30 fps according to Fraps. So high-end cards are worth it if you buy them when they first come out.

And while I do pick up high-end cards, I personally see no reason to get the SLI, Crossfire, Quad SLI, etc. systems, because they don't scale well from the get-go, and also because they lock you into a purchase for longer than you really would want. I will be sticking with single GPUs until they can get good performance scaling.

RE: Wha T Evarrrr
By Oobu on 1/5/2008 7:31:48 PM , Rating: 2
I'd have to agree. I tried SLI twice and I just see no point; maybe for bragging rights? Just buy ONE good high-end card and call it a day. I think SLI, Crossfire, and anything else like them are just gimmicks to get your hard-earned cash.

RE: Wha T Evarrrr
By morton on 1/4/08, Rating: 0