The Geforce GTX 590 is a beastly dual-GPU card.  (Source: Anandtech)

Clearly it's a dedicated response to AMD's dual-GPU Radeon HD 6990. Yet, it fails to surpass its competitor, on average, in benchmark performance.  (Source: Anandtech)
Card is obviously a direct response to AMD's Radeon 6990

AMD's dual-GPU Radeon HD 6990 card was a bit of a preposterous proposition for consumers.  It was hot, it was noisy, and it was expensive ($700).  And it offered users less bang for their buck than two individual Radeon HD 6970s.  Still, it was incredibly powerful and gave AMD bragging rights to the title of most powerful single-card solution.

NVIDIA wasn't about to let AMD walk away with this useless prize.  So it went and released its own equally over-the-top dual-GPU supercard.

That card is the GeForce GTX 590 and it broke cover today.

I. Specs

The GTX 590 is built on the Fermi architecture and manufactured on a 40 nm process.  It has a pair of GPU chips, each with 512 stream processors.  Those 1024 hungry processing units are paired with 3 GB of GDDR5 memory.

The core is clocked at 607 MHz, the shaders at 1214 MHz, and the memory at 853 MHz.  The two GPU chips contain 6 billion transistors in total and are manufactured by Taiwan's TSMC.

The card retails for a whopping $700 USD, about $60 more than a pair of GeForce GTX 570s.  And the price of special performance models climbs even higher, to $730 USD or more.

II. Performance

The GTX 590 inherits the proud lineage of the dual-GPU GTX 295.  According to AnandTech it earns a virtual draw with the Radeon HD 6990, coming in just 1 percent shy of that card's mean performance.

Ultimately, what that means is that the two competitors split the field.  In benchmarks of Civilization V, DiRT 2, Mass Effect 2, and HAWX, NVIDIA's card dominates, according to AnandTech.  AMD dominates Crysis, BattleForge, and especially STALKER: Call of Pripyat.

This tie is somewhat tempered by the fact that AMD's wins are of higher quality: it delivers better framerates in games that tend to run slowly (like Crysis), whereas NVIDIA posts absurdly high framerates in games that already run fast (like HAWX).

So perhaps AMD still can claim to hold the performance crown by a narrow margin.

Ultimately, though, the worst thing about NVIDIA's performance is the same as the worst thing about AMD's -- the card is beaten by a pair of cheaper single-GPU cards running in tandem.  Namely, a pair of GeForce GTX 570s in SLI appears faster than the GTX 590 in most benchmarks, though the pair falls to the GTX 590's overclocked variant.

Tom's Hardware and HotHardware offer similar conclusions.

III. Noise and Heat

You may recall the AMD card was "relatively cool" by this generation's standards, reaching up to 89 degrees Celsius under load (to be fair, this is "hot" in human terms -- hot enough to cook an egg).  In gaming the AMD card ran cooler than NVIDIA's, but in synthetic benchmarks the NVIDIA card ran cooler, suggesting it might stay cooler when used for GPU computing.

The NVIDIA card edges out AMD's offering in noise by a more substantial margin.  While the Radeon 6990's stock 65 decibels pains the ears, the GTX 590 manages to operate at 57.9 decibels under load.  Even the overclocked version only produces 63.1 decibels.  This is a bigger difference than it might seem, as the decibel is a logarithmic measure.
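
As a back-of-the-envelope illustration (assuming both figures are sound intensity levels measured under identical conditions), the 7.1-decibel gap works out to:

\[ \frac{I_{6990}}{I_{590}} = 10^{(65 - 57.9)/10} = 10^{0.71} \approx 5.1 \]

In other words, the stock Radeon HD 6990 radiates roughly five times the acoustic intensity of the stock GTX 590, even though the raw numbers look close.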

Similar to the heat situation, NVIDIA's card does well on power in benchmarks, but not quite as resoundingly well in gaming.  When running FurMark, the stock model draws roughly 455 watts on average, while the AMD card pulls roughly 520 watts.  However, the NVIDIA card consumes roughly 10 more watts when gaming.

IV. Conclusions

While the GTX 590, like its AMD doppelgänger, seems impressive, it falls short in that it gets beaten by a cheaper package of two cards.  So which ridiculous, overpriced single card is better?  That depends on what you value.

NVIDIA wins on temperature, noise, and power.  AMD arguably still holds the performance crown by a narrow margin.  

That said, the scale might tip in NVIDIA's favor ever so slightly given its card’s CUDA GPU computing capabilities, which are increasingly being used by researchers.
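
For readers curious what "GPU computing" actually looks like, here is a minimal, hypothetical CUDA sketch (a simple SAXPY kernel; all names and sizes are chosen for illustration rather than taken from NVIDIA's materials) of the kind of data-parallel work researchers offload to cards like this:

#include <cstdio>
#include <cuda_runtime.h>

// SAXPY: y[i] = a * x[i] + y[i], one GPU thread per array element.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;                 // ~1 million elements
    const size_t bytes = n * sizeof(float);

    // Fill two arrays on the CPU.
    float *hx = new float[n];
    float *hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Copy them to the graphics card's memory.
    float *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover every element.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    // Copy the result back and spot-check it.
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f (expected 4.0)\n", hy[0]);

    cudaFree(dx);
    cudaFree(dy);
    delete[] hx;
    delete[] hy;
    return 0;
}

Each stream processor handles many such per-element operations in parallel, which is why this sort of workload maps so well onto a card with 1024 of them.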

Ultimately, though, despite the shine and allure, this is a card rational customers simply should not buy, outside a few highly specialized scenarios.



Comments



Yawn
By Pirks on 3/24/2011 11:29:50 AM , Rating: 1
There are no games to take advantage of these $700 monsters anyway.

Crytek went console; Crysis 2 is not bad, but it's console-friendly and hence doesn't require these to get a high frame rate. And how many people run 2560x1600 with 16x AA?

Anyway, for 99% of people a GTX 570 (or lower) is a way better investment; it will run all the games with high framerates, and you can always raise the framerate by dropping AA to 2x or turning it off, no big deal.

And all these dual-GPU scaling and lag issues, meh... too much trouble for no return.




RE: Yawn
By bighairycamel on 3/24/2011 11:41:58 AM , Rating: 5
These cards have never been about practicality, so I never understand why they're critiqued that way. Sure, a Ferrari isn't practical in the least, but the fraction of people who can afford one will buy one.


RE: Yawn
By Mitch101 on 3/24/2011 3:36:41 PM , Rating: 3
It all depends on how many monitors you use and how much resolution you're trying to push in a cutting-edge game with a lot of detail.

I use 3 monitors, running games at 5760x1080 or 3840x800. Once you game this way you never want to go back to a single screen; it feels like wearing blinders.

Before someone says that's cheating: everyone has peripheral vision, so I don't see this as cheating. My eyeballs can't cheat, that I know of. You see stuff out of the corner of your eye, so widescreen gaming is just more natural and immersive than a single monitor view. If I couldn't detect movement from the side areas of my eye then there would be no benefit to widescreen gaming. I would love to go 5 monitors wide because that's closer to my peripheral vision's range. But 3 monitors loses the view angle, and I would imagine with 5 monitors the outermost would provide little advantage until games are written to have widescreen vision similar to the way our eyes work.

Price:
It's not that expensive either; you already have monitor 1 and the video card, you just need to purchase two more monitors.

As for the person who thinks getting a 42" monitor is the same as triple screens, it's not. A 42" screen is just larger than, say, a 21" screen, but both display the same visual area/angle. Three 21" screens have a more panoramic view.


RE: Yawn
By B3an on 3/25/2011 1:22:54 PM , Rating: 1
quote:
Once you game this way you never want to go back to single screen it feels like wearing blinders.


I've got three 30" 2560x1600 IPS monitors which are the perfect thing for this (7680x1600), yet I still game on just one of them.
Multi-monitor gaming is not an improvement. You have bezels in the way and a distorted FOV, as games were just not designed for this wide an aspect. It works OK with some games, but for FPS and racing everything on the two outer monitors is stretched and distorted. And no matter how much I use it I can't get used to the bezels. There are also many games that it doesn't work with.
So it's more down to personal taste and the quality/visual standards of the person.


RE: Yawn
By Ammohunt on 3/27/2011 9:53:45 AM , Rating: 3
You know, you are right! I think I am going to run out and buy 4 32" monitors and 4 GTX 590's! I was going to buy a car so I can go to work, but now I can play Crysis at 500 FPS.


RE: Yawn
By Adonlude on 3/24/2011 11:47:23 AM , Rating: 3
But then there are enthusiasts who build super PCs just because they can. They have programs like CPUz, Afterburner, Prime95, Folding@Home, PCMark and 3DMark on their computers. These people might just buy two of these monsters and SLI them.


RE: Yawn
By spread on 3/24/2011 12:16:23 PM , Rating: 2
quote:
There are no games to take advantage of these $700 monsters anyway.


Mods. Even if the company drops the ball the modding community usually comes out with something far better.

Look at Stalker. The game looked like crap until the modding community released new textures, shaders, etc. Stalker Complete 2009 is the end result and it looks amazing.

You can also use this card for rendering. CUDA with a nice unbiased renderer? Yes please.


RE: Yawn
By Pirks on 3/24/2011 12:32:57 PM , Rating: 1
Even Stalker 2009 is way too weak rendering-wise for these cards. The game industry has completely consolized by now; my AMD 5850 runs the pretty recent and good-looking Darksiders at 1920x1080 at like 90 fps or something. Consoles just killed nVidia's bottom line; no need to buy new GPUs these days, maybe every 3 years, and even that could be too often. Poor nVidia.


RE: Yawn
By ClownPuncher on 3/24/2011 1:56:21 PM , Rating: 2
http://www.hardocp.com/article/2011/03/24/asus_gef...

I think this pretty much sums up what people use these cards for.


RE: Yawn
By kmmatney on 3/24/2011 2:40:48 PM , Rating: 2
From that review:

quote:
We have inquired as to how many units will be for sale in North America. Officially NVIDIA will only tell us, "Thousands will be available at launch worldwide." We have heard unsubstantiated rumors that there will be very few of these GTX 590 cards available. So depending on demand, the GeForce GTX 590 may or may not be a "collector's item" soon.


It's a niche product


RE: Yawn
By Pirks on 3/24/11, Rating: -1
RE: Yawn
By ClownPuncher on 3/24/11, Rating: 0
RE: Yawn
By Skywalker123 on 3/24/2011 8:29:06 PM , Rating: 2
Scaling isn't an issue anymore, and less than 1% of people will buy these cards, so what's your point besides being a master of the obvious?


RE: Yawn
By KoolAidMan1 on 3/25/2011 12:04:51 AM , Rating: 2
They make sense if you have 27" or 30" displays, or if you do any sort of surround gaming.

It isn't for the majority of people out there. It certainly wasn't for me until I started gaming at 2560x1440; now I'm using two GTX 460 cards in SLI. I expect that I'll be on the "high-ish" end until that kind of performance trickles down to the mid-range cards.


RE: Yawn
By brandonicus on 3/25/2011 5:11:08 AM , Rating: 2
Right, because no game will ever use more than these cards can offer.

/sarcasm

Future-proofing is something everyone attempts in some way or another when buying a new system... even though they know it will be to no avail. Especially if you play PC games, or you simply have no budget.


RE: Yawn
By theapparition on 3/25/2011 6:39:38 AM , Rating: 2
quote:
There are no games to take advantage of these $700 monsters anyway.

There is more to life than just games. Widen your gaze.


Power Numbers?
By Natfly on 3/24/2011 12:50:30 PM , Rating: 4
quote:
The card also draws about 65 watts less. The stock model sucks up ~455 watts in average benchmarks, while AMD card pulls ~520 watts.


Where did you get the power numbers from? I ask because this seems to be the opposite of what was determined on AT.

http://images.anandtech.com/graphs/graph4239/36071...
http://images.anandtech.com/graphs/graph4239/36070...

Furmark has never been a good indicator of power consumption, and that rings even more true today due to totally different throttling limits.




Power numbers are wrong
By nafhan on 3/24/2011 11:52:04 AM , Rating: 3
I, of course, don't have either of those cards, but from what I've seen in reviews... the only place there's as much of a power usage delta as described here is in artificial benchmarks such as Furmark, where Nvidia heavily (and artificially) throttles the card's power usage and speed. For the most part, power usage seems fairly close, with the stock-clocked 6990 using less power and the 6990 OC using more than the 590.
Also, the Anandtech and Tech Report reviews have the 590 being slightly warmer than the 6990, which is what you'd expect for a quieter card using a similar amount of power and a similar cooler.




By Hafgrim on 3/24/2011 12:15:51 PM , Rating: 2
title says it all...

So can this crazy Nvidia card run 3D Vision Surround Technology on its own or do you still need two cards?? lol.

=)




lol
By adiposity on 3/24/2011 2:14:22 PM , Rating: 1
quote:
AMD's dual-GPU Radeon HD 6990 card was a bit of a preposterous preposition to consumers.


Alliteration fail. lol




RE: lol
By adiposity on 3/24/2011 2:17:20 PM , Rating: 1
quote:
This tie is somewhat negated by the fact that AMD's wins are a bit higher quality in so much that it gets better framerates in games which tend to get lower framerates (like Crysis), where as NVIDIA gets ridiculously high framerates in games with already good framerates (like HAWX).

...

Ultimately the worst thing about NVIDIA's performance is the same as the worst thing about AMD's, though -- the card is beat by a pair of single-GPU cards in SLI that cost less.


That's some awkward wording, there.


By Codeman03xx on 3/24/2011 6:23:10 PM , Rating: 2
Didn't Nvidia/AMD get sued for price fixing already? Honestly this is BS. I never buy a card until I find out its temps. The card should hold around 50C-70C MAX, and I mean max. NO REASON A CARD SHOULD BE RUNNING AT 100C, FREAKING BOILING, BS. Any-who, Nvidia/AMD will keep ripping people off until someone really sues them bad.




"This week I got an iPhone. This weekend I got four chargers so I can keep it charged everywhere I go and a land line so I can actually make phone calls." -- Facebook CEO Mark Zuckerberg














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki