
Radeon series GPU holds a slight edge in multi-monitor gaming

The latest round of the graphics war is over. Advanced Micro Devices, Inc. (AMD) came storming out first [1][2][3], but in the end it was NVIDIA Corp. (NVDA) who seized the performance crown with its GeForce GTX 680 Kepler GPU.  And that's where the story might have ended, had AMD not said "not so fast."

AMD countered with a specially binned Radeon HD 7970 "Gigahertz Edition" (GE).  The new card is essentially identical to the original Radeon HD 7970, which shipped at the end of January.  But it bumps the core clock from 925 MHz up to 1000 MHz (hence the "GHz" part) and the memory clock from 5.5 GHz to 6 GHz.

But the majority of the gains come from a new set of drivers that in some ways mirror NVIDIA's Kepler drivers by providing a "Boost" mode.  Unlike the Kepler drivers, AMD locks the card to a fixed maximum boost clock -- 1050 MHz -- which is rather nice, given that you know exactly what you're getting.  This contrasts with NVIDIA, which guarantees a minimum boost clock, but whose maximum boost clock is somewhat random depending on chip quality -- essentially luck of the draw.
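The difference between the two boost schemes can be sketched in a few lines. This is purely illustrative, not vendor code: the 1050 MHz cap is the article's figure for the 7970 GE, while the 1058 MHz guaranteed minimum for the GTX 680 is NVIDIA's advertised boost spec (not stated in the article), and the clamping logic is a simplification.

```python
# Illustrative sketch of the two boost behaviors described above.
# Numbers and logic are simplified assumptions, not vendor firmware.

def amd_boost_clock(requested_mhz: int) -> int:
    """AMD's scheme: every 7970 GE boosts to the same fixed ceiling."""
    AMD_BOOST_CAP_MHZ = 1050  # article's figure for the 7970 GE
    return min(requested_mhz, AMD_BOOST_CAP_MHZ)

def nvidia_boost_clock(requested_mhz: int, chip_max_mhz: int) -> int:
    """NVIDIA's scheme: a guaranteed minimum boost, but the actual
    ceiling varies per chip -- "luck of the draw"."""
    GUARANTEED_MIN_MHZ = 1058  # GTX 680 advertised boost (assumption)
    return max(GUARANTEED_MIN_MHZ, min(requested_mhz, chip_max_mhz))

# Two otherwise-identical AMD cards behave identically:
print(amd_boost_clock(1200))                        # capped at 1050
# Two NVIDIA cards with different silicon quality diverge:
print(nvidia_boost_clock(1200, chip_max_mhz=1110))  # a "lucky" chip
print(nvidia_boost_clock(1200, chip_max_mhz=1058))  # a minimum-spec chip
```

The point of the sketch is that AMD buyers know their ceiling in advance, while NVIDIA buyers know only their floor.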

Gigahertz Edition

That's all well and good, but the compelling question is whether the Gigahertz Edition was worth alerting the press and claiming to steal NVIDIA's thunder.  Well, it turns out it does provide a significant boost over the base model, enough to put it in contention once more for the graphics crown.

The card trades blows with the GeForce GTX 680 in AnandTech's testing, with neither card managing a convincing victory.  There are only a couple of areas where one card clearly wins.  One is the multi-monitor tests, where AMD's card is the clear winner.  But in power, noise, and heat, NVIDIA's card clearly wins.

The most important issue thus becomes price.  Both the GTX 680 and the new HD 7970 GE are $500 USD.  Thus the graphics race is essentially a dead heat.

Moving down the ladder, AMD offers the base HD 7970 for $430 USD.  For $30 less, NVIDIA offers the GTX 670 at $400 USD -- which predictably delivers a bit less performance than the HD 7970.  Then there's AMD's HD 7950 at $360 USD, which is in turn a bit slower than the GTX 670.

In other words, smart pricing from both companies means that the best card for you in this round of the graphics war depends on which price point you're shopping at.  At the points occupied by AMD, AMD wins; at the points occupied by NVIDIA, NVIDIA wins.  But at the top there's now no longer a clear winner.

Sources: AMD, Anandtech






RE: Unimpressive due to power/heat
By corduroygt on 6/22/2012 12:23:25 PM , Rating: 1
Also boo on Jason to completely ignore power/heat where GTX 680 is superior. It is a big consideration along with price.


RE: Unimpressive due to power/heat
By JasonMick (blog) on 6/22/2012 12:32:06 PM , Rating: 5
quote:
Also boo on Jason to completely ignore power/heat where GTX 680 is superior. It is a big consideration along with price.
Fair enough, I added that to the piece. I still would call it a "tie" as if you truly do not favor either manufacturer, it would all boil down to what you want.

If you want the highest multi-monitor performance, a bit extra power and heat is probably worth it -- a win for AMD. If you want to put the card into some smaller HTPC enclosure and don't care about multi-monitor then NVIDIA clearly wins.

It all depends on what kind of system you're building, so again I'd say the term "tie" is accurate. That's even the term Ryan Smith (Anandtech) used at certain points in his analysis.


RE: Unimpressive due to power/heat
By crimson117 on 6/22/2012 12:47:42 PM , Rating: 2
And importantly, he notes they are "anything but equal" when you look at specific games, so consider which games you'll be playing:
quote:
The 7970GE scores some impressive wins in Crysis and DiRT 3, while NVIDIA manages to hold on to their substantial leads in Battlefield 3 and Portal 2


By Obujuwami on 6/22/2012 2:00:55 PM , Rating: 3
Clearly you haven't been a DIYer or gamer long enough to know that game designers design their games for specific video card manufacturers. EA, who did C&C3, put a GIANT nVidia logo right after its intro to point out that you should be using that card.

Studios do this, it's a given, and it's something that shouldn't be included OR you should give the same test to a known game that favors each chip maker.


By corduroygt on 6/22/2012 1:08:04 PM , Rating: 2
quote:
Fair enough, I added that to the piece. I still would call it a "tie" as if you truly do not favor either manufacturer, it would all boil down to what you want.

I disagree; the GTX 480 beat the 5870 back in the day, but it was such a power hog that most did not call it equal and favored AMD instead. Same with the GTX 580 vs. 6970. Finally NV has turned the tables and makes the better GPU this generation.

I was an AMD fan for 4xxx/5xxx/6xxx, but now I'm a NV fan for GTX 6xx. Performance/power is the criteria to beat for me.


RE: Unimpressive due to power/heat
By Mitch101 on 6/22/2012 12:36:19 PM , Rating: 2
I'll throw in NOISE: the NVIDIA is 11.9 dB quieter. And I really wish they had a Crossfire comparison. A very common config is two Radeon 6850s. I'm not sure what the common SLI setup is on the NVIDIA side, but I like to see those benchmarks in the review. Sometimes it's cheaper to get two cards instead of one mammoth card and even end up faster.


RE: Unimpressive due to power/heat
By bug77 on 6/22/2012 1:19:08 PM , Rating: 2
Considering that Crossfire and SLI combined have a market share of under 5%, I would say two Radeon 6850s is not that common.


RE: Unimpressive due to power/heat
By Mitch101 on 6/22/2012 2:09:46 PM , Rating: 2
5% is pretty good.

It would be more if people knew that dual mid range cards are faster than a single top end card.


RE: Unimpressive due to power/heat
By bug77 on 6/22/2012 6:59:24 PM , Rating: 2
I don't think so. Two cards will always eat more power than a single, more powerful one (within the same generation, of course). If nothing else, because of the need for redundant memory. And then you have to worry about support for each title, micro stuttering, and God knows what else. Whereas a single GPU just works.
There's always a reason when a technology fails in the market. This time, it's not the lack of marketing.


RE: Unimpressive due to power/heat
By Trisped on 6/22/2012 8:35:41 PM , Rating: 2
I think it would take more than just this knowledge.
Running dual cards requires much more power, a supporting motherboard, and supporting software (unless things have changed in the last 5 years).


RE: Unimpressive due to power/heat
By Trisped on 6/22/2012 8:37:40 PM , Rating: 2
Hmm, seems Bug77 said it first and better.
I should have refreshed before I replied. :(


By silverblue on 6/25/2012 4:38:48 AM , Rating: 2
It's very unlikely that any third party supplier will use the stock AMD cooler for this. That in itself should make for quieter cards, though probably not down to 680 levels.

The cheapest method appears to be getting a stock 7970, adding a replacement cooler and overclocking the thing. You could end up with a cooler and quieter GHz Edition without having to pay the premium for it.


RE: Unimpressive due to power/heat
By Totally on 6/22/2012 1:43:40 PM , Rating: 2
and people didn't totally overlook that same fact with the 580?


RE: Unimpressive due to power/heat
By encia on 6/22/2012 6:40:51 PM , Rating: 2
Also boo on corduroygt to completely ignore 64bit DP FP where 7950/7970 is superior. It is a big consideration along with price.


By corduroygt on 6/22/2012 9:57:35 PM , Rating: 2
Define "big consideration"
I'd wager no more than 5% of the people buying this card have a need for 64-bit DP FP. These are sold as gaming cards first and foremost.


By someguy123 on 6/23/2012 2:30:57 AM , Rating: 2
You know, in synthetic tests, AMD was already ahead even last cycle. This doesn't mean much if the software isn't there, though. The only place I've seen proper OCL implementation is in parts of adobe products/rendering software, meanwhile I've seen cuda implementation in a few encoders and denoising programs. In the case of software like 3dsmax or maya, you'd probably be going for quadro/tesla/firegl for proper driver support.


Copyright 2014 DailyTech LLC.