
Radeon series GPU holds a slight edge in multi-monitor gaming

The latest round of the graphics war is over. Advanced Micro Devices, Inc. (AMD) came storming out first [1][2][3], but in the end it was NVIDIA Corp. (NVDA) who seized the performance crown with its GeForce GTX 680 Kepler GPU.  And that's where the story might have ended, had AMD not said "not so fast."

AMD countered with a specially binned Radeon HD 7970 "Gigahertz Edition" (GE).  The new card is essentially identical to the original Radeon HD 7970, which shipped at the end of January.  But it bumps the core clock from 925 MHz up to 1000 MHz (hence the "Gigahertz" moniker) and the memory clock from 5.5 GHz to 6 GHz.

But the majority of the gains come from a new set of drivers that in some ways mirror NVIDIA's Kepler drivers by providing a "Boost" mode.  Unlike NVIDIA, AMD locks its cards to a fixed maximum boost clock -- 1050 MHz -- which is rather nice, given that you know exactly what you're getting.  This contrasts with NVIDIA, which guarantees a minimum boost clock but whose maximum boost clock varies with chip quality -- essentially the luck of the draw.
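The difference between the two boost schemes can be illustrated with a minimal sketch (not vendor code; the NVIDIA headroom range is an arbitrary assumption for illustration, and the GTX 680 figures are its published base/boost clocks):

```python
import random

# AMD HD 7970 GHz Edition: fixed boost ceiling -- every card behaves the same.
AMD_BASE_MHZ = 1000
AMD_BOOST_MHZ = 1050

# NVIDIA GTX 680: only the minimum boost clock is guaranteed.
NV_BASE_MHZ = 1006
NV_BOOST_MIN_MHZ = 1058

def amd_boost_clock() -> int:
    """Every GHz Edition card boosts to the same fixed ceiling."""
    return AMD_BOOST_MHZ

def nvidia_boost_clock(rng: random.Random) -> int:
    """Actual ceiling depends on silicon quality ("luck of the draw").
    The 0-60 MHz of extra headroom is purely illustrative."""
    return NV_BOOST_MIN_MHZ + rng.randint(0, 60)

rng = random.Random(0)
print("AMD max boost:", amd_boost_clock(), "MHz")        # always 1050
print("NV  max boost:", nvidia_boost_clock(rng), "MHz")  # varies per chip
```

The practical upshot: two retail HD 7970 GE cards should perform identically, while two retail GTX 680s may not.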

Gigahertz edition

That's all well and good, but the compelling question is whether the Gigahertz Edition justified alerting the press and claiming to steal NVIDIA's thunder.  It turns out it does provide a significant boost over the base model -- enough to put it back in contention for the graphics crown.

The card trades blows with the GeForce GTX 680 in AnandTech's testing, with neither card managing a convincing victory.  Only a couple of areas produce a clear winner.  One is the multi-monitor tests, where AMD's card comes out on top.  But in power, noise, and heat, NVIDIA's card clearly wins.

The most important issue thus becomes price.  Both the GTX 680 and the new HD 7970 GE are $500 USD.  Thus the graphics race is essentially a dead heat.

Moving down the ladder, AMD offers the base HD 7970 for $430 USD.  For $30 USD less, NVIDIA offers the $400 USD GTX 670 -- which predictably delivers a bit less performance than the HD 7970.  Then there's AMD's HD 7950 at $360 USD, which in turn performs a bit below the GTX 670.

In other words, smart pricing from both companies means that the best card for you in this round of the graphics war depends on which price point you're shopping at.  At the points occupied by AMD, AMD wins; at the points occupied by NVIDIA, NVIDIA wins.  But at the top, there's no longer a clear winner.

Sources: AMD, Anandtech


Unimpressive due to power/heat
By corduroygt on 6/22/2012 12:21:39 PM , Rating: 3
The GTX 580 went head to head with the 6970, but it wasn't impressive overall due to excessive power usage and heat. This is the same situation in reverse.

RE: Unimpressive due to power/heat
By corduroygt on 6/22/2012 12:23:25 PM , Rating: 1
Also, boo on Jason for completely ignoring power/heat, where the GTX 680 is superior. It is a big consideration along with price.

RE: Unimpressive due to power/heat
By JasonMick on 6/22/2012 12:32:06 PM , Rating: 5
Also, boo on Jason for completely ignoring power/heat, where the GTX 680 is superior. It is a big consideration along with price.
Fair enough, I added that to the piece. I'd still call it a "tie" since, if you truly don't favor either manufacturer, it all boils down to what you want.

If you want the highest multi-monitor performance, a bit extra power and heat is probably worth it -- a win for AMD. If you want to put the card into some smaller HTPC enclosure and don't care about multi-monitor then NVIDIA clearly wins.

It all depends on what kind of system you're building, so again I'd say the term "tie" is accurate. That's even the term Ryan Smith (Anandtech) used at certain points in his analysis.

RE: Unimpressive due to power/heat
By crimson117 on 6/22/2012 12:47:42 PM , Rating: 2
And importantly, he notes they are "anything but equal" when you look at specific games, so consider which games you'll be playing:
The 7970GE scores some impressive wins in Crysis and DiRT 3, while NVIDIA manages to hold on to their substantial leads in Battlefield 3 and Portal 2

By Obujuwami on 6/22/2012 2:00:55 PM , Rating: 3
Clearly you haven't been a DIYer or gamer long enough to know that game designers design their games for specific video manufacturers. EA, who did C&C3, put a GIANT nVidia logo just after its intro to point out that you should be using that card.

Studios do this, it's a given, and it's something that shouldn't be included -- OR you should run the same test with a known game that favors each chip maker.

By corduroygt on 6/22/2012 1:08:04 PM , Rating: 2
Fair enough, I added that to the piece. I'd still call it a "tie" since, if you truly don't favor either manufacturer, it all boils down to what you want.

I disagree. The GTX 480 beat the 5870 back in the day, but it was such a power hog that most did not call it equal and favored AMD instead. Same with the GTX 580 vs. 6970. Finally NV has turned the tables, and they make the better GPU this generation.

I was an AMD fan for the 4xxx/5xxx/6xxx series, but now I'm an NV fan for the GTX 6xx. Performance/power is the criterion to beat for me.

RE: Unimpressive due to power/heat
By Mitch101 on 6/22/2012 12:36:19 PM , Rating: 2
I'll throw in NOISE: the NVIDIA is 11.9 dB quieter. I really wish they had a Crossfire comparison. A very common config is two Radeon 6850s. I'm not sure what the common SLI config is on the NVIDIA side, but I like to see those benchmarks in a review. Sometimes it's cheaper to get two cards instead of one mammoth card, and even faster.

RE: Unimpressive due to power/heat
By bug77 on 6/22/2012 1:19:08 PM , Rating: 2
Considering that Crossfire and SLI combined have a market share of under 5%, I would say two Radeon 6850s is not that common.

RE: Unimpressive due to power/heat
By Mitch101 on 6/22/2012 2:09:46 PM , Rating: 2
5% is pretty good.

It would be more if people knew that dual mid range cards are faster than a single top end card.

RE: Unimpressive due to power/heat
By bug77 on 6/22/2012 6:59:24 PM , Rating: 2
I don't think so. Two cards will always eat more power than a single, more powerful one (within the same generation, of course) -- if nothing else, because of the need for redundant memory. And then you have to worry about per-title support, micro stuttering, and God knows what else. Whereas a single GPU just works.
There's always a reason when a technology fails in the market. This time, it's not the lack of marketing.

RE: Unimpressive due to power/heat
By Trisped on 6/22/2012 8:35:41 PM , Rating: 2
I think it would take more than just this knowledge.
Running dual cards requires much more power, a supporting motherboard, and supporting software (unless things have changed in the last 5 years).

RE: Unimpressive due to power/heat
By Trisped on 6/22/2012 8:37:40 PM , Rating: 2
Hmm, seems Bug77 said it first and better.
I should have refreshed before I replied. :(

By silverblue on 6/25/2012 4:38:48 AM , Rating: 2
It's very unlikely that any third party supplier will use the stock AMD cooler for this. That in itself should make for quieter cards, though probably not down to 680 levels.

The cheapest method appears to be getting a stock 7970, adding a replacement cooler and overclocking the thing. You could end up with a cooler and quieter GHz Edition without having to pay the premium for it.

RE: Unimpressive due to power/heat
By Totally on 6/22/2012 1:43:40 PM , Rating: 2
And didn't people totally overlook that same fact with the 580?

RE: Unimpressive due to power/heat
By encia on 6/22/2012 6:40:51 PM , Rating: 2
Also, boo on corduroygt for completely ignoring 64-bit DP FP, where the 7950/7970 is superior. It is a big consideration along with price.

By corduroygt on 6/22/2012 9:57:35 PM , Rating: 2
Define "big consideration."
I'd wager no more than 5% of the people buying this card have a need for 64-bit DP FP. These are sold as gaming cards first and foremost.

By someguy123 on 6/23/2012 2:30:57 AM , Rating: 2
You know, in synthetic tests AMD was already ahead even last cycle. This doesn't mean much if the software isn't there, though. The only place I've seen a proper OpenCL implementation is in parts of Adobe products/rendering software, whereas I've seen CUDA implementations in a few encoders and denoising programs. In the case of software like 3ds Max or Maya, you'd probably be going for Quadro/Tesla/FireGL for proper driver support.

and now
By shin0bi272 on 6/22/2012 12:17:42 PM , Rating: 2
NVIDIA will now announce the GTX 685, built on the GK110 and offering a 512-bit memory bus and 25% faster performance in games.

RE: and now
By crimson117 on 6/22/2012 12:51:27 PM , Rating: 2
NVIDIA will now announce the GTX 680+, built on the GK110 and offering a 512-bit memory bus and 25% faster performance in games.

Fixed the product name for you...

RE: and now
By BruceLeet on 6/22/2012 3:49:43 PM , Rating: 2
I think BigK is being saved for the 700 series. Why push BigK as a 685? GK110 as the GTX 780 sounds more appropriate; the GTX 680 is only GK104.

RE: and now
By Totally on 6/22/2012 4:24:13 PM , Rating: 2
Wasn't GK110 a Tesla only part? Why do people still hold onto this notion?

RE: and now
By tviceman on 6/22/2012 5:18:18 PM , Rating: 2
It is being released FIRST as a Tesla-only part. It will be released as a GeForce product sometime after that. NVIDIA has never made a GPU that was Tesla-only.

RE: and now
By shin0bi272 on 6/26/2012 1:40:14 AM , Rating: 2
I'm just going by the roadmap they released last year and the rumored names being thrown around.

We can also see, with the 670's performance being only 5% less than the 680's, that the 256-bit memory bus is hindering the 680. Even hamstrung by that, the 680 is still faster than what AMD is offering, which is why they called it the 680, from what I heard. Otherwise they would have called it the 650 and released the midrange card first (like it was rumored AMD was going to do, due to issues with the high-K 28nm process at TSMC for a while), making the GK110 the flagship 680 card... but when AMD's card wasn't as powerful as expected, they changed the plan, called the 650 the 680, and now plan on releasing the GK110 in Q4 this year.

The chip actually taped out back in January, from what I heard... which means 6-8 months to launch, so that would put it right at the end of Q3 for seeing it in stores. Whether they label it the 780 or the 685 is trivial... it's just a sticker. I want the performance, not the name.

