
The Radeon HD 6990 (top, both pictures) atop the Radeon HD 5970 (bottom, both pictures), AMD's previous dual-GPU, single-card solution.   (Source: Anandtech)

The Radeon 6990 is absolutely the king of single-card performance, but its price makes it essentially unsellable.  (Source: Anandtech)
AMD's new card sounds like a jumbo jet and will break the bank, but it is fast

Like the dual-GPU chip Radeon HD 5970, the Radeon HD 6990 [press release] may be a victory lap of sorts for AMD.  Unlike the Radeon 5000 series, this generation the chipmaker faced tougher competition, with rival NVIDIA actually delivering in a (relatively) timely fashion an impressive set of GPUs -- the Geforce 500 Series.  But sales reports indicate that AMD has maintained its lead over NVIDIA, despite this recovery.

What better way to celebrate than to make a decadent and superfluous, but utterly powerful single card offering?

I.  More Beast Than Beauty

While the Radeon HD 6990 is a one-PCB card, it essentially acts like two.  It has two distinct vapor chambers, each with its own heatsink.  The GPU chips are linked by an internal CrossFire connection.  The card carries 4 GB of GDDR5 RAM, with each GPU essentially getting 2 GB -- the same as AMD's single-GPU offerings.

Performance-wise the card acts much like two cards as well.

In testing by Anandtech, the card sucked down over 490 watts of power during intense gaming benchmarks.  The good news is that its heat dissipation was impressive, allowing the card to stay at a relatively cool 88 degrees Celsius -- one of the best things about the card.

The card can be aggressively overclocked on air to at least 830 MHz, but it requires power aplenty (almost 550 watts, to be precise).

Something not so impressive is the roaring fan speed required to keep the card running so cool.  NVIDIA -- long the butt of many a joke for its Geforce 400 Series' fan noise -- can breathe a sigh of relief.  If a Geforce 400 was a noisy lawn mower, the Radeon HD 6990 might as well be a jumbo jet.  It generates 70.2 decibels under load -- almost 4 dB more than a pair of SLIed Geforce GTX 580s.  When overclocked, the card reached 77 (!) decibels.  At those levels, you might want to invest in a nice pair of gamer ear-plugs.
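For context on what those decibel gaps actually mean, here's a rough sketch (the helper function is hypothetical; the dB readings are the load figures above).  Decibels are logarithmic, with each 10 dB corresponding to a tenfold increase in sound intensity, so the jump from 70.2 dB to 77 dB is bigger than it looks.

```python
# Rough sketch: how much more acoustic power one dB reading implies
# versus another. The helper name is hypothetical; the readings are
# the load figures cited in the article.
def intensity_ratio(db_a: float, db_b: float) -> float:
    """Sound-intensity ratio implied by two decibel readings."""
    return 10 ** ((db_a - db_b) / 10)

# Overclocked (77 dB) vs. stock under load (70.2 dB):
print(round(intensity_ratio(77.0, 70.2), 1))  # ~4.8x the sound power
```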

So how's the performance?  Well, the good news is that this card easily beats any single card on the market right now from NVIDIA or AMD.  Sure, that's because it's essentially two cards, but if the competition criterion is "single card", the Radeon HD 6990 is king.

Against two cards it falls flat, though.  It consistently trails a pair of CrossFired AMD Radeon HD 6970 cards by approximately 8 percent.  And in some titles, like Crysis: Warhead, it even falls behind a pair of Geforce GTX 580s in SLI.

II. "Huh, yeah, what is it good for?" -- Edwin Starr

When it comes to what kind of utility this exercise in extreme single-card power may have, the answer isn't quite "absolutely nothing", but it's pretty close.

Priced at $700 USD, the card is outperformed by a pair of HD 6970s -- approximately $640 USD -- and roughly equaled by a pair of HD 6950s -- approximately $520 USD.
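The price gaps work out as follows -- a trivial sketch using the approximate street prices above (the variable names are illustrative):

```python
# Approximate street prices cited above (USD).
hd6990 = 700
hd6970_crossfire_pair = 640  # roughly $320 per card
hd6950_crossfire_pair = 520  # roughly $260 per card

# The 6990's premium over each dual-card alternative:
print(hd6990 - hd6970_crossfire_pair)  # 60
print(hd6990 - hd6950_crossfire_pair)  # 180
```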

So why in the world would you buy a noisier, less powerful card that's $60 more expensive?  Well, there are a couple of reasons why you might -- but they're uncommon.

One is if you have to have 5 monitors.  Currently only the 5870 Eyefinity 6 and this card support driving 5 monitors from a single- or dual-card solution.  So if you have to have 5 monitors and you're willing to pay for top power, this is the card for you.

Second, if your board has only two full PCI-E x16 slots, spaced at least 3 slots apart, and is CrossFire-ready, this is your ultimate solution -- heat, noise, and power consumption be damned.  Of course, that's because you're really squeezing 4 GPUs into two slots, but that's a minor technicality.

The final candidate would be a very small subset of single-card builds.  Generally, if you have only a single slot, your board/case is too small to sufficiently cool the HD 6990.  But it is possible; there are a couple of small systems out there that could muster sufficient cooling.  Arguably, even if such a configuration doesn't exist already, someone could take a micro-ATX case, put a single HD 6990 in it, and pair it with a couple of ridiculously huge fans covering the walls (maybe 200 mm?).

That would yield a compact, powerful, yet ridiculously expensive system, but the upside is that it would offer the HD 6990 in a package that might appeal to a select few.

Unless AMD somehow cooks up octa-GPU CrossFire drivers, this card, while a marvel of engineering, is a novelty/niche product, sure to sell few, if any, units.

Much like the protagonists of F. Scott Fitzgerald's "The Beautiful and Damned", the card cuts an attractive figure but hides ugliness (power, noise, performance) underneath.  The upside is that it's destined for fame, or perhaps infamy; the downside is that its sales are doomed by its decadence.

Comments


By ClownPuncher on 3/8/2011 12:54:31 PM , Rating: 2
Those are some pretty arbitrary temps you are asking for. If the card is designed for it, why does it really matter? Are you going to stuff a $700 card in a $49.99 case? This thing is cooler than a single GTX 480 while being more than twice as powerful. The only thing that would concern me would be fan noise.

By Motoman on 3/8/2011 6:35:08 PM , Rating: 1
Arbitrary? Hardly. And the point that I'm making is that the card isn't so much "designed for it" as it is a result of its design.

...also, the cost of a case is utterly irrelevant. Either a case has adequate airflow or it doesn't - I have seen plenty of $20 cases that have superior airflow to $200 cases.

88 degrees is way too hot for comfort. Even if it remains stable, you're literally burning the life out of that GPU. The cooling system is not adequate...period.

By PrinceGaz on 3/8/2011 9:18:55 PM , Rating: 2
You're not burning the life out of anything if it was designed to withstand that temperature for its expected useful lifespan.

How long are these high-end graphics-cards meant to last? I'd say two years will be fine for most users, or three years at most before they're reading reviews of cards from both manufacturers which are much faster than what they've got. A seriously high-end graphics-card is like a bright star -- it shines very brightly for a while and then explodes at the end of its life, whilst more conservative cards will last many, many years.

Bear in mind the 90+ C was in FurMark with "AUSUM" (possibly the worst acronym in history) enabled, and I think under normal overclocked conditions it will be fine until the day it's a paperweight.

By Motoman on 3/8/2011 11:48:39 PM , Rating: 2
...why burn it out faster than it needs to be? If it lasts 2 years running at 90 degrees, but would last 4 years at 70 degrees...does it really make sense to run it that freaking hot?

Video cards have longer useful lives than you think. If you're playing on a 19" monitor, a Radeon X1900 XT probably would still play WoW at high settings just fine - despite the fact that it's, what, 6 years old?

Just burning something up because you declare you don't care about it makes no sense. It's not that hard to put an adequate cooling system on a video card - if it was, there'd be no aftermarket offering vastly better-than-stock coolers.

By Lerianis on 3/11/2011 5:57:35 AM , Rating: 2
You are missing that these cards are for uber-gamers. Not the ones who are running World of Warcraft (which is comparatively light on GPU resources) but Crysis on Enthusiast settings at the highest resolution.

For those people? 2-3 years IS the usable life of a card. I'd say for some that even one year is the usable life of a card before they are looking for something better.

By Sazabi19 on 3/9/2011 9:19:21 AM , Rating: 2
These cards are rated for over 100 Celsius (not sure of the exact temp rating, but I am positive it is over 100 Celsius)

By Motoman on 3/9/2011 10:01:35 AM , Rating: 2
It's a simple equation - more heat = less life.

The point is simple - a better cooling solution would prolong the life of the card.

What about that point is upsetting you?

By Jcfili on 3/9/2011 4:14:18 PM , Rating: 2
Agree !!!

REAL GAMERS and People that love and care what they buy will go Liquid cooling !!! ^_^..

Problem solved .. no more loud fans ... and no more HEAT!!!

By ClownPuncher on 3/9/2011 11:40:55 AM , Rating: 2
It's arbitrary because you have no numbers backing up your claim that this will degrade the GPU faster. Then we have the fact that those temps were full-load FurMark, which doesn't represent day-to-day gaming in the slightest. There is the other thing you didn't consider: multi-GPU cards are always this hot, yet it doesn't seem like their failure rate is any higher.
