The Radeon HD 6990 (top, both pictures) atop the Radeon HD 5970 (bottom, both pictures), AMD's previous dual-GPU, single-card solution.  (Source: AnandTech)

The Radeon HD 6990 is absolutely the king of single-card performance, but its price makes it essentially unsellable.  (Source: AnandTech)
AMD's new card sounds like a jumbo jet and will break the bank, but it is fast

Like the dual-GPU Radeon HD 5970, the Radeon HD 6990 [press release] may be a victory lap of sorts for AMD.  Unlike with the Radeon 5000 series, this generation the chipmaker faced tougher competition, with rival NVIDIA delivering an impressive set of GPUs -- the GeForce 500 Series -- in a (relatively) timely fashion.  But sales reports indicate that AMD has maintained its lead over NVIDIA despite this recovery.

What better way to celebrate than to make a decadent and superfluous, but utterly powerful single card offering?

I.  More Beast Than Beauty

While the Radeon HD 6990 is a one-PCB card, it essentially acts like two.  It has two distinct vapor chambers, each with its own heat sink.  The GPU chips are linked by an internal CrossFire connection.  The card carries 4 GB of GDDR5 RAM, with each GPU essentially getting 2 GB of its own -- the same as single-card offerings from AMD.

Performance-wise the card acts much like two cards as well.

In testing by AnandTech, the card sucked down over 490 watts of power during intense gaming benchmarks.  The good news is that its heat dissipation was pretty incredible, allowing the card to stay at a cool 88 degrees Celsius -- one of the best things about the card.

The card can be aggressively overclocked on air from its stock 830 MHz to 880 MHz, but it requires power aplenty -- almost 550 watts.

Something not so impressive was the roaring fan speed required to keep the card running that cool.  NVIDIA -- long the butt of many a joke for its GeForce 400 Series' fan noise -- can breathe a sigh of relief.  If a GeForce 400 was a noisy lawn mower, the Radeon HD 6990 might as well be a jumbo jet.  It generates 70.2 decibels under load -- almost 4 dB more than a pair of SLIed GeForce GTX 580s.  When overclocked, the card reached 77 (!) decibels.  At those levels, you might want to invest in a nice pair of gamer earplugs.
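For perspective, decibels are logarithmic, so those gaps are larger than they look.  A quick back-of-the-envelope conversion from a decibel difference to a ratio of sound power:

$$\frac{P_2}{P_1} = 10^{\Delta L / 10}, \qquad 10^{4/10} \approx 2.5, \qquad 10^{(77 - 70.2)/10} \approx 4.8$$

In other words, under load the 6990 puts out roughly two and a half times the sound power of the GTX 580 SLI pair, and when overclocked nearly five times its own stock level.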

So how's the performance?  Well, the good news is that this card easily beats any single card on the market right now from NVIDIA or AMD.  Sure, that's only because it's essentially two cards, but if the competition criterion is "single card", the Radeon HD 6990 is king.

Against two cards it falls flat, though.  It consistently trails a pair of CrossFired AMD Radeon HD 6970 cards by approximately 8 percent.  And in some titles, like Crysis: Warhead, it even falls behind a pair of GeForce GTX 580s in SLI.

II. "Huh, yeah, what is it good for?" -- Edwin Starr

When it comes to what kind of utility this exercise in extreme single-card power may have, the answer isn't quite "absolutely nothing", but it's pretty close.

Priced at $700 USD, the card is outperformed by a pair of HD 6970s (approximately $640 USD) and roughly equaled by a pair of HD 6950s (approximately $520 USD).
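Running the article's own numbers -- roughly 8 percent less performance for roughly 9 percent more money -- gives a quick sense of the value problem:

$$\frac{0.92 / \$700}{1.00 / \$640} \approx 0.84$$

That is, the 6990 delivers only about 84 percent of the CrossFire pair's performance per dollar, before noise and heat even enter the picture.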

So why in the world would you buy a noisier, less powerful card that's roughly $60 more expensive?  Well, there are a couple of reasons why you might -- but they're uncommon.

One is if you have to have 5 monitors.  Currently only the Radeon HD 5870 Eyefinity 6 and this card can drive 5 monitors from a single- or dual-card solution.  So if you must have 5 monitors and you're willing to pay for top power, this is the card for you.

Second, if your board has only two full PCI-E x16 slots, spaced at least 3 slots apart, and is CrossFire-ready, a pair of these cards is your ultimate solution -- heat, noise, and power consumption be damned.  Of course, that's because you're really squeezing 4 GPUs into two slots, but that's a minor technicality.

The final candidate would be a very small subset of single-card boards.  Generally, if you have a single slot, your board/case is too small to sufficiently cool the HD 6990.  But it is possible: there are a couple of small systems out there that could muster sufficient cooling.  Arguably, even if such a configuration doesn't exist already, someone could take a micro-ATX case, put a single HD 6990 in it, and pair it with a couple of ridiculously huge fans that cover the walls (maybe 200 mm?).

That would yield a compact, powerful, yet ridiculously expensive system, but the upside is that it would offer the HD 6990 in a package that might appeal to a select few.

Unless AMD somehow cooks up octa-GPU CrossFire drivers, this card, while a marvel of engineering, is a novelty/niche product, sure to sell few, if any, units.

Much like the protagonists of F. Scott Fitzgerald's "The Beautiful and Damned", the card cuts an attractive figure, but hides ugliness (power, noise, performance) underneath.  The upside is that it's destined for fame, or perhaps infamy; the downside is that its sales are doomed by its decadence.

Comments

By Motoman on 3/8/2011 10:03:24 AM , Rating: 4
Sorry...but 88 degrees Celsius is far from cool...under any consideration that might apply here, human or computer. That is wickedly hot, even for a GPU. It categorically demonstrates that the cooling system is dangerously weak and unable to keep the GPUs within optimal operating temps. For stability and longevity, I'd like to see GPUs stay under 70 degrees for sure...preferably under 60 degrees. I like to keep my CPUs down around 40 if at all possible...

By FITCamaro on 3/8/2011 10:42:20 AM , Rating: 1
My old X1950XTX got up to 105 before. But I do agree with you.

By jonmcc33 on 3/8/2011 7:45:57 PM , Rating: 2
Yeah. My Radeon X1900XT got to 95C in a bad air flow case. That was extremely hot. It would warm my room up quite a bit during gaming.

By ClownPuncher on 3/8/2011 12:54:31 PM , Rating: 2
Those are some pretty arbitrary temps you are asking for. If the card is designed for it, why does it really matter? Are you going to stuff a $700 card in a $49.99 case? This thing is cooler than a single GTX 480 while being more than twice as powerful. The only thing that would concern me would be fan noise.

By Motoman on 3/8/2011 6:35:08 PM , Rating: 1
Arbitrary? Hardly. And the point that I'm making is that the card isn't so much "designed for it" as it is a result of its design.

...also, the cost of a case is utterly irrelevant. Either a case has adequate airflow or it doesn't - I have seen plenty of $20 cases that have superior airflow to $200 cases.

88 degrees is way too hot for comfort. Even if it remains stable, you're literally burning the life out of that GPU. The cooling system is not adequate...period.

By PrinceGaz on 3/8/2011 9:18:55 PM , Rating: 2
You're not burning the life out of anything if it was designed to withstand that temperature for its expected useful lifespan.

How long are these high-end graphics cards meant to last? I'd say two years will be fine for most users, or three years at most before they're reading reviews of cards from both manufacturers which are much faster than what they've got. A seriously high-end graphics card is like a bright star -- it shines very brightly for a while and then explodes at the end of its life, whilst more conservative cards will last many many years.

Bear in mind the 90+C was in FurMark with "AUSUM" (possibly the worst acronym in history) enabled, and I think under normal overclocked conditions, it will be fine until the day it's a paperweight.

By Motoman on 3/8/2011 11:48:39 PM , Rating: 2
...why burn it out faster than it needs to be? If it lasts 2 years running at 90 degrees, but would last 4 years at 70 degrees...does it really make sense to run it that freaking hot?

Video cards have longer useful lives than you think. If you're playing on a 19" monitor, a Radeon X1900XT probably would still play WoW at high settings just fine - despite the fact that it's, what, 6 years old?

Just burning something up because you declare you don't care about it makes no sense. It's not that hard to put an adequate cooling system on a video card - if it was, there'd be no aftermarket offering vastly better-than-stock coolers.

By Lerianis on 3/11/2011 5:57:35 AM , Rating: 2
You are missing that these cards are for uber-gamers. Not the ones who are running World of Warcraft (which is comparatively light on GPU resources) but Crysis on Enthusiast settings at the highest resolution.

For those people? 2-3 years IS the usable life of a card. I'd say for some that even one year is the usable life of a card before they are looking for something better.

By Sazabi19 on 3/9/2011 9:19:21 AM , Rating: 2
These cards are rated for over 100 Celsius (not sure of the exact temp rating, but I am positive it is over 100 Celsius).

By Motoman on 3/9/2011 10:01:35 AM , Rating: 2
It's a simple equation - more heat = less life.

The point is simple - a better cooling solution would prolong the life of the card.

What about that point is upsetting you?

By Jcfili on 3/9/2011 4:14:18 PM , Rating: 2
Agree !!!

REAL GAMERS and People that love and care what they buy will go Liquid cooling !!! ^_^..

Problem solved .. no more loud fans ... and no more HEAT!!!

By ClownPuncher on 3/9/2011 11:40:55 AM , Rating: 2
It's arbitrary because you have no numbers backing up your claims that this will degrade the GPU faster. Then we have the fact that those temps were full-load FurMark, which doesn't represent day-to-day gaming in the slightest. There is the other thing you didn't consider; multi-GPU cards are always this hot, yet it doesn't seem like their failure rate is any higher.

By Sazabi19 on 3/9/2011 9:17:54 AM , Rating: 2
My Radeon 4870x2 got up to 96 Celsius a few days ago while testing the new Crysis 2 demo (1600x1200 on max settings); the only thing that happened was the fan revved up to 100% (usually set manually to 60%) and I noticed. This card is geared toward people like me who want a single card with 2 GPUs on it. Noise is not as big a factor to me as staying cooler (I have 8x 120mm fans in my case, all Yate Loons; they were cheap and noisy as hell, but man they move some air through my computer) and I have a 5.1 setup on my rig. Apparently either I am not the only one that wants one of these, or they have a supply issue; Newegg just got these in and every brand is out of stock. I have been eyeing the egg like a hawk and didn't notice them come in, and they are already gone. Can't wait to get one! This one card will generate less heat and consume less power than 2 individual cards, and by the way they are priced on the egg, getting 2 6970s or 1 6990 is the same price, so go for whichever you want more.

By Motoman on 3/9/2011 10:02:30 AM , Rating: 2
My Radeon 4870x2 got up to 96 Celsius a few days ago

...and you think that's a good thing? If so, I have a really nice bridge you might be interested in...

By Sazabi19 on 3/9/2011 10:17:03 AM , Rating: 2
Lol, meaning no damage has been done; it's not the first time, either. Sure, it warms up the room really well and I don't much care for it, but the card is RATED for that and higher. There is no impact on the card except possibly some performance hits (none that I see when gaming and watching FPS). The point you are trying to make is moot. This is posing no danger to the card, which I got when it came out; I have had no problems with it whatsoever.

By boogle on 3/9/2011 10:13:45 AM , Rating: 2
I didn't know you were on the design team at AMD?

There are components designed to operate at well over 100c for many years, similarly there are components designed to operate at well below 0c for many years. The IC itself will have no problem with extremely high temperatures (above 100c easy peasy) - it's the packaging that matters. Suffice it to say, they won't be using the solder you normally buy off the shelf for your LEDs.

I remember when CPUs weren't meant to operate above 40c; any higher and they would 'burn out'. I take it your arbitrary 70c comes from the max design temps of cards like the GeForce 2. Very few GPUs have been designed around a max of 70c for a long, long time. Just look at a laptop to see how components are designed to run at 100c for extended periods of time.

By Motoman on 3/10/2011 10:12:56 AM , Rating: 1
Wow, all of you people are just amazing.

For the record, no, I'm not a semiconductor engineer. However - I will bet you any amount of money that any qualified engineer you find will confirm the following:

More heat = less life.

Heat is the nemesis of, well, lots of stuff...but in this case, especially electronics, and especially semiconductors. Whether you're talking about a CPU, a GPU, a network controller, a north bridge, or a south bridge, it is a perfectly valid point that operating that device at higher temps will generally increase its rate of degradation. Running the device at lower temps will, naturally, comparatively decrease its rate of degradation.

It's basic physics - ergo, unavoidable.
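(For the curious: the standard way reliability engineers quantify this is the Arrhenius acceleration model. An illustrative sketch with an assumed activation energy of 0.7 eV -- not measured data for this or any specific card:

$$AF = \exp\!\left[\frac{E_a}{k}\left(\frac{1}{T_{\text{low}}} - \frac{1}{T_{\text{high}}}\right)\right] = \exp\!\left[\frac{0.7}{8.62 \times 10^{-5}}\left(\frac{1}{343} - \frac{1}{363}\right)\right] \approx 3.7$$

Under that assumption, running at 90 degrees Celsius (363 K) instead of 70 degrees Celsius (343 K) ages the silicon roughly 3-4 times faster.)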

If you want to declare that you're fine with something burning out in a couple years because that's what it was "designed" to do...I guess that's your problem. Personally, I'd rather see a better cooling solution that keeps the device running for several years...if not indefinitely.

By Lerianis on 3/11/2011 6:01:16 AM , Rating: 2
Guess again. The graphics card in my Gateway gaming laptop starts freezing when it gets above 70C, and it's only 3 years old.

A 9800GTS, to be specific, in my machine. You are right that these things are DESIGNED to run at 100C, but that's not been a common thing; it's new.

Retakes? When did they ever lose it?
By animekenji on 3/8/2011 10:37:06 AM , Rating: 5
AMD never lost the fastest single card crown, so how could they retake it? The 5970 is faster than the GTX 580, and the Asus Ares is the fastest of all.

RE: Retakes? When did they ever lose it?
By Beavermatic on 3/8/11, Rating: -1
RE: Retakes? When did they ever lose it?
By animekenji on 3/8/2011 10:55:01 AM , Rating: 2
You deny that the 5970 is faster than the GTX 580 and that the Asus Ares was the fastest video card ever made prior to the 6990?

RE: Retakes? When did they ever lose it?
By Beavermatic on 3/8/11, Rating: -1
RE: Retakes? When did they ever lose it?
By inighthawki on 3/8/2011 12:18:12 PM , Rating: 5
Regardless of how many GPU chips are on the card, it is a SINGLE card, and it WAS the fastest card. You, sir, are just being ignorant.

RE: Retakes? When did they ever lose it?
By Beavermatic on 3/8/11, Rating: -1
By ClownPuncher on 3/8/2011 12:51:25 PM , Rating: 3
Actually, all that really matters is price vs performance. Putting things in "classes" is just a trick people with penis envy use.

By inighthawki on 3/8/2011 12:56:26 PM , Rating: 2
Of course it's not in the same class, in the same way a Ferrari is in a different class than your average Ford, but that doesn't mean both cars are not single cars capable of being compared for performance. You cannot simply disqualify someone for putting more performance in a single package.

You need to understand there is a difference between the fastest card and the fastest chip. NVIDIA quite certainly holds the crown for the fastest single chip, the 580, but what people really care about is the fastest CARD on the market.

RE: Retakes? When did they ever lose it?
By PrinceGaz on 3/8/2011 9:42:41 PM , Rating: 2
Has the concept of fastest single card escaped you?

Fastest single card does not mean fastest single card with one GPU. It means fastest single card regardless of what is on the card, so long as it is one card which fits into one socket. That's easy to understand, yes? You put one card into one socket, and attach your monitor, and then find out which card is fastest. You don't take the card apart to see if it has one or two GPUs underneath the heatsink.

Back in the day, if 3dfx had had their way, we'd have had crazy beasts like the Voodoo 5 6000 with 4 GPUs onboard! In fact, I think this review is the first time since looking at those pre-release samples that I've actually burst into hysterical laughter whilst reading a graphics card review, at the sound level of this monster. I had managed to contain my gasps at the power levels until then.

Graphics cards are going mad, and both AMD and nVidia are equally guilty of power escalation. I know GPU/GPGPU cores lend themselves to ever more transistors for ever more parallel work, but it is going out of control. I only hope game developers (Crytek!) don't target the very high end, as the very high end is now becoming silly, with said PCs consuming around a kilowatt of power when fully loaded.

By inighthawki on 3/9/2011 12:12:42 AM , Rating: 2
I only hope game developers (Crytek!) don't target the very high end

I take it you have not seen the new Unreal 3 engine from GDC then. Have a look:
It is rendered in real-time and looks quite beautiful imo, and requires 3x580s to render (though it apparently hasn't been optimized)

By animekenji on 3/8/2011 1:53:51 PM , Rating: 4
The title of the article was AMD retakes single card crown. In order to retake something, one first has to lose it. The 5970 has been the fastest single card since it was introduced. You should go back to school and take a few classes in reading comprehension because you are sorely lacking in that area.

By supermitsuba on 3/8/2011 11:20:09 AM , Rating: 3
Cheeta blood

By silverblue on 3/8/2011 11:02:22 AM , Rating: 2
Very true. nVidia holds the title of fastest single-GPU card.

As for people saying that two 580s in SLI would wallop this card: maybe so, but two 580s in SLI would use an ungodly amount of power as well as cost significantly more. That said, if you're in the market for such a card and money really isn't an issue, then until that VLIW4 setup gets a chance to shine, the 590 -- IF it's two 580s, which I doubt -- would be a better performer.

Until the 590 appears I'm going to consider two 580s on a single board to be wishful thinking; I'm expecting two 570s at the most. How many power plugs is this thing going to need if they really go the whole hog?

Why just increase width only?
By XZerg on 3/8/2011 11:03:45 AM , Rating: 3
Why are these guys not increasing the height instead of just the width? These types of cards demand a fatter case anyway, to have more air in the system, so why not make use of that space instead of limiting the card to only very wide cases? The only two reasons I can think of are 1) a modular design similar to the regular cards, but that seems like little to worry about, as the dimensions are already different from the rest of the cards anyhow, requiring different parts regardless; and 2) keeping more space between the two GPUs, but that could still be done by putting the capacitors and memory toward the top of the chips and squeezing out that extra space.

So why go with a longer version rather than a taller one?

By Integral9 on 3/9/2011 7:56:09 AM , Rating: 2
The PCI form factor specifies maximum height, length, and width. This is probably more related to the ATX standard than to PCI, as whatever the card is, and no matter where the pins are positioned, it still has to fit inside the case. This could change when (if ever) a new case standard is developed and agreed to, but until then we are basically stuck with our 20-year-old standard.

70 dB!!
By PaterPelligrino on 3/8/2011 11:58:14 AM , Rating: 3
Doesn't matter how fast the card is, at 70db there's no way I'd have this in my system.

RE: 70 dB!!
By QuantumPion on 3/10/2011 12:51:10 PM , Rating: 2
Agreed, but then again if you can afford this $700 monster then you should be able to afford a few hundred more for water cooling :p

88C is not cool
By Flunk on 3/8/2011 11:13:07 AM , Rating: 2
I think you may have missed something in the temperature translation, but 88C is very hot for a graphics card; you could easily cook an egg on that.

RE: 88C is not cool
By tamalero on 3/8/2011 11:29:11 AM , Rating: 2
I'm confused by your answer and the claim that "you could cook an egg". The last generation of cards (GTX 480, 5870, 5970) had similar temperatures, so why is everyone suddenly claiming 88C is "TOO MUCH"?
Remember, the GTX 480 easily went up into the 100s °C, and the 5970 in a badly ventilated case was in the 90s °C.

Anyway, I get the vibe this review was kinda... slanted toward NVIDIA's offering. It only marks and remarks on the drawbacks, but none of the "pros" apart from "fastest single card".

Anyway, let's remember the GTX 580 is supposed to be "top end" just like the GTX 480 was... it all depends on price in the end.

Also, I think the most interesting aspect is... how both sides will finally say "screw this" to the supposed 300W PCI-E power draw limit, as a single GTX 480 or GTX 580 and the dual cards 5970 and 6990 were clawing at the limit or had already surpassed it. I can't imagine how the dual GTX 580 will be...
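(For reference, the connector math behind that 300W figure: the PCI-E specification rates the x16 slot for 75 W, a 6-pin plug for 75 W, and an 8-pin plug for 150 W. The 6990 carries two 8-pin plugs, so on paper it can pull

$$75\ \text{W (slot)} + 2 \times 150\ \text{W (8-pin)} = 375\ \text{W}$$

already past the official 300 W ceiling before any overclocking enters the picture.)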

Edit Please?
By Sazabi19 on 3/9/2011 9:10:44 AM , Rating: 2
One is if you have to have 5 monitors. Currently only the 5870 Eyefinity 6 and this card support driving 5 monitors from a single or dual card solution. So if you have to have 5 monitors and you're willing to pay for top power, this is the card for you.

Please take this out; there is a 6970 by ASUS (on the market for about a week now) that is capable of using 6 monitors (Eyefinity 6?). Not your fault, it's the other idiots.

RE: Edit Please?
By Sazabi19 on 3/9/2011 9:20:34 AM , Rating: 2
Not sure how I quoted myself, but the 2nd item is me asking you to cut that paragraph out, please, Mick :)

By FITCamaro on 3/8/2011 10:41:34 AM , Rating: 2
Anyone wanna buy a 5970 for cheap with aftermarket cooler? :)

By Beavermatic on 3/8/11, Rating: -1
RE: crap...
By Da W on 3/8/2011 9:45:17 AM , Rating: 2
Nvidia is doing dual GPU now? Kind of proof that AMD's strategy is not worthless.

RE: crap...
By Beavermatic on 3/8/11, Rating: -1
RE: crap...
By Jakall78 on 3/8/2011 10:25:02 AM , Rating: 5
Matter of fact... I think Nvidia kicked off the 2-gpus on a single card first.

Incorrect there. ATI was the first to integrate 2 GPUs (in 1999 they were just called graphics chips or graphics processors) on 1 board, with the Rage Fury Maxx. 3dfx was first with Voodoo2 SLI in 1998, but on 2 separate cards. NVIDIA's SLI came later, in 2004, with the GeForce 6 series.
Anyway, this card has many advantages compared to a CrossFire setup; only the noise is just too much. Tough sell, I think.

RE: crap...
By Beavermatic on 3/8/2011 10:30:47 AM , Rating: 2
I stand corrected. I forgot about the Rage Fury Maxx; you're correct, sir, in that respect.

But I keep thinking there was someone who did it even earlier than ATI.... hmmmm, PowerVR or Trident or S3

Why do I keep thinking one of those guys tried it first? I could be wrong.

RE: crap...
By Beavermatic on 3/8/2011 10:35:16 AM , Rating: 2
Or maybe it was a Matrox card I'm thinking of... damn, I can't remember.

RE: crap...
By animekenji on 3/8/2011 10:43:16 AM , Rating: 3
Sorry but you're wrong, too. Quantum3D made single slot SLI cards based on the Voodoo2 chipset as far back as 1997. I own one.

RE: crap...
By MrBlastman on 3/8/2011 11:16:02 AM , Rating: 3
You had quite a bit of coin to spend back then... I recall seeing it at CompUSA at the time for about 700 bucks. :)

It was a beast. I bet it would have run QuakeWorld Team Fortress in OpenGL (with the vis-patched water) @ 640x480 or 1024x768 at a dreamy framerate.

RE: crap...
By animekenji on 3/8/2011 10:46:51 AM , Rating: 3
And if you don't believe me, look here

Single board with two Voodoo2 chipsets. This is not one of the dual board models mounted in a single slot.

RE: crap...
By FITCamaro on 3/8/2011 10:51:52 AM , Rating: 3
Well, it has over 400 million more transistors per GPU; I hope it's faster.

AMD is giving 95+% of the performance with 85% of the size.

And a dual-GPU GTX 590 will suffer from the same power and heat limitations as the 6990 -- meaning lower core and memory clocks -- unless they release a water-cooled-only card.

RE: crap...
By Da W on 3/8/2011 11:37:35 AM , Rating: 2
The 6990 draws as much power as two 560s in SLI. Two 580s in SLI draw about 200 more watts. They won't be able to pack two 580s at full speed onto a single card, period! Maybe two 570s could match the 6990, MAYBE.

RE: crap...
By nafhan on 3/8/2011 10:43:42 AM , Rating: 2
It'll be interesting to see how many monitors the GTX 590 can drive since that's one of the only reasons to get such a high performing video card.

RE: crap...
By animekenji on 3/8/2011 10:58:37 AM , Rating: 2
Actually, a single 30-inch LCD at 2560 x 1600 requires quite a bit of power to drive games with everything turned up to ultra-high settings and have smooth, playable framerates. You need multiple GPUs for that.

excellent editing
By Owls on 3/8/11, Rating: -1
RE: excellent editing
By Marlonsm on 3/8/2011 9:55:19 AM , Rating: 1
Having both GPUs on one board isn't the same as having them on separate boards.
But as the article said, I still don't see much point in buying it over a pair of cheaper cards. The biggest reason is if you want to pair two HD 6990s so you'll have 4 GPUs.

"Intel is investing heavily (think gazillions of dollars and bazillions of engineering man hours) in resources to create an Intel host controllers spec in order to speed time to market of the USB 3.0 technology." -- Intel blogger Nick Knupffer
