



The Radeon HD 6990 (top, both pictures) atop the Radeon HD 5970 (bottom, both pictures), AMD's previous dual-GPU, single-card solution.   (Source: Anandtech)

The Radeon 6990 is absolutely the king of single-card performance, but its price makes it essentially unsellable.  (Source: Anandtech)
AMD's new card sounds like a jumbo jet and will break the bank, but it is fast

Like the dual-GPU Radeon HD 5970 before it, the Radeon HD 6990 [press release] may be a victory lap of sorts for AMD.  Unlike with the Radeon HD 5000 series, this generation the chipmaker faced tougher competition, with rival NVIDIA delivering an impressive set of GPUs -- the GeForce 500 Series -- in (relatively) timely fashion.  But sales reports indicate that AMD has maintained its lead over NVIDIA despite that recovery.

What better way to celebrate than to make a decadent and superfluous, but utterly powerful single card offering?

I.  More Beast Than Beauty

While the Radeon HD 6990 is a one-PCB card, it essentially acts like two.  It has two distinct vapor chambers, each with its own heatsink.  The GPU chips are linked by an internal CrossFire connection.  Together they carry 4 GB of GDDR5 RAM, with each GPU getting 2 GB; because CrossFire mirrors most data across both GPUs, the effective frame buffer is 2 GB -- the same as AMD's single-GPU offerings.

Performance-wise the card acts much like two cards as well.

In testing by Anandtech, the card sucked down over 490 watts of power during intense gaming benchmarks.  The good news is that its heat dissipation was impressive, allowing the card to stay at a relatively cool 88 degrees Celsius -- one of the best things about the card.

The card can be aggressively overclocked on air to at least 830 MHz, but it requires power aplenty -- almost 550 watts.
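To put those load figures in perspective, here is a minimal power-supply-sizing sketch.  The CPU/platform wattage and the 80-percent-load headroom rule are illustrative assumptions, not figures from the review, and the article's load numbers are treated here as card draw even though they may well be total-system measurements.

```python
# Rough PSU-sizing sketch; CPU/platform draw and headroom are assumptions.
CARD_LOAD_W = 490         # load figure cited in the article
CARD_OC_W = 550           # overclocked figure cited in the article
CPU_AND_PLATFORM_W = 200  # assumed CPU, drives, fans, motherboard
HEADROOM = 0.80           # assume the PSU should be loaded to only ~80%

def recommended_psu(card_watts: float) -> float:
    """Return a PSU rating that keeps the whole system at or below ~80% load."""
    return (card_watts + CPU_AND_PLATFORM_W) / HEADROOM

for label, watts in (("stock", CARD_LOAD_W), ("overclocked", CARD_OC_W)):
    print(f"{label}: ~{recommended_psu(watts):.0f} W PSU recommended")
# stock: ~863 W, overclocked: ~938 W -- firmly high-end PSU territory either way.
```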

Something not so impressive is the roaring fan speed required to keep the card running that cool.  NVIDIA -- long the butt of many a joke for its GeForce 400 Series' fan noise -- can breathe a sigh of relief.  If a GeForce 400 was a noisy lawn mower, the Radeon HD 6990 might as well be a jumbo jet.  It generates 70.2 decibels under load -- almost 4 dB more than a pair of SLIed GeForce GTX 580s.  When overclocked, the card reached 77 (!) decibels.  At those levels, you might want to invest in a nice pair of gamer ear-plugs.
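Because the decibel scale is logarithmic, those few-dB gaps are bigger than they look.  The short sketch below shows the standard conversions; the "10 dB is roughly twice as loud" rule is a common psychoacoustic approximation, not a figure from the review.

```python
# Convert the article's decibel gaps into intensity and rough loudness ratios.
def intensity_ratio(delta_db: float) -> float:
    """Sound-intensity ratio implied by a difference in decibels."""
    return 10 ** (delta_db / 10)

def loudness_ratio(delta_db: float) -> float:
    """Rough perceived-loudness ratio (~2x per 10 dB rule of thumb)."""
    return 2 ** (delta_db / 10)

# Article figures: ~4 dB louder than GTX 580 SLI; 70.2 dB stock vs. 77 dB overclocked.
for label, delta in (("vs. GTX 580 SLI (+4 dB)", 4.0),
                     ("overclocked vs. stock (+6.8 dB)", 77.0 - 70.2)):
    print(f"{label}: {intensity_ratio(delta):.1f}x the sound intensity, "
          f"roughly {loudness_ratio(delta):.1f}x as loud to the ear")
```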

So how's the performance?  Well, the good news is that this card easily beats any single card on the market right now from NVIDIA or AMD.  Sure, that's because it's essentially two cards in one, but if the criterion is "fastest single card", the Radeon HD 6990 is king.

Against two cards it falls flat, though.  It's consistently about 8 percent behind a pair of CrossFired AMD Radeon HD 6970 cards.  And in some titles, like Crysis: Warhead, it even falls behind a pair of GeForce GTX 580s in SLI.

II. "Huh, yeah, what is it good for?" -- Edwin Starr

When it comes to what kind of utility this exercise in extreme single-card power may have, the answer isn't quite "absolutely nothing", but it's pretty close.

Priced at $700 USD, the card is outperformed by a pair of HD 6970s -- approximately $640 USD -- and roughly equaled by a pair of HD 6950s -- approximately $520 USD.
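A quick value comparison using the article's own prices and performance characterizations makes the problem concrete.  The normalized performance figures below are illustrative stand-ins for the "about 8 percent behind" and "roughly equaled" descriptions, not benchmark results.

```python
# Price-per-performance sketch built only from the article's figures.
# Performance is normalized so that CrossFired HD 6970s = 1.00.
options = {
    "Radeon HD 6990":         {"price": 700, "relative_perf": 0.92},
    "2x Radeon HD 6970 (CF)": {"price": 640, "relative_perf": 1.00},
    "2x Radeon HD 6950 (CF)": {"price": 520, "relative_perf": 0.92},
}

for name, o in options.items():
    dollars_per_perf = o["price"] / o["relative_perf"]
    print(f"{name}: ${o['price']} for {o['relative_perf']:.2f}x performance "
          f"-> ${dollars_per_perf:.0f} per unit of performance")
# The single 6990 costs the most per unit of performance of the three options.
```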

So why in the world would you buy a noisier, less powerful card that's over $60 more expensive?  Well, there are a couple of reasons why you might -- but they're uncommon.

One is if you have to have 5 monitors.  Currently only the Radeon HD 5870 Eyefinity 6 and this card support driving 5 monitors from a single card.  So if you need 5 monitors and you're willing to pay for top power, this is the card for you.

Secondly, if your board has only two full PCI-E x16 slots -- spaced at least three slots apart -- and is CrossFire-ready, a pair of these is your ultimate quad-GPU solution, heat, noise, and power consumption be damned.  Of course, that's because you're really squeezing four GPUs into two slots, but that's a minor technicality.

The final candidate would be a very small subset of single-slot boards.  Generally, if you have only a single slot, your board and case are too small to sufficiently cool the HD 6990.  But it is possible; there are a couple of small systems out there that could muster sufficient cooling for this.  Arguably, even if such a configuration doesn't exist already, someone could take a micro-ATX case, put a single HD 6990 in it, and pair it with a couple of ridiculously huge fans covering the walls (maybe 200 mm?).

That would yield a compact, powerful, yet ridiculously expensive system, but the upside is that it would offer the HD 6990 in a package that might appeal to a select few.

Unless AMD somehow cooks up octa-GPU CrossFire drivers, this card, while a marvel of engineering, is a novelty/niche product, sure to sell few, if any, units.

Much like the protagonists of F. Scott Fitzgerald's "The Beautiful and Damned", the card cuts an attractive figure but hides ugliness (power draw, noise, price-to-performance) underneath.  The upside is that it's destined for fame, or perhaps infamy; the downside is that its sales are doomed by its decadence.


Comments



...cool?
By Motoman on 3/8/2011 10:03:24 AM , Rating: 4
Sorry...but 88 degrees celsius is far from cool...under any consideration that might apply here, human or computer. That is wickedly hot, even for a GPU. It categorically does not demonstrate that the cooling system is adequate...it demonstrates that the cooling system is dangerously weak and unable to keep the GPUs within optimal operating temps. For stability and longevity, I'd like to see GPUs stay under 70 degrees for sure...preferably under 60 degrees. I like to keep my CPUs down around 40 if at all possible...




RE: ...cool?
By FITCamaro on 3/8/2011 10:42:20 AM , Rating: 1
My old X1950XTX got up to 105 before. But I do agree with you.


RE: ...cool?
By jonmcc33 on 3/8/2011 7:45:57 PM , Rating: 2
Yeah. My Radeon X1900XT got to 95C in a bad air flow case. That was extremely hot. It would warm my room up quite a bit during gaming.


RE: ...cool?
By ClownPuncher on 3/8/2011 12:54:31 PM , Rating: 2
Those are some pretty arbitrary temps you are asking for. If the card is designed for it, why does it really matter? Are you going to stuff a $700 card in a $49.99 case? This thing is cooler than single GTX480's while being more than twice as powerful. The only thing that would concern me would be fan noise.


RE: ...cool?
By Motoman on 3/8/2011 6:35:08 PM , Rating: 1
Arbitrary? Hardly. And the point that I'm making is that the card isn't so much "designed for it" as it is a result of its design.

...also, the cost of a case is utterly irrelevant. Either a case has adequate airflow or it doesn't - I have seen plenty of $20 cases that have superior airflow to $200 cases.

88 degrees is way too hot for comfort. Even if it remains stable, you're literally burning the life out of that GPU. The cooling system is not adequate...period.


RE: ...cool?
By PrinceGaz on 3/8/2011 9:18:55 PM , Rating: 2
You're not burning the life out of anything if it was designed to withstand that temperature for its expected useful lifespan.

How long are these high-end graphics-cards meant to last? I'd say two years will be fine for most users, or three years at most before they're reading reviews of cards from both manufacturers which are much faster than what they've got. A seriously high-end graphics-card is like a bright star -- it shines very brightly for a while and then explodes at the end of its life, whilst more conservative cards will last many, many years.

Bear in mind the 90+C was in FurMark with "AUSUM" (possibly the worst acronym in history) enabled, and I think under normal overclocked conditions, it will be fine until the day it's a paperweight.


RE: ...cool?
By Motoman on 3/8/2011 11:48:39 PM , Rating: 2
...why burn it out faster than it needs to be? If it lasts 2 years running at 90 degrees, but would last 4 years at 70 degrees...does it really make sense to run it that freaking hot?

Video cards have longer useful lives than you think. If you're playing on a 19" monitor, a Radeon 1900XT probably would still play WoW at high settings just fine - despite the fact that it's, what, 6 years old?

Just burning something up because you declare you don't care about it makes no sense. It's not that hard to put an adequate cooling system on a video card - if it was, there'd be no aftermarket offering vastly better-than-stock coolers.


RE: ...cool?
By Lerianis on 3/11/2011 5:57:35 AM , Rating: 2
You are missing that these cards are for uber-gamers. Not the ones who are running World of Warcraft (which is comparatively light on GPU resources) but Crysis on Enthusiast settings at the highest resolution.

For those people? 2-3 years IS the usable life of a card. I'd say for some that even one year is the usable life of a card before they are looking for something better.


RE: ...cool?
By Sazabi19 on 3/9/2011 9:19:21 AM , Rating: 2
These cards are rated for over 100 Celsius (not sure of the exact temp rating but I am positive it is over 100 Celsius)


RE: ...cool?
By Motoman on 3/9/2011 10:01:35 AM , Rating: 2
It's a simple equation - more heat = less life.

The point is simple - a better cooling solution would prolong the life of the card.

What about that point is upsetting you?


RE: ...cool?
By Jcfili on 3/9/2011 4:14:18 PM , Rating: 2
Agree !!!

REAL GAMERS and People that love and care what they buy will go Liquid cooling !!! ^_^..

Problem solved .. no more loud fans ... and no more HEAT!!!


RE: ...cool?
By ClownPuncher on 3/9/2011 11:40:55 AM , Rating: 2
It's arbitrary because you have no numbers backing up your claims that this will degrade the GPU faster. Then we have the fact that those temps were full-load FurMark, which doesn't represent day-to-day gaming in the slightest. There is the other thing you didn't consider: multi-GPU cards are always this hot, yet it doesn't seem like their failure rate is any higher.


RE: ...cool?
By Sazabi19 on 3/9/2011 9:17:54 AM , Rating: 2
My Radeon 4870X2 got up to 96 Celsius a few days ago while testing the new Crysis 2 demo (1600x1200 on max settings); the only thing that happened was the fan revved up to 100% (usually set manually to 60%) and I noticed. This card is geared toward people like me who want a single card with 2 GPUs on it. Noise is not as big a factor to me as staying cooler (I have 8x 120 mm fans in my case, all Yate Loons; they were cheap but noisy as hell, but man they move some air through my computer) and I have a 5.1 setup on my rig. Apparently either I am not the only one that wants one of these, or they have a supply issue; Newegg just got these in and every brand is out of stock. I have been eyeing the egg like a hawk and didn't notice them come in, and they are already gone. Can't wait to get one! This one card will generate less heat and consume less power than 2 individual cards, and by the way they are priced on the egg, getting 2 6970s or 1 6990 is the same price, so go for whichever you want more.


RE: ...cool?
By Motoman on 3/9/2011 10:02:30 AM , Rating: 2
quote:
My Radeon 4870X2 got up to 96 Celsius a few days ago


...and you think that's a good thing? If so, I have a really nice bridge you might be interested in...


RE: ...cool?
By Sazabi19 on 3/9/2011 10:17:03 AM , Rating: 2
Lol, meaning no damage has been done; it's not the 1st time either. Sure it warms up the room really well and I don't much care for it, but the card is RATED for that and higher. There is no impact on the card except possibly some performance hits (none that I see when gaming and watching FPS). The point you are trying to make is moot. This is posing no danger to the card, which I got when it came out; I have had no problems with it whatsoever.


RE: ...cool?
By boogle on 3/9/2011 10:13:45 AM , Rating: 2
I didn't know you were on the design team at AMD?

There are components designed to operate at well over 100c for many years, similarly there are components designed to operate at well below 0c for many years. The IC itself will have no problem with extremely high temperatures (above 100c easy peasy) - it's the packaging that matters. Suffice it to say, they won't be using the solder you normally buy off the shelf for your LEDs.

I remember when CPUs weren't meant to operate above 40C, any higher and they would 'burn out'. I take it your arbitrary 70C comes from the max design temps of cards like the GeForce 2. Very few GPUs have been designed around a max of 70C for a long, long time. Just look at a laptop to see how components are designed to run at 100C for extended periods of time.


RE: ...cool?
By Motoman on 3/10/2011 10:12:56 AM , Rating: 1
Wow, all of you people are just amazing.

For the record, no, I'm not a semiconductor engineer. However - I will bet you any amount of money that any qualified engineer you find will confirm the following:

More heat = less life.

Heat is the nemesis of, well, lots of stuff...but in this case, especially electronics, and especially semiconductors. Whether you're talking about a CPU, a GPU, a network controller, a north bridge, a south bridge, whatever...it is a perfectly valid point that operating that device at higher temps will generally increase its rate of degradation. Running the device at lower temps will, naturally, comparatively decrease its rate of degradation.

It's basic physics - ergo, unavoidable.

If you want to declare that you're fine with something burning out in a couple years because that's what it was "designed" to do...I guess that's your problem. Personally, I'd rather see a better cooling solution that keeps the device running for several years...if not indefinitely.
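A common reliability rule of thumb, loosely derived from the Arrhenius relation, is that sustained operating life roughly halves for every additional 10 degrees Celsius. The short sketch below illustrates that rule of thumb with assumed temperatures; it is not a measurement of this or any particular card.

```python
# Arrhenius-style rule of thumb: operating life roughly halves per +10 C.
# The reference temperature and sample temps are illustrative assumptions.
def relative_life(temp_c: float, reference_c: float = 70.0) -> float:
    """Lifetime multiplier relative to running at the reference temperature."""
    return 2 ** ((reference_c - temp_c) / 10)

for temp in (60, 70, 88, 96):
    print(f"{temp} C: {relative_life(temp):.2f}x the life expected at 70 C")
# 60 C: 2.00x, 70 C: 1.00x, 88 C: 0.29x, 96 C: 0.16x (rule of thumb only)
```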


RE: ...cool?
By Lerianis on 3/11/2011 6:01:16 AM , Rating: 2
Guess again. The graphics card in my Gateway gaming laptop starts freezing when it gets above 70C, and it's only 3 years old.

A 9800GTS, to be specific, in my machine. You are right that these things are DESIGNED to run at 100C, but that is not a common thing; it's new.


"My sex life is pretty good" -- Steve Jobs' random musings during the 2010 D8 conference














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki