
Opened Radeon HD 6970 with Vapor Chamber Cooling (click to enlarge)

AMD's PowerTune technology (click to enlarge)

Architectural overview
Targets NVIDIA's GeForce GTX 570

AMD has launched its latest Radeon HD 6970 and Radeon HD 6950 video cards, each with 2GB of GDDR5 on a 256-bit wide memory bus. The cards are built around new 40nm Cayman GPUs produced by the Taiwan Semiconductor Manufacturing Company (TSMC). Retailers are expected to sell the Radeon HD 6970 for $369 and the Radeon HD 6950 for $299 starting today.

The Radeon 6900 series was originally designed for the 32nm node, but problems at TSMC forced AMD to cancel those plans and quickly backport the design to 40nm design rules. Cayman chips consist of 2.64 billion transistors and measure 389 mm². Part of the reason for the larger size is the new eighth-generation tessellator, which improves DirectX 11 performance.

New Morphological Anti-Aliasing (MAA) and Enhanced Quality Anti-Aliasing (EQAA) modes have also been added. MAA is a post-process filtering technique that is accelerated with DirectCompute. It delivers full-scene anti-aliasing and is not limited to polygon edges or alpha-tested surfaces.
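The idea behind a post-process anti-aliasing filter can be illustrated with a toy sketch (the function name, threshold, and blend weights below are invented for illustration — a real MLAA-style filter classifies edge shapes and computes per-pixel coverage weights; this only shows the "detect a discontinuity, blend across it" principle):

```python
import numpy as np

def morphological_aa(img, threshold=0.1):
    """Toy post-process AA pass on a single-channel image: find
    luminance discontinuities and blend the offending pixel with
    its 4-neighbourhood. Hypothetical sketch, not AMD's filter."""
    out = img.astype(float).copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # discontinuity against the left or upper neighbour?
            if (abs(img[y, x] - img[y, x - 1]) > threshold or
                    abs(img[y, x] - img[y - 1, x]) > threshold):
                # simple 4-neighbour blend across the edge
                out[y, x] = 0.5 * img[y, x] + 0.125 * (
                    img[y - 1, x] + img[y + 1, x] +
                    img[y, x - 1] + img[y, x + 1])
    return out
```

On a hard step edge this softens only the pixels along the discontinuity and leaves flat regions untouched, which is what makes it cheap enough to run as a DirectCompute post-process.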

The big difference in the 6900 series is that the cards can be limited by Thermal Design Power using AMD's PowerTune technology, rather than by clock speed as in previous generations. The engine and memory clocks are dynamically adjusted to fit within the TDP limit of each card. The Radeon HD 6970 can be set to a maximum TDP of 250W, while its less powerful sibling is limited to 200W. Idle power for both is relatively low at 20W. The Radeon HD 6950 requires two 6-pin power plugs, while the Radeon HD 6970 requires an 8-pin plug and a 6-pin plug. A BIOS switch has also been added to reset to the default BIOS in case of problems with tweaking.
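The mechanism can be pictured as a simple feedback loop (a hypothetical sketch — AMD's real controller works on fine-grained activity estimates in hardware; the function, step size, and linear power model here are all assumptions for illustration):

```python
def powertune_clock(power_draw_w, tdp_limit_w, base_mhz=880,
                    step_mhz=10, min_mhz=500):
    """Toy PowerTune-style governor: step the engine clock down
    until the estimated power draw fits under the board's TDP
    limit. Assumes power scales roughly linearly with clock."""
    clock = base_mhz
    power = power_draw_w
    while power > tdp_limit_w and clock > min_mhz:
        clock -= step_mhz
        power = power_draw_w * clock / base_mhz
    return clock

# A workload that would draw 280W gets throttled under a 250W cap;
# a 200W workload keeps the full 880MHz engine clock.
print(powertune_clock(280, 250), powertune_clock(200, 250))
```

The key difference from older clock-speed throttling is that light workloads never get slowed down at all — the limit only engages when estimated power exceeds the cap.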

Vapor chamber cooling has been relatively rare, but AMD is using it for the reference card design that most add-in-board partners will adopt. Vapor chamber coolers are more efficient than traditional heatpipes and easier to design, as there are no heatpipe routing concerns. Chief competitor NVIDIA's high-end GeForce GTX 580 also uses vapor chamber cooling.

Both cards have two DVI ports, an HDMI 1.4a port, and two Mini DisplayPort outputs. Four monitors can be driven natively, and up to six if DisplayPort hubs are used. Support for DisplayPort 1.2 has been added, providing a raw bandwidth of 21.6 Gbps per connector. DisplayPort hubs can tap into this bandwidth using a technology called Multi-Stream Transport (MST) and split it across multiple monitors.
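A back-of-the-envelope check shows why one connector can feed several screens through an MST hub (the 8b/10b line-coding overhead is part of the DisplayPort standard; blanking intervals are ignored here for simplicity, so treat the result as a rough upper bound):

```python
raw_gbps = 21.6                          # DP 1.2 raw link rate (4 lanes x 5.4 Gbps)
effective_gbps = raw_gbps * 0.8          # 8b/10b coding leaves 80% for pixel data
bits_per_frame = 1920 * 1080 * 24        # one 1080p frame at 24-bit colour
stream_gbps = bits_per_frame * 60 / 1e9  # ~2.99 Gbit/s per 1080p60 stream
streams = int(effective_gbps // stream_gbps)
print(streams)                           # roughly 5 streams, ignoring blanking
```

With real blanking overhead the practical figure is a little lower, which is consistent with the four-to-six-monitor limits quoted above.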

The new cards are meant to replace the Radeon HD 5800 series, one of the most successful video card lines in the company's history. It was the first to support DirectX 11, but it has now been deemed too costly to produce and is being phased out. The recently launched Radeon HD 6800 series offers similar performance to the 5800 series at a much lower price. The Radeon HD 6970 will generally provide 20% better performance than a Radeon HD 5870 for only $20 more.
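The "20% more performance for $20 more" claim works out to a modest value win (the $349 figure below is the HD 5870 price implied by the article's $369 MSRP and "$20 more" statement, not a quoted price):

```python
old_price, new_price = 349.0, 369.0      # implied HD 5870 price vs. HD 6970 MSRP
perf_gain = 0.20                         # quoted performance advantage
price_gain = new_price / old_price - 1   # fractional price increase
perf_per_dollar_gain = (1 + perf_gain) / (1 + price_gain) - 1
print(round(price_gain * 100, 1),        # ~5.7% more money
      round(perf_per_dollar_gain * 100, 1))  # ~13.5% more perf per dollar
```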

The Radeon HD 6990, codenamed Antilles, will launch next month. A dual-GPU version of the Radeon HD 6970, it is expected to carry 4GB of GDDR5 memory.



Specification comparison (the scraped table lost most of its cells; missing values below are restored from AMD's published launch specifications and should be treated as editorial reconstruction):

                       HD 6970                 HD 6950                 HD 6870                 HD 5870
Stream Processors      1536                    1408                    1120                    1600
Texture Units          96                      88                      56                      80
Core Clock             880MHz                  800MHz                  900MHz                  850MHz
Memory Clock           1.375GHz (5.5GHz eff.)  1.25GHz (5.0GHz eff.)   1.05GHz (4.2GHz eff.)   1.2GHz (4.8GHz eff.)
Memory Type            GDDR5                   GDDR5                   GDDR5                   GDDR5
Memory Bus Width       256-bit                 256-bit                 256-bit                 256-bit
Frame Buffer           2GB                     2GB                     1GB                     1GB
Transistor Count       2.64 billion            2.64 billion            1.7 billion             2.15 billion
Die Size               389mm²                  389mm²                  255mm²                  334mm²
Power Connectors       1x 8-pin + 1x 6-pin     2x 6-pin                2x 6-pin                2x 6-pin
Manufacturing Process  TSMC 40nm               TSMC 40nm               TSMC 40nm               TSMC 40nm
Price Point            $369                    $299                    $239                    $379 (launch)

Comments

My Take
By Mitch101 on 12/15/2010 9:41:35 AM , Rating: 5
Nvidia might have won this round in performance, just edging out the AMD card, but you only need one AMD video card to do triple-screen gaming, making this a no-brainer for me because of my budget.

For me, once you do triple-screen gaming there is no going back. I believe triple-screen gaming is PC gaming's major edge over consoles, and with LCD panels coming down in price it's not that horrible an upgrade. LCD panels usually last through multiple PCs; video cards come and go. Needing one card instead of two is even better.

RE: My Take
By PAPutzback on 12/15/10, Rating: -1
RE: My Take
By Mitch101 on 12/15/2010 10:14:34 AM , Rating: 5
If you bought an AMD video card (5750 or higher), you only need to buy two additional monitors, because you should already have the first one. So 2 x $100.00 - I got mine using a Staples $25-off-$100 coupon. Plus a $25.00 DP-to-VGA adapter.

$225.00 to upgrade to triple screens using AMD

1- Requires two video cards for triple screens, so add at least $150.00 for the second video card.
2- Requires a more costly motherboard that has SLI - $30.00
3- Because you're running two video cards you may need a higher-wattage power supply; I'll go cheap here - $20.00
4- Now add the two additional monitors needed for the side screens, using the Staples $100.00 monitors

$400.00 to upgrade to triple screens using NVIDIA

Sure, the NVIDIA dual-card setup should have more performance, but I'm currently using a Radeon 5770 and it's handling my games in triple-screen mode with no problems. For Crysis I bump down the screen resolution, but L4D, L4D2, WoW, Dragon Age: Origins, etc. are no problem.

When I need to upgrade I just replace the single video card. No need to buy two video cards when upgrading.
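The tallies above can be checked in a couple of lines (the prices are the poster's own 2010 estimates, not current figures):

```python
# Upgrade-cost tally from the post above: both paths need two extra
# monitors; the NVIDIA path adds a card, board premium, and PSU.
amd = {"two extra monitors": 200, "DP-to-VGA adapter": 25}
nvidia = {"two extra monitors": 200, "second video card": 150,
          "SLI motherboard premium": 30, "higher-wattage PSU": 20}
print(sum(amd.values()), sum(nvidia.values()))  # 225 400
```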

RE: My Take
By kleinma on 12/15/2010 3:36:27 PM , Rating: 2
I was under the impression that on an NVIDIA system, if you SLI two cards, you can only use one card for output anyway, meaning if you turn SLI on, you can't run three monitors. Is that the case? If so, and you can run three monitors on one ATI card, then I would think ATI has a leg up here, because you should still be able to CrossFire and run three monitors, no?

RE: My Take
By Mitch101 on 12/15/2010 4:03:34 PM , Rating: 2
Don't honestly know all the details for NVIDIA. What I do know is that you need two NVIDIA cards to do triple-screen gaming.

I can confirm the ATI/AMD video cards can drive triple monitors aka eyefinity with one card. ATI/AMD said they designed this from the beginning. Real easy to switch between ultra wide gaming mode and triple screen mode too with the profiles option.

RE: My Take
By Etern205 on 12/15/2010 5:40:55 PM , Rating: 2
Both ATi and Nvidia have the ability to run triple screens; it's just that ATi only needs one card to achieve it while Nvidia requires two.

ATi calls its multi-screen technology Eyefinity; the minimum is 3 monitors, but you can scale up to 6 or more depending on the number of graphics cards you install.
There is a YouTube clip that demos 4x HD 5870 E6 cards doing Eyefinity across 24 monitors.

As for Nvidia it's called 3D Vision (Surround).
This requires the user to have 3 monitors, and each has to support 120Hz in order for 3D Vision to work. Then you will also need the 3D Vision kit, which comes with the 3D glasses and a receiver unit. Not all Nvidia cards will run 3D Vision. And you can't expect a weak Nvidia card to have the same performance as a high-end card.

RE: My Take
By wallijonn on 12/16/2010 2:50:30 PM , Rating: 2
Plus a $25.00 DP to VGA adapter.

Seeing as you are paying a premium for the fastest video card in the world, why not also use DVI connections instead of the inferior VGA? Otherwise you're not getting the best possible picture out of your rig, no?

RE: My Take
By Lugaidster on 12/18/2010 1:42:35 AM , Rating: 2
For the resolution his panels probably have (not more than 1080p), there is no difference in quality between VGA and DVI. No need for a more expensive adapter.

RE: My Take
By bug77 on 12/15/2010 10:20:47 AM , Rating: 2
I get acceptable performance with a heavily overclocked GTX 460 at 1920x1200 in games like S.T.A.L.K.E.R.: Call of Pripyat and Risen, and even Fallout 3 occasionally stutters. Which single card are you using to play on 3 monitors at once? And which games do you play?

RE: My Take
By Mitch101 on 12/15/2010 10:29:36 AM , Rating: 2
From Memory.

L4D, L4D2, Dragon Age: Origins, the first Stalker (have the second one, still in plastic), and some Battlefield versions. Don't have Black Ops yet; hoping for a Steam sale.

In a game like Crysis I drop the resolution down. But I prefer to play at a lower resolution than to play in single-screen mode.

RE: My Take
By B3an on 12/16/2010 6:32:52 PM , Rating: 2
Don't know why people want to play with 3 monitors. It's a gimmick, almost as bad as 3D TV. You have massive bezels in the way, and games are simply not designed to be played with that much horizontal space - everything is stretched on the two end monitors, and the perspective is totally messed up. If you don't know what I mean, just Google for some images.

It's a much better idea to just get a single 30" IPS monitor. You'll still have a very high resolution but no bezels distracting you, vastly superior image quality, better viewing angles, and a wider colour gamut. I'd take image quality and no bezels over sticking some cheap monitors together any day.

RE: My Take
By Mitch101 on 12/17/2010 1:15:12 PM , Rating: 2
True but you have to try it.

You focus on the center screen, and your eyes use the side screens for movement/reference. As you're reading this post you're not focused on the information at the sides of your monitor; you can tell what's there, but you're not directly focused on it and it's not in focus. You might quickly scan to an item and turn your point of view toward it, but overall you focus directly on the center. Plain and simple, it works, and it's getting better. I hope to use 5 monitors in an extra-wide view someday.

As for the bezels, they don't get in the way. They aren't blocking any information the way a car's pillar hides what's behind it, and to reduce the bezels you overlap the monitors' bezels instead of putting them side by side, minimizing the bezel area.

Like you state, you're looking at pictures on the web; you have to try it, and within one try you will get it. Technically I don't care if you do it with NVIDIA or ATI, but it works and it adds a lot to gaming.

RE: My Take
By FaceMaster on 12/15/2010 10:24:49 AM , Rating: 1
Wait, budget... triple screens? Sorry, I don't see how you can have both; three decent screens will require more horsepower than a 'budget' card can manage.

RE: My Take
By Mitch101 on 12/15/2010 10:36:48 AM , Rating: 2
This guy posted some of his benchmarks running a 5770 in triple-screen mode at 5760x1080.

For the games that don't run smoothly at that resolution, you just drop the resolution down to 3840x900 and set details to something like medium if necessary.

RE: My Take
By Mitch101 on 12/15/2010 10:39:40 AM , Rating: 2
Note he also used the highest video settings in every game with the exception of AA.

If you're in an FPS, chances are you're moving around too fast to need the extra detail of the highest video settings.

Oh and I play Bioshock too. Still have Bioshock 2 in plastic.

RE: My Take
By bug77 on 12/15/2010 11:26:09 AM , Rating: 2
Not exactly blazing fast, and he has a good chunk of shooters in there, where 30fps doesn't really cut it. But if you can live with that, you've got yourself a pretty cheap setup.

RE: My Take
By Motley on 12/15/2010 2:47:50 PM , Rating: 2
30FPS? Some of those are showing minimums of 5FPS. I think that about says it all. Honestly, if you are going to go that cheap, then you can always grab an Nvidia GTX 295 or 9800GX2, and you can run 4 monitors off either of those cards.

RE: My Take
By Mitch101 on 12/15/2010 3:15:11 PM , Rating: 2
By "some of those" you mean Crysis, which is the only one showing 5fps. That is at 5760x1080 at the highest settings, too, which brings many video cards to a crawl even on a single monitor at 1920x1080.

The chart shows a good portion of games will run at 5760x1080 on a Radeon 5770, even at the highest settings. That card can be found for around $105.00 - $125.00 - kind of an entry-level graphics card.

Seriously, it's more fun to play at 3840x900 at medium settings than it is to play on a single screen at a high resolution.

RE: My Take
By bug77 on 12/15/2010 5:36:06 PM , Rating: 1
It's all a matter of perception really. Some people tolerate low FPS better than others. Plus, you can't go higher than 60 on a LCD anyway.

RE: My Take
By someguy123 on 12/15/2010 5:58:52 PM , Rating: 3
It's all a matter of perception really. Some people tolerate low FPS better than others. Plus, you can't go higher than 60 on a LCD anyway.

What is this, 1996?

RE: My Take
By bug77 on 12/16/2010 4:13:46 AM , Rating: 2
In all fairness, you can get a 120Hz display that will go higher than 60FPS. I forgot about those. But I have already gone IPS, I can't go back to TN-Film.

RE: My Take
By someguy123 on 12/16/2010 7:22:00 PM , Rating: 2
IPS panels should be able to run 75Hz.

I loved my S-IPS HP, but switching over to a 120Hz TN was a worthwhile trade-off, IMO. If you need colour accuracy then IPS is still the only way to go, but TN panels are substantially better than they used to be, although there are still viewing-angle problems.

RE: My Take
By Phoque on 12/15/2010 8:09:22 PM , Rating: 2
Nvidia might have won this round in performance

It depends on the point of view. From the single-GPU point of view, yes. But who cares about that when you're looking for the single fastest graphics card on the market?

By the way, that card is still the AMD Radeon HD 5970.

And given AMD's better performance per mm², this won't change with a dual-GPU NVIDIA card; I would speculate that Antilles will beat it. We'll see next year.

All that really matters is that all or most market segments be covered (especially the low, mid, and mid-high end) at the best price point vs. profit margin. I believe AMD is currently in the best position. But congrats to NVIDIA for the successful Fermi overhaul in the 580 and 570. A great price war should start soon enough.

RE: My Take
By someguy123 on 12/15/2010 10:06:06 PM , Rating: 2
Well, there's still the issue of microstuttering in multi-GPU setups. The 5970 is still the best bet for synthetics, but the 580 should provide smoother and more responsive gameplay. I'm actually a bit disappointed with AMD's pricing, but that's mainly due to how aggressive they were last gen. Right now they've priced their cards to neatly fill in the gaps instead of trying to give Nvidia a run for their money.

RE: My Take
By Lerianis on 12/16/2010 5:28:23 AM , Rating: 2
Agreed. While Nvidia might have a LITTLE horsepower lead on AMD, the cost of two AMD cards being less than the cost of ONE Nvidia card in some cases more than makes up for it.

Plus, they are putting more graphics memory on AMD graphics cards (though anything over 1GB is a waste, according to AnandTech, anyway).

By the way... has anyone had an issue with the graphics card memory being 'used up' totally and the system hard-locking?

I realized that it wasn't heat issues that were locking up my Gateway P series... it was memory run-out issues. I saw how much graphics memory some games were using in 1650*1200 mode and could watch it tick up, and once it got over 512MB (the graphics memory in my computer)... BOOM! Lockup, even with the graphics card only running at 50C.

RE: My Take
By Amiga500 on 12/16/2010 7:43:34 AM , Rating: 2
Nvidia might have won this round in performance

This round isn't over yet....

Still the 6990 to come, so you might as well start comparing a CrossFire 6950 setup (and maybe subtract 10-15% performance as a fudge factor) to the 580 to see who wins the round...

By gibb3h on 12/15/2010 9:01:10 AM , Rating: 2
"The Radeon HD 6970 will generally provide 20% better performance than a Radeon HD 5970 for only $20 more."

Is that meant to be 5870, not 5970? If not, that's quite a performance jump!

RE: typo?
By kensiko on 12/15/2010 9:56:33 AM , Rating: 2
The card has been tested. It's not faster than the 5970; it's between 15 and 20% faster than a 5870.

RE: typo?
By Mitch101 on 12/15/2010 10:22:57 AM , Rating: 2
Keep in mind AMD wanted to use 32nm for this chip, not 40nm. Had manufacturing been ready, the 6970 would have been even faster and consumed less power. But it does give us the ability to compare AMD's architecture to NVIDIA's design and see where each stands.

RE: typo?
By DarkPhoenix on 12/16/2010 6:38:38 AM , Rating: 3
LOL, so what? Is AMD's performance really being judged on "what ifs" now?
The fact is the HD 6970 trades blows with the GTX 570, but worst of all, the performance increase over the HD 5870 is barely 20% overall, which is quite a disappointment for a new architecture arriving over a year after the release of Cypress.

By Wierdo on 12/15/2010 12:49:28 PM , Rating: 2
My first reaction to PowerTune was "groan" - until I found out it was user-adjustable, and then that turned into "Awesome!"

I'd be curious to see what kinds of things you could do by tweaking it up/down for different needs/scenarios (HTPC/overclocking/etc.).

RE: Powertune
By shiftypy on 12/16/2010 8:43:08 AM , Rating: 2
Yep, I can already imagine tuning it to -20% for casual stuff, then setting it to +20% for serious all-out FPS in headphones :)

How much do you think the performance of new cards is affected by drivers? VLIW4 would require optimization, I reckon.

RE: Powertune
By bug77 on 12/16/2010 9:44:46 AM , Rating: 2
Others have imagined this before:
No "serious all-out FPS" gain was spotted.

RE: Powertune
By bug77 on 12/16/2010 10:08:55 AM , Rating: 2
And to your second question, the answer would be zero. With VLIW5, optimization might have been necessary to make judicious use of the specialized unit. But now that there's no specialized unit anymore, generic code should work just fine.

It's over, AMD is finished
By SerafinaEva on 12/17/2010 12:38:19 AM , Rating: 2
AMD's flagship card, the 6970, can't even compete with Nvidia's offering. The majority of benchmarks show Nvidia's GPUs beating AMD, especially in DX11-intensive games. When it comes to tessellation, the 6970 can't even beat the Nvidia GTX 480, a last-generation chip.

3DMark Vantage benchmarks show AMD's latest GPUs coming in far behind Nvidia's 570 and 580 GPUs.

Nvidia definitely won this generation.

RE: It's over, AMD is finished
By bug77 on 12/17/2010 4:38:42 AM , Rating: 2
Somehow it's always been this way: each generation, one manufacturer holds the performance crown and the other doesn't. And yet, neither of them is finished.

RE: It's over, AMD is finished
By Lerianis on 12/18/2010 12:08:01 PM , Rating: 2
Did you take power consumption into account, as well as the price? Because a lot of people like myself are weighing those more heavily than the speed of the graphics card alone these days.

"The Space Elevator will be built about 50 years after everyone stops laughing" -- Sir Arthur C. Clarke


Copyright 2016 DailyTech LLC.