
For now AMD has the lead -- but will it last?

Advanced Micro Devices, Inc. (AMD) appears to have gotten the jump on archrival NVIDIA Corp. (NVDA) -- for the moment, at least.  With the soft launch of Volcanic Islands, AMD has the world's first DirectX 11.2 cards and a whole bag of new tricks -- or does it?

I. AMD R9 280X -- GeForce GTX 770-Like Performance For $100 Less

Some questioned AMD's September soft launch, which came with no concrete details about launch dates.  But this week the mystery ends, as gamers will get their first taste of Volcanic Islands; and it's a sweet taste of gaming paradise.

It seemed a bit idealistic from the start to believe that all the R7 and R9 cards previewed in September would be available at the exact same time.  Matt Skinner, General Manager of AMD's graphics business unit, had promised "a top to bottom product line for every gamer", remarking, "We believe we have winners at every price."

R7 and R9
The R7 and R9 family is looking to be more of a traditional launch than some had hoped.

But in the end AMD delivered -- other than the absent flagship R9 290X, all the cards -- the Radeon R9 280X, R9 270X, R7 260X, R7 250, and R7 240 -- are launching this week.

Priced at $299 USD and up, the R9 280X's chief competition price-wise -- for now -- is NVIDIA's GeForce GTX 760, a second-generation Kepler GPU that NVIDIA finally pushed out this week.  The AMD GPU will be available on Friday, Oct. 11; somewhat disappointingly, it doesn't come with any games, whereas the Radeon HD 7000 series GPUs now come with 1-3 free games, plus a fresh round of price cuts.

But in terms of performance it's hard to fault the R9 280X.  PCWorld tested a card from Micro-Star International (MSI) Co., Ltd. (TPE:2377), while our good friends at AnandTech took cards from XFX (a division of Pine Technology Holdings Ltd. (HKG:1079)) and ASUSTek Computer, Inc. (TPE:2357) out for a spin.

Volcanic Islands still uses the same Graphics Core Next architecture.
[Image Source: AMD via AnandTech]

AnandTech's benchmarks show one of the cards (namely the overclocked ASUS model) beating out the Radeon HD 7970 GHz Edition (GE), AMD's previous-generation flagship.  Compared to NVIDIA, the new card comes within a hair of the performance of the also-new $400 USD GeForce GTX 770, and blows away the $50 USD cheaper GeForce GTX 760 by 15-20 percent.
AMD Asus
The ASUS R9 280X DirectCU II Top smoked the competition. [Image Source: AnandTech]

More benchmarks are available below, and they all pretty much tell the same story:

"AMD Radeon R7-260X R9-270X and R9-280X review - Introduction" -- Guru3D
"AMD Radeon R9 280X, R9 270X and R7 260X Review" -- PC Perspective
"AMD's Radeon R9 280X and 270X graphics cards" -- TechReport
"AMD Radeon R9 280X, R9 270X, And R7 260X: Old GPUs, New Names" -- Tom's Hardware

These other benchmarks generally show the R9 270X ($200 USD) to be neck-and-neck with the GeForce GTX 760 ($250 USD), while the R7 260X ($140 USD) is comparable to the Radeon HD 7850 ($125 USD, after discount) or the NVIDIA GeForce GTX 560 Ti ($160 USD), but beaten by the GeForce GTX 650 Ti Boost ($110 USD, after discount).
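Taken together, the street prices and review margins above imply a rough value ordering.  The sketch below uses the article's quoted prices; the relative-performance numbers are illustrative assumptions distilled from the reviews (GTX 760 as a 1.00 baseline, the R9 280X at roughly 15-20 percent above it), not measured benchmark scores:

```python
# Rough performance-per-dollar comparison using the prices quoted above.
# rel_perf values are illustrative assumptions (GTX 760 = 1.00 baseline),
# not exact benchmark results.
cards = {
    "GeForce GTX 760": {"price": 250, "rel_perf": 1.00},
    "GeForce GTX 770": {"price": 400, "rel_perf": 1.20},
    "Radeon R9 280X":  {"price": 299, "rel_perf": 1.17},  # ~15-20% over GTX 760
    "Radeon R9 270X":  {"price": 200, "rel_perf": 1.00},  # neck-and-neck w/ 760
}

def value(card):
    """Relative performance per dollar, scaled to performance per $100."""
    return 100 * card["rel_perf"] / card["price"]

# Print the cards from best to worst value on these assumptions.
for name, card in sorted(cards.items(), key=lambda kv: value(kv[1]), reverse=True):
    print(f"{name}: {value(card):.3f} perf/$100")
```

On these assumptions the R9 280X clearly beats the GTX 770 on value while roughly tying the GTX 760, and the R9 270X comes out ahead of the lot -- consistent with the muddled low-end picture and AMD's price-driven positioning described in the reviews.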

Another piece of good news is that EyeFinity has been improved with the R9 chips to add support for a third HDMI/DVI monitor and up to three DisplayPort monitors, allowing gamers to potentially drive six monitors simultaneously.

II. AMD Seizes Holiday Lead, But Early 2014 May Belong to NVIDIA

Overall, AMD has a commanding lead on the high end, with the R9 cards beating NVIDIA cards priced $50 USD or more higher.  On the lower end (the R7 260X) the picture is muddled by budget offerings from both NVIDIA and AMD itself.  That said, this launch is a significant win for AMD, as it is about as clear a win as you're going to see on the high end.

However, in addition to the gripes about pricing on the low end there are also a few other caveats to consider, even with the dominant higher end models like the R9 280X.

NVIDIA is likely to cut the prices of the GeForce GTX 760/770.  While these cards have just come out, the market reality is that NVIDIA must cut, and cut it shall -- the question is when.  Also on the NVIDIA front, remember that NVIDIA is prepping Maxwell, the successor to Kepler, for launch.  The fact that the GeForce 700 Series is just now filling out with the launch of the GeForce GTX 760/770 suggests we may have to wait a while longer for that major launch.  But if Maxwell doesn't take too long to get here, the R7 and R9 cards should eventually face a far fiercer foe.

NVIDIA Maxwell
NVIDIA will look to regain the lead in early 2014 with Maxwell. [Image Source: VideoCardz]

And while the R9 280X trumps the GeForce 700 Series in load temperatures (running cooler) and noise (running quieter), it does consume a lot of power under load -- more than the GeForce GTX 770 even.

Lastly, while the R9 (and R7) cards come with a lot of new tricks, most of those tricks boil down to optimizations of the previous generation's clock speeds and compute units, or firmware upgrades.  Some of the biggest gains -- such as those realizable with Mantle, AMD's low-level API alternative to DirectX -- will only be available if developers choose to optimize their games for AMD cards.  Given AMD's current lead and strong historical presence, a number of AAA game developers should do precisely that in time, but it's not something you can always count on performance-wise.

At their core, the R7 and R9 GPUs are still based on AMD's Graphics Core Next architecture, and while we have yet to see some of the new series' most appealing cards (the more affordable models and the flagship R9 290X), it's unlikely that the missing members of the family will reinvent the wheel too much.

Maxwell, on the other hand, is a true architecture bump, albeit coming at an inopportune time as fabs prepare to transition from 28 nm to 20 nm processes.  Maxwell 28 nm chips are rumored to be on pace for a Feb. or March 2014 launch.  That's a long way off, but given that AMD isn't expected to release 20 nm Graphics Core Next 2.0 cards until October 2014, it's highly probable we'll see NVIDIA regain the lead in early 2014.

Graphics Core Next
AMD's own architecture bump -- Graphics Core Next 2.0 -- is expected to accompany a 20 nm die shrink in Q4 2014.

In other words, AMD's timing is much better -- seizing a commanding position as we approach the holiday spending season -- but NVIDIA is still very much alive.

Sources: AnandTech, PCWorld, PC Perspective, Tech Report


Re mantle...
By Amiga500 on 10/8/2013 3:58:24 PM , Rating: 1
Mantle will be the developer choice for PS4, XB1 and PC. All GCN architectures.

DirectX will become an afterthought.

RE: Re mantle...
By dj LiTh on 10/8/2013 4:31:50 PM , Rating: 2
Just like linux will replace windows as the gamers system of choice....

RE: Re mantle...
By StevoLincolnite on 10/8/2013 6:12:23 PM , Rating: 3
Difference is, potentially 2 consoles support Mantle and even PC developers have thrown their support behind it, like Frostbite which seems to be powering all of EA's upcoming titles.

Linux wishes it could have that kind of support. :)

RE: Re mantle...
By SPOOFE on 10/9/2013 3:49:18 AM , Rating: 2
The big difference, however, is that the static hardware of consoles makes it far more desirable to devote effort into close-to-metal optimizations, as that work will pay off for many years. Conversely, on the PC platform games can be less optimized, since devs can simply expect the improved hardware arriving every 6-12 months to bring the performance gains that would otherwise be gleaned by those hypothetical optimizations, making high-level APIs more desirable.

That's how I see it, anyway.

RE: Re mantle...
By Amiga500 on 10/9/2013 3:56:35 AM , Rating: 2
AMD have talked about their "eco-system" allowing for the console makers to iterate more frequently than they have done to date.

I would expect Mantle to be relatively scalable across HW and allow significantly more common code (certainly compared to porting from PS3 to XB360 to DirectX).

Besides, it's not hard to flick on various options which need higher GPU horsepower - such as high resolution on your monitor or more AA.

RE: Re mantle...
By Mint on 10/9/2013 8:18:37 PM , Rating: 2
Not sure if you get it. Game devs devote effort to close-to-metal optimizations on consoles, just like you said, but with Mantle, AMD graphics cards on PCs benefit automatically from that optimization.

Meanwhile, NVidia (and Intel) will be stuck with the DirectX port.

RE: Re mantle...
By Amiga500 on 10/9/2013 4:00:11 AM , Rating: 2
Well - if mantle replaces DirectX and can be made available as a proprietary package for linux [same as say, the current closed Nvidia or AMD drivers] - no real reason then that Linux couldn't improve on the gaming front.

A lot of the (performance end of the) GPU driver would become much the same for Linux vs. Windows, allowing easier port of the heavier-resourced windows driver to linux.

RE: Re mantle...
By Da W on 10/9/2013 8:18:14 AM , Rating: 2
Lol. Now where did i heard that before? Oh yeah, it was 1998.

RE: Re mantle...
By Argon18 on 10/9/2013 11:03:59 AM , Rating: 1
Just like RIM will stay King of Smartphones, and Palm will rule the PDA market? Lol.

Microsoft's time is coming. And if its recent OS product failures, tablet failures, and phone market failures are any indication, it's coming sooner rather than later.

RE: Re mantle...
By Da W on 10/9/2013 2:11:49 PM , Rating: 2
Nothing is a straight line in life. If anything, I would bet that Google is peaking and Microsoft will bounce back. Same for Nvidia-AMD. Same for ARM-Intel. (Although AMD's CPU division is doomed.)

RE: Re mantle...
By KOOLTIME on 10/9/2013 11:08:52 AM , Rating: 2
Nvidia Mantle did not win the bid, AMD has been given the green for both the upcoming PS4 and XBOX 1.

RE: Re mantle...
By LRonaldHubbs on 10/9/2013 12:45:26 PM , Rating: 3
I think you confused 'Maxwell' and 'Mantle'. Mantle is AMD's low-level API alternative to DirectX and OpenGL.

By Da W on 10/8/2013 3:31:12 PM , Rating: 5
You seem to miss the most important point.
R280X IS Tahiti IS 7970.
R270X IS Pitcairn IS 7870.
Performance, heat, noise, and power draw are essentially the same. Coolers have improved. The Asus 280X is basically an overclocked card, which is why it outperforms the 7970.

Still, Nvidia rebadged their own GPUs too. It's a fair war. And the war is on price, purely on price.

RE: Rebaggaged
By Dribble on 10/9/2013 4:29:14 AM , Rating: 2
This. You could already buy every card AMD has released so far.

The only new thing is AMD started a bit of a price war, which is great.

RE: Rebaggaged
By bug77 on 10/9/2013 5:46:05 AM , Rating: 2
Also worth noting that 290X will most likely eat more than the 250W the 280X does. So this generation, power usage will go up a notch.

RE: Rebaggaged
By bug77 on 10/9/13, Rating: -1
RE: Rebaggaged
By Da W on 10/9/2013 10:51:55 AM , Rating: 2
It's a new GPU, why would they do that?
I've heard 2816 / 176 / 44 shader/TMU/ROP instead of the 2048 / 128 / 32 of Tahiti. At the same frequency you get 37% more performance and power draw. Likely they reduced frequency and memory clocks to get inside 300W.
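For what it's worth, the 37 percent figure in the comment above follows directly from the rumored unit counts (which are just that -- rumors), assuming performance scales linearly with shader count at a fixed clock:

```python
# Rumored next-gen shader count vs. Tahiti's, per the comment above.
rumored_sp, tahiti_sp = 2816, 2048
scaling = rumored_sp / tahiti_sp - 1  # fractional increase in shader units
print(f"{scaling:.1%}")  # prints 37.5%, matching the ~37% estimate
```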

RE: Rebaggaged
By bug77 on 10/9/13, Rating: 0
RE: Rebaggaged
By Mint on 10/9/2013 9:30:13 PM , Rating: 2
That's either a typo (the R280X having 2816 SPs) or leaving the door open for an R280X variant that's a downclocked 290.

But last year's GPU doesn't have 2816 SPs. The R9-280X in all the reviews seen so far has a Tahiti GPU, i.e. 2048 SPs. The 290X will be a new GPU.

ATI no more
By DrApop on 10/8/2013 8:08:29 PM , Rating: 2
My 10 month old ATI 7850 is sitting on a shelf and my new card is a small gtx 650 Ti.

Bought the previous version of Adobe's production suite 8 months ago and no ATI support. Not going to spend $1700 on the new suite just to get ATI graphics and rendering support. No ATI Blender support. No Maya support for ATI.

There is a lot more in life than using graphics to play games. Some of us want productivity too.

I love AMD and have used AMD in one form or another since my AMD 66 Mhz computer in the very early 90's. But I'm tired of the slow(er) processor and the lack of support for things I need. Not sure what my next computer will be. I know what my current and next graphics card will be.

RE: ATI no more
By Amiga500 on 10/9/2013 3:52:46 AM , Rating: 2
No doubt they'd point you to their Pro cards.

But that is beside the point - they should support rather basic things like Adobe without needing the drivers for professional cards.

What version of Adobe are you using?

RE: ATI no more
By DrApop on 10/9/2013 12:42:35 PM , Rating: 2
The previous version... 6, I believe. Their current version of Adobe Production Suite (for video/audio/graphics editing) is only available via monthly subscription (no hard copy).

It is supposed to support ATI cards. but I bought their previous version ($1700/$500 academic) in Jan of this year so not going to dump that.

Also, there are a number of rendering/animation programs that support cuda processors in Nvidia but don't using ATI for graphics rendering at all.

That is the really frustrating part. ATI/AMD seems to have top-notch cards, but its major support is in the games market (granted, a highly prized and financially important segment), while it is severely lacking at this time in the graphics/rendering/video market (granted again, a much smaller market segment than games).

By TacticalTrading on 10/8/2013 4:56:19 PM , Rating: 3
AMD's new card is faster and costs less.
Give them credit for a job well done.
And, Going forward, all games are going to be developed with their chips in mind.

Nvidia has great stuff too, though now it is more expensive.
But PhysX... What percent of gaming devices have that capability? I'm not saying it isn't cool. What I am saying is that as a developer, I think I would spend my resources somewhere else... Um... Like optimizing games for AMD GPUs.

Yes, I have AMD cards

Not sure if I can remain loyal to Red Team.
By Captain Orgazmo on 10/9/2013 4:00:27 AM , Rating: 1
I was bitten in the rear by the nvidia 5000 series way back 10 years ago, and have been purchasing ATI/AMD since then. X800Pro, X2900XT, HD4870, HD6870, and HD7950. X800Pro was good (was able to flash the firmware to make it the flagship XT model), but the cooler failed within a year. X2900 was a power gobbling, under-performing mistake. The 4870 failed after 2 years. 6870 was an excellent series, mine still running fine in a friend's comp, and another buddy with one still going strong after 3 years.

7950 has been a massive pain in the butt: the card will not reinitialize video output after monitor or S3 sleep (requiring me to always shut down the computer), the 3D clocks throttle down at high load, requiring the electrical current limit to be boosted (eating an extra 30W at load over advertised spec), and the heatsink was poorly designed and inadequate (no RAM or VRM cooling, and hot air vented back into the case because it was designed to look like a blower while being a cheap POS), requiring a $60 aftermarket cooler. Also, it seems driver releases have been reduced to semi-annually.

As a result, the last couple computers I helped build for friends/family have been equipped with nvidia cards. From the looks of this rebadge BS, my next card will probably be nvidia too. I hope AMD gets back on the ball, competition is good for all.

By EasyC on 10/9/2013 7:19:57 AM , Rating: 2
You're lucky. Both my GTX 760 and GTX 780 drivers hose my entire windows install when swapped. I'm talking safe mode hosed, windows recovery hosed, startup fixing hosed. Everything nets a BSOD with a log that basically said "bad driver". Sad part is they use the same exact driver download.

I'm hoping the 290x comes out before the return period ends on the 780.

By Phoque on 10/8/2013 7:17:03 PM , Rating: 2
Let me go on a limb here... NO!

By Ammohunt on 10/8/13, Rating: -1
RE: Sucks
By SaltBoy on 10/8/13, Rating: -1
RE: Sucks
By Da W on 10/8/2013 3:54:33 PM , Rating: 2
You say that cause you only game on one screen.

RE: Sucks
By ClownPuncher on 10/8/2013 4:11:42 PM , Rating: 2
You get PhysX in what... 2 recent games?

RE: Sucks
By Ammohunt on 10/8/13, Rating: -1
RE: Sucks
By ClownPuncher on 10/8/2013 5:04:54 PM , Rating: 2
? I've owned far more nVidia cards than AMD. It's a legit question. If you want to be part of that discussion, you're going to have to drop the pretense.

RE: Sucks
By Ammohunt on 10/8/13, Rating: 0
RE: Sucks
By SlyNine on 10/9/2013 2:11:28 AM , Rating: 2
Yay Intel

RE: Sucks
By slunkius on 10/9/2013 2:29:54 AM , Rating: 2
cue smug replies, with pretense for humor

RE: Sucks
By Ammohunt on 10/8/13, Rating: 0
RE: Sucks
By ShaolinSoccer on 10/9/13, Rating: 0
RE: Sucks
By ClownPuncher on 10/9/2013 11:02:47 AM , Rating: 2
You get depressed when you can't see a few instances of particle physics that don't even need to be done via a proprietary api?

RE: Sucks
By metaltoiletry on 10/10/2013 12:22:07 AM , Rating: 3
My Logitech keyboard supports more games than that list you posted... Now that is actually quite sad... :/

RE: Sucks
By inperfectdarkness on 10/9/2013 3:25:46 AM , Rating: 1
Nvidia's epic fail of GPUs going belly-up was enough to put me firmly in the ATI camp. Neither Nvidia nor Sager was willing to cover my laptop's GPU replacement--even though it was barely a year old (I really don't blame Sager).

The MSI I got to replace it has an ATI that's been buzzing along just fine since I got it. I'm waiting till I can purchase an MSI GT60 3k with an ATI card...and I'll have it on pre-order.

"There's no chance that the iPhone is going to get any significant market share. No chance." -- Microsoft CEO Steve Ballmer

Copyright 2016 DailyTech LLC.