



A pile driver, the core's namesake. [Image Source: SZ-Wholesaler]
Vishera FX is power hungry and still weak in single-thread performance, but is highly overclockable and cheap

In the consumer CPU market, Advanced Micro Devices, Inc.'s (AMD) Zambezi replacement, Vishera, was officially launched today.  The announced chips, which bear the FX designation, swap out the 32 nm Bulldozer cores for refined 32 nm Piledriver cores and come in quad-, hexa-, and octa-core configurations.

I. Meet Vishera, a 32 nm Piledriver-core Product

The die shrink war is over and Intel Corp. (INTC) has won, but AMD is still trying to stay competitive in the consumer PC market with its high-core-count, inexpensive second-generation processors on the 32 nm node.

Intel has not brought hexa- or octa-core Ivy Bridge (22 nm) chips to bear yet (though it does enjoy a large and diverse lineup), so AMD has the good fortune of having its second-generation 32 nm architecture compete with Intel's quad-core 22 nm parts.

As one might expect, AMD's chips earn wins in heavily threaded benchmarks, according to Anandtech, which tested the new Vishera-FX series chips.  In single-threaded loads, though, even the lowly Core i3 Ivy Bridge parts beat up on the cream of the Vishera crop.  That means that gaming -- which is mostly GPU-bound and only lightly threaded on the CPU side -- tends to wind up in Intel's favor.
AMD Piledriver
Note: the maximum frequencies shown are for half load; the full-load maximum is 100 MHz lower in all cases except the FX-8320, where it is 200 MHz lower.
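
That split is easy to reproduce with a toy benchmark.  The sketch below -- a minimal illustration of single- versus multi-threaded throughput, not Anandtech's actual methodology -- times the same CPU-bound task on one worker and then on every available core; a chip with many modest cores, like Vishera, looks far better on the second number than on the first.

```python
import time
from multiprocessing import Pool, cpu_count

def burn(n):
    """CPU-bound busywork: a sum of squares with no I/O or shared state."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    work = [2_000_000] * cpu_count()   # one chunk of work per hardware core

    t0 = time.perf_counter()
    for n in work:                     # single-threaded: one core does it all
        burn(n)
    single = time.perf_counter() - t0

    t0 = time.perf_counter()
    with Pool() as pool:               # multi-threaded: one worker per core
        pool.map(burn, work)
    multi = time.perf_counter() - t0

    print(f"1 worker: {single:.2f} s; {cpu_count()} workers: {multi:.2f} s")
```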

II. The Bad

Let's consider the bad news first.  Start with price versus performance: in single-threaded loads, even the top Vishera parts perform worse than a $130 USD i3-3220 (Ivy Bridge; dual-core; 3.3 GHz).

To be fair, AMD did deliver on its promise of faster single-threaded execution; Vishera-FX is approximately 20 percent faster than Zambezi-FX in Anandtech's testing.  That said, the new core architecture only closed approximately half the single-thread performance gap -- the Core i3 (Ivy Bridge) is still roughly 20 percent faster than Vishera-FX in single-threaded loads.
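
To make the "half the gap" arithmetic concrete, here is a back-of-the-envelope check with normalized single-thread scores (illustrative round numbers, not Anandtech's raw data):

```python
zambezi = 1.00            # normalize Zambezi-FX single-thread performance
vishera = zambezi * 1.20  # Vishera is ~20% faster than Zambezi
ivy     = vishera * 1.20  # Core i3 (Ivy Bridge) is still ~20% faster

print(f"Ivy Bridge vs. Zambezi: {ivy / zambezi:.2f}x")  # ~1.44x, the old gap
print(f"Share of gap closed: {(vishera - zambezi) / (ivy - zambezi):.0%}")  # ~45%
```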

Ivy Bridge
Vishera only halfway closes the single-thread performance gap with Intel's Ivy Bridge (pictured).
[Image Source: BBC News]

The other bad news is that while power consumption has actually dipped slightly since Zambezi-FX, it's still far higher than Ivy Bridge parts.

So to recap: in single-threaded performance and power consumption, Piledriver is better than Bulldozer, but worse than Ivy Bridge.

III. The Good

But there's also a fair amount of good news.

Where Bulldozer shone brightest (multi-threaded performance), Piledriver does even better, beating out Ivy Bridge parts.  Price-wise it's compelling to see the FX-8350 (Piledriver; octa-core; 4.2 GHz) -- a roughly $195 USD part -- beating the $330 USD i7-3770K (Ivy Bridge; quad-core; 3.9 GHz Turbo).

The comparison is fair, as the cost of LGA-1155 motherboards (for Ivy Bridge) and Socket-AM3+ motherboards (for Vishera) is about the same -- starting at around $30 USD for the cheapest boards.  Both chipmakers kindly opted to stick with their current socket designs for this core generation, so many consumers will be able to upgrade without replacing the board.

An interesting observation from Anandtech: while AMD still suffers the same weaknesses it did two years ago, its latest releases have pushed it closer to closing the single-threaded performance gap while widening its multi-threaded performance lead.  In other words, if AMD can keep up this trend, it could find itself beating Intel in both single-threaded and multi-threaded performance in a couple of years.


The new AMD parts are highly overclockable. [Image Source: Zazzle]

More good news comes on the overclocking front.  The quad-core FX-4300 was clocked up to 5 GHz (a 28.2% overclock) in Anandtech's testing (using AMD's stock closed-loop liquid cooling solution), which placed it on par with the hexa-core FX-6300 in multi-threaded loads.  The octa-core FX-8350 clocked up to 4.8 GHz (a 14.3% overclock), earning the best scores in heavily threaded loads.  Beware, though -- a 4.8 GHz FX-8350 consumes an epic 294.3 watts under heavy load.
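
As a quick sanity check on those percentages (our own arithmetic -- the quoted figures imply reference clocks of roughly 3.9 GHz for the FX-4300 and 4.2 GHz for the FX-8350, i.e. the half-load maximums rather than the base clocks):

```python
def overclock_pct(stock_ghz: float, oc_ghz: float) -> float:
    """Percentage frequency gain of an overclock over the stock clock."""
    return (oc_ghz / stock_ghz - 1.0) * 100.0

print(f"FX-4300: 3.9 -> 5.0 GHz = {overclock_pct(3.9, 5.0):.1f}%")  # 28.2%
print(f"FX-8350: 4.2 -> 4.8 GHz = {overclock_pct(4.2, 4.8):.1f}%")  # 14.3%
```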

IV. To Buy or Not to Buy?

So are Vishera-FX and the new Piledriver core worth a purchase?  That depends a lot on the consumer.  As single-threaded performance generally dictates the consumer experience, one could argue that the answer is generally "no".

That said, the overclockability and great multi-threaded performance of the 32 nm part make it hard to overlook entirely, particularly when it's priced reasonably competitively.  Ivy Bridge is probably the better solution for most, but if you do buy a Vishera chip, you won't be getting a bad deal.

Vishera v. Zambezi
Vishera (left) is a significant improvement over Zambezi (right). [Image Source: AMD]

The real potential for Vishera likely lies in its sister series, aimed at enterprise users.  We should have some details on that to share with you in the near future.

In the meantime, think of how the Piledriver core's gains might be amplified by a highly thread-dependent load, such as hosting virtual environments for a thin-client deployment.  We'll say this -- if AMD keeps the pedal to the metal on the enterprise parts side, it may have a more clear-cut winner for many applications.
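
As a rough illustration of that kind of workload -- a hypothetical sketch, not an actual thin-client stack -- consider a pool of workers each servicing an independent client session.  With no shared state between sessions, throughput scales almost linearly with core count, which plays directly to an octa-core part's strength:

```python
import os
from concurrent.futures import ProcessPoolExecutor

def serve_session(session_id: int) -> str:
    """Stand-in for one client session's CPU work (hypothetical workload)."""
    checksum = sum((i * session_id) % 7 for i in range(500_000))
    return f"session {session_id}: done ({checksum})"

if __name__ == "__main__":
    sessions = range(32)  # 32 independent client sessions
    # One worker per hardware core; independent sessions scale near-linearly.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        for result in pool.map(serve_session, sessions):
            print(result)
```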

(For those interested, AMD's core gets its name from the titular piece of heavy machinery, which drives "piles" -- the pillars that support buildings -- into the ground.  Vishera is named after a river in Russia.)

Sources: AMD, Anandtech



Comments

Any wonder
By Ammohunt on 10/23/2012 3:38:13 PM , Rating: 2
Why AMD is sucking eggs? I would run AMD on my desktop machine if they could achieve 90%+ of the performance of Intel CPUs, since the price discount and the additional cores would be a value add. I do happen to run a 6-core FX-6100 for an ESXi server, which works great, but I purposefully bought it as a low-budget alternative to what would otherwise be a pricey, overly powerful Intel hobby server. I am rooting for AMD -- their products have gotten better and better over the years, but not good enough to knock Intel out of my gaming rigs.




RE: Any wonder
By inighthawki on 10/23/2012 4:01:30 PM , Rating: 2
Intel paid off a lot of people to use their CPUs, and as a result we're seeing Intel benefit from that with a lot more revenue for furthering R&D -- something AMD doesn't have the luxury of doing given its sinking quarterly revenue. I think they're still feeling a lot of the effects of purchasing ATi.


RE: Any wonder
By someguy123 on 10/23/2012 5:43:33 PM , Rating: 4
Intel has shady business practices regarding competition, but the decision to jump to fat pipelines and modules has nothing to do with the amount of R&D funding. Phenom cores are higher in IPC than Bulldozer and now Piledriver cores, though Piledriver pulls ahead thanks to frequency headroom. Looking at AMD firing more engineers, I'd say it's an upper-management problem, probably one that demanded more cores regardless of performance for marketing reasons.


RE: Any wonder
By Mitch101 on 10/23/2012 8:25:25 PM , Rating: 2
Intel lost the antitrust lawsuit, and while they probably didn't pay as much as they gained, they still had to pay a hefty sum.

I read a few websites' benchmarks and the new chips are not all doom and gloom. They do quite well in applications where multiple cores are utilized. Unfortunately, not all apps are well optimized for multiple cores, and I'm not someone who uses 90% of the apps they test with. Seriously, is everyone rendering images with POV-Ray tracing? I don't even use Photoshop -- it's not in my budget -- and I don't use Visual Studio enough to complain about compile times, but I do game.

Let's also keep in mind that AMD managed to produce an 8-core CPU with a lower transistor count than Intel's quad-core, even with Hyper-Threading. That's important to manufacturing costs, and hopefully they can squeeze out some profits. The pricing is in line with the chips' performance; they aren't trying to charge anyone a premium for a chip that doesn't outperform the Intel equivalent. No trickery going on here.

If you bought an AMD motherboard, there are some additional cost savings in that you can likely upgrade just the CPU for a performance boost and sell off your old chip.

However, if you're building a new rig, you might opt to go the Intel route if you're going to have to buy a mobo and some RAM anyhow.

AMD said they would have a good performance jump in the next generation of CPUs, so I would expect they realize they took a step back with their branch prediction unit when they went Bulldozer.


RE: Any wonder
By someguy123 on 10/23/2012 9:10:55 PM , Rating: 3
There is a place for these chips considering the pricing and highly threaded applications. My post is simply about the architecture. With Thuban they were actually ahead of Piledriver in overall IPC and managed to fit 6 cores in there. I don't know if there were issues fitting in more cores with a shrink or something, but it seems to me like shrinking down Phenom and trying to up core count/frequency would've been a better choice for overall performance. It reminds me of the P4-to-Conroe transition, where Intel eventually went back to a design similar to its P6 for the Core 2 chips.


RE: Any wonder
By Mitch101 on 10/24/2012 4:33:55 PM , Rating: 2
Yup, I felt the exact same way -- they should have continued to refine and shrink Thuban. Instead they wound up doing with Piledriver what Intel did with the P4: too deep a pipeline, with a bad performance penalty lowering their IPC.


RE: Any wonder
By polishvendetta on 10/23/2012 4:25:55 PM , Rating: 3
The AMD vs. Intel battle is becoming moot -- not because one is or isn't better than the other, but because monitor resolutions aren't increasing at the same pace as processors and video cards.

Mainstream monitors have settled on 1920x1080 or 1920x1200 resolutions for several years. I have a 5-year-old gaming PC I built, and just upgrading it with a $150 processor and a $150 graphics card, I can play all of my games (Borderlands 2 right now) at the highest detail and graphics settings.

I thought about building a new i7 computer, but there's really no point unless I have $1,000 to spend on a monitor as well.


RE: Any wonder
By MrBungle123 on 10/23/2012 4:43:33 PM , Rating: 2
I would agree that stagnant resolutions are part of the problem... I suspect the biggest thing holding back the need for faster CPUs (in the consumer space, anyway) is that every new game that comes out is aimed at the consoles, which are quickly reaching the point where they are slow by cell-phone standards.

If game devs ignored the Xbox/PS3 and coded for something on the order of an i5-2500/GTX 470 or higher, then there would start to be a reason to upgrade again. Instead they are writing games for a slow P4-era CPU and a GeForce 7600... it's no wonder any modern PC can play virtually every game without breaking a sweat.


RE: Any wonder
By NellyFromMA on 10/23/2012 4:51:55 PM , Rating: 2
I think resolutions started tapering off when the paradigm shift from CRT to LCD took place, when consumers realized a general-purpose computing unit did most everything they wanted (so no need to upgrade), and given the general lack of benefit to bumping resolutions past a particular threshold.

Add to this the fact that manufacturing more pixels inherently decreases yields (for cutting-edge units), and basically what you saw happen to computers is the same thing you've seen happen to everything else... it became general and disposable.

Thanks, capitalism! lol jk


RE: Any wonder
By someguy123 on 10/23/2012 5:51:06 PM , Rating: 2
Borderlands 2 is hardly a demanding game, even maxed out. Why do people insist that budget builds will run everything maxed out? Sure, you could play games like Max Payne 3 at 20-40 fps at 1200-line resolution with a $150 GPU, but I hardly consider that the pinnacle where you've hit the performance wall. This really has very little to do with the CPU anyway, considering games have moved significantly more toward GPU processing.


RE: Any wonder
By maugrimtr on 10/24/2012 9:30:22 AM , Rating: 2
It's interesting to see how a game tied heavily to the CPU can upset the current status quo, where a 5-year-old CPU can run any modern game from 2012 so long as you have a suitable GPU.

Planetside 2 (an MMO FPS releasing in November) is just such an upset. It's heavily bound to the CPU since it needs to track conceivably hundreds of players and their interactions in an FPS setting. Forget Battlefield 3 -- BF3 maxes out at a mere 64 players.
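
To put rough numbers on that (my own back-of-the-envelope, not the game's actual netcode): if every player can potentially interact with every other player, the bookkeeping grows with the square of the player count.

```python
def pairwise_checks(players: int) -> int:
    # Worst-case player-vs-player interaction pairs: n choose 2.
    return players * (players - 1) // 2

print(pairwise_checks(64))   # BF3-scale: 2,016 pairs
print(pairwise_checks(500))  # Planetside 2-scale: 124,750 pairs
```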

http://www.tomshardware.co.uk/forum/358699-13-upgr...

PS2 is a game currently bogged down in performance complaints. AMD is actually worse off due to its single-threaded performance (the developers are working on this -- presumably adding more threads so AMD's shiny new multithreaded superiority can be leveraged -- more games need to follow suit!).

For me, PS2's currently running beta has been an absolute lag fest, whereas BF3 runs silky smooth with graphics set to high. CPU-bound games make you realize just how many people have years-old CPUs coupled to 2012-release GPUs. Consoles' inferior specs have rendered CPU upgrades mostly pointless for gaming.


RE: Any wonder
By Reclaimer77 on 10/24/2012 6:40:13 PM , Rating: 2
Heh, you should have played Planetside 1. Fun as hell, but extremely ambitious -- to the point that only monster computers (yes, it CAN play Crysis) could handle the massive battles that numbered in the hundreds of players. "Lag" was a way of life.

I'm going to give PS2 a shot, but I have no illusions about it running smoothly.


"Mac OS X is like living in a farmhouse in the country with no locks, and Windows is living in a house with bars on the windows in the bad part of town." -- Charlie Miller














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki