
The final Phenom II, the X4-980 Black Edition, hardly leaves the tech community with sweet memories of its processor family, as it is handily beaten by Intel's closest comparable Sandy Bridge chip.  (Source: Anandtech)

Intel's closest-priced Sandy Bridge chip is cheaper, faster, and uses much less power than AMD's latest design.  (Source: Intel via Newegg)
Clocked at 3.7 GHz, the quad-core Phenom II X4 980 is beaten on both performance and price by lower-clocked Intel chips

Even as Advanced Micro Devices Inc. (AMD) finally starts to look competitive with its low-end Fusion CPU+GPU systems-on-chips (SOCs), it still has no real answer in the high performance end.

This week it released what will likely be the final member of the Phenom II family (45 nm), the high-end counterpart to AMD's budget Athlon II line.  Both lines will begin a slow phase-out by June, replaced by chips built on AMD's new architecture -- Bulldozer (32 nm).  

The new chip is dubbed the AMD Phenom II X4-980 Black Edition.

I. Performance

The final Phenom II has what it takes to be a decent performer -- if it were launched, say, two or three years ago.  

Its four cores are clocked at 3.7 GHz, though the memory controller, HyperTransport controller, and L3 cache run at just 2 GHz, without any type of special overclocking.  The chip fits neatly into AM3 socket boards and will be compatible with future AM3+ boards.

Performance-wise, the chip is beaten in synthetic benchmarks (e.g. SiSoft Sandra 2011b) by its more expensive AMD hexa-core brethren, the X6-1090T and X6-1100T.

It also gets blown away by Intel Corp.'s (INTC) lowest-end Nehalem (45 nm) i7 processor, the Core i7-860.  Let that sink in for a minute -- Intel's slowest quad-core i7 processor, released in September 2009, can still beat AMD's fastest quad-core processor, released in May 2011.  

(To be fair, the i7-860 debuted at a higher price of $279, though its price later dropped substantially to near the X4-980's mark as Intel cleared inventory to prepare for Sandy Bridge.)

PC Perspective measured the X4-980 as being about 25 frames-per-second slower than the i7-860 in Far Cry 2 at high-resolution settings and 10 fps slower at "ultra" settings.

Looking at its closest modern Intel competitor, the i5-2400 (more on that later), the X4-980 is yet again left badly beaten, both in synthetic benchmarks and in games.  The i5-2400 (3.1 GHz standard, 3.7 GHz "Turbo Mode") -- a Sandy Bridge (32 nm) chip -- is anywhere from 15-20 fps ahead in numerous game titles, according to benchmarks by AnandTech.

But the bad news for the new processor doesn't end there.  Under load it is estimated to consume around 164.6 watts, nearly 50 watts more than the slim 115.2 watts the i5-2400 draws.
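To put that power gap in perspective, here is a back-of-envelope sketch of the extra electricity cost. The wattage figures are the load-draw numbers quoted above; the 4 hours/day at full load and the $0.11/kWh rate are illustrative assumptions, not figures from the review:

```python
# Load-draw figures quoted in the article.
X4_980_LOAD_W = 164.6
I5_2400_LOAD_W = 115.2

def annual_cost_delta(watts_a, watts_b, hours_per_day=4.0, usd_per_kwh=0.11):
    """Extra dollars per year spent powering the hungrier chip.

    hours_per_day and usd_per_kwh are assumed values for illustration.
    """
    delta_kwh = (watts_a - watts_b) * hours_per_day * 365 / 1000.0
    return delta_kwh * usd_per_kwh

extra = annual_cost_delta(X4_980_LOAD_W, I5_2400_LOAD_W)
print(f"{X4_980_LOAD_W - I5_2400_LOAD_W:.1f} W more under load, "
      f"roughly ${extra:.2f}/year at these assumptions")
```

Under those assumptions the gap works out to only about $8 a year on the power bill, so the bigger practical cost is the extra heat and cooling noise that come with it.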

Its one saving grace performance-wise is that it proves a decent chip to overclock, capable of being bumped to 4.3 GHz without water-cooling, if you're careful.
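That ceiling translates into a healthy amount of headroom over stock. A quick sketch of the arithmetic, using the stock clock and the air-cooled overclocking ceiling cited above:

```python
STOCK_GHZ = 3.7  # X4-980 stock clock
OC_GHZ = 4.3     # reported air-cooled overclocking ceiling

# Percentage gain over the stock clock.
headroom_pct = (OC_GHZ / STOCK_GHZ - 1) * 100
print(f"{headroom_pct:.0f}% headroom over stock")  # ~16%
```

A roughly 16% free clock bump is respectable for an end-of-line 45 nm part, though it still isn't enough to close the gap with Sandy Bridge.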

II. Price

Both the i7-860 and the X4-980 have a bit of a common problem -- they're overqualified for most consumer workloads -- even casual gaming.  So while the Intel chip spanks the AMD chip in performance, the real question for most of its target audience boils down to the price.

The chip is expected to debut somewhere around $195 USD.  That price makes the situation a bit complicated, as AMD's own X6-1090T is only $5 more, at $200 USD, and beats it in multi-thread-optimized workloads (though in many casual games the X4-980 will come out ahead).

As the older Core models aren't widely available anymore, having been phased out by Sandy Bridge, its clearest Intel competition is the quad-core i5-2400, which retails for $190 USD.

So to summarize: Intel's main competitor, the i5-2400, not only badly burns the newest member of the Phenom II family in performance and power consumption, it also leaves $5 USD more in your wallet.

Of course, if you already have an AM3 board and aren't willing to upgrade, the price situation may shift in the new chip's favor -- though you still have to factor in the much slower performance.  Anand Lal Shimpi sums it up nicely, writing:

There's not a whole lot to say here about the Phenom II X4 980. AMD originally introduced the Phenom II architecture over two years ago to compete with Intel's Core 2 lineup. Intel has since been through one major microarchitecture revision (Sandy Bridge) and Phenom II is beginning to show its age. AMD is most competitive at the edges of its lineup. The Phenom II X6 offers a ton of cores at a budget if you have a workload that can use them, and the Athlon II at the low end is still quite desirable. Unless you're an existing Socket-AM3 motherboard owner a high end Phenom II X4 just isn't attractive. 

The Tech Report was even more blunt, with Scott Wasson writing:

All of which leaves us wondering what, exactly, is the point of this little product refresh. Yes, AMD's top quad-core product is a tiny little bit improved over the prior model, but even in the small picture, nothing much has changed at all.

AMD will hope to wash that bad taste out of consumers' mouths with a final release of Athlon II chip(s) at the low end soon, which (hopefully) will be more competitively priced.  And for those who hope the best for AMD -- or at least hope for a competitive CPU market -- there's always Bulldozer later this year, which could offer a legitimate competitor to Sandy Bridge at the high end, if AMD can compete on price.


Deja vu
By bug77 on 5/3/2011 10:01:19 AM , Rating: 3
It's like P4 vs Athlon XP. In reverse.

RE: Deja vu
By Aloonatic on 5/3/2011 10:12:35 AM , Rating: 5
vu jà-Dé?

RE: Deja vu
By Mitch101 on 5/3/2011 11:23:53 AM , Rating: 2
That's the feeling that you have done this before.

RE: Deja vu
By Queelis on 5/4/2011 2:13:53 PM , Rating: 2
uv àj-éD.

RE: Deja vu
By Motoman on 5/3/2011 10:13:31 AM , Rating: 3
Yeah, they go back and forth. The problem AMD faces is that even during the eras when they clearly have a superior product, the unwashed masses still buy Intel - because they don't know any better (and don't care either way). During the periods of AMD dominance, the Intel fanbois barely dry up at all...they just go into hibernation for a bit, and then come back out of the woodwork again like mad when Intel finally pulls ahead again. You don't ever actually get any of them to buy the best CPU on the market at that time.

For me, though, Intel's criminal activity put me off them forever - the penalties eventually levied on them in Europe and the US really don't even matter to Intel. But I for one refuse to support a company so blatantly willing to break the law when it suits them to do so.


The final Phenom II has what it takes to be a decent performer -- if it were launched, say, two or three years ago.

...really? The latest Phenom II isn't even "decent" by current standards? The fact of the matter is that virtually no one, even hardcore gamers, would notice any perceptible subjective difference. And remember that the *vast* majority of users don't do anything that would even stress the lowest-end dual-core CPU - and frankly, would still be fine with a $40 single-core CPU. We are well into the era where machines and software that are several years old are more than capable of meeting the needs of the average user. The industry is propped up on the success or failure of marketing campaigns...not solving actual user needs.

RE: Deja vu
By kaosstar on 5/3/2011 10:25:31 AM , Rating: 4
I was a computer salesman at Best Buy in college around 2004-2005 - back when AMD was pretty well having its way with Intel. Many, perhaps even most, customers refused to buy anything with an AMD CPU. They clearly knew more than me, since, even though they were coming to me for help, they would educate me that Pentiums must be superior to some no-name brand.

RE: Deja vu
By Motoman on 5/3/2011 10:30:02 AM , Rating: 4
Yup. Ask any person whether or not they think they are swayed by marketing propaganda, and they will tell you "no." Then ask them what the best products on the market are in the related categories, and they'll probably tell you "Apple, Bose, Monster Cable...and Intel." At least, during the proper time period, Intel can actually be said to have the technically superior product. The effect is the same though...and persists during the periods when Intel doesn't have the best product.

RE: Deja vu
By yomamafor1 on 5/3/2011 11:31:36 AM , Rating: 2
And that has always been the problem for AMD. Regardless of their products, they have always underestimated the importance of marketing. Even when the K8 was dominating in all sectors, and AMD was raking in millions in profits, they still refused to invest in broadening their brand.

In my opinion, before K8, you can say that Intel engaged in an unfair competition to keep AMD down. After K8 (and especially 2006), AMD has no one else to blame but themselves (well, maybe Hector).

RE: Deja vu
By cfaalm on 5/3/2011 4:25:42 PM , Rating: 2
I think AMD can only turn this around if they either stay on top or stay competitive (better than with Phenom II) for two or three upgrade periods, 10 to 12 years. If they lose out within that time frame like they did with the first Phenom, they'll stay in the same position.

AMD is gradually getting into a better financial position, now that it's also firmly on the Apple train. That should help them in the R&D department.

Bulldozer should have been here last year. Oh well, you all heard that before... Still no official benchies, damn.

RE: Deja vu
By Joz on 5/3/2011 7:47:44 PM , Rating: 2
It's worse when family members with no training, who never look up reviews, etc., think they know better as well.

It's like my brother saying Macs are superior to PCs (even Linux ones) in every way. I laughed and told him to fix his computer then, and he went and spent $150 to do so. I could've fixed that faulty RAM module for $20.

RE: Deja vu
By Aloonatic on 5/3/2011 10:31:33 AM , Rating: 3
It's funny. My brother-in-law has just bought a Sandy Bridge i7 laptop and moans about how slow everything still is, and that it's not really any faster to use than his old E6x00 (can't remember which one) machine.

He would have been far better off spending the difference between the i7 and a lower-performing processor on an SSD rather than an HDD.

The good news for Intel is that people still think that the processor is the only place where they will ever find performance increases in their day-to-day use of computers.

As an aside, he's also really confused/frustrated by the whole Sandy Bridge/NVIDIA GPU switching thing too. I doubt he's the only one.

RE: Deja vu
By Motoman on 5/3/2011 10:40:29 AM , Rating: 5

The VAST majority of PC users/consumers believe that the CPU is what determines the speed of their computer. Along with how big the hard drive is, naturally. Because clearly a 2TB hard drive makes my interwebs go more fastly than a 500GB drive, right?

This is why there are so many frustrated would-be gamers in this world...ma & dad go to BBY and buy the $400 Dell special because it has a kwad-kore processor! It's 1337! And since it has Intel graphics, it won't play anything more interesting than Solitaire. And would-be PC gamer throws his hands up in the air and goes and plays his Ecks Bocks instead. Because when he complains to ma & pa about the abysmal performance of his new pee cee, said parental units refuse to even believe him - since they just bought him that new komputor with the kwad-kore processor, clearly he is just making it up.

RE: Deja vu
By corduroygt on 5/3/2011 12:05:57 PM , Rating: 1
I moved from a C2D at 1.73 GHz to an i5 at 2.66 GHz (Turbo up to 3.3), with the same exact SSD, and the difference is significant in day-to-day usage.

RE: Deja vu
By Jedi2155 on 5/3/2011 1:10:14 PM , Rating: 2
Since moving to an SSD, I discovered that once you take out I/O performance as a bottleneck, the CPU is the next biggest one in the system. Increasing the overclock on my i7 920 from ~3 GHz to 3.6 GHz actually gave a noticeable increase in performance.

I feel that the increase in CPU performance is being masked by a slow HDD.

RE: Deja vu
By jabber on 5/4/2011 6:16:28 AM , Rating: 2
I've been building 60GB SSD equipped PCs for customers recently.

Been pairing them with the bottom 3GHz Athlon II CPU and these boxes really rock. The customers love them. Awesome is the word I get in feedback.

They get USB3.0 and a HD4250 GPU to play Farmville with and they are very happy.

Other than a few enthusiasts, who really needs Intel? Not many; it's only the fact that they use mass marketing to shove themselves into the public consciousness that they survive.

RE: Deja vu
By mattclary on 5/3/2011 12:56:58 PM , Rating: 2

Processor wars are so 2002. For the AVERAGE consumer, the difference between a low-ish end CPU and top end is almost invisible. The amount of RAM and hard drive access are the limiting factors nowadays.

I have a small repair business and use AMD exclusively because they are CHEAP, and they get the job done. And I'm using dual core chips, not the very bottom end single core, cache-lobotomized stuff.

RE: Deja vu
By Da W on 5/3/2011 10:31:55 AM , Rating: 2
Doesn't matter, people want better graphics now. Better GPU. Llano will be this Phenom II piece, at 32 nm, with at least twice the GPU of Sandy Bridge. There will be a differentiation and a choice. Pure CPU performance doesn't mean a lot in this day and age where people believe they will do all their work on a phone.

RE: Deja vu
By Motoman on 5/3/2011 10:36:43 AM , Rating: 2
No, people don't want better graphics now.

People who know what they're doing want better graphics now.

That is such a small subset of the general consumer base that it kinda doesn't matter.

RE: Deja vu
By corduroygt on 5/3/2011 12:07:13 PM , Rating: 2
I doubt that the casual PC user wants or needs better graphics, if the only games they care to play are angry birds, solitaire, and farmville.

RE: Deja vu
By Motoman on 5/3/2011 5:33:18 PM , Rating: 2
...and to the casual user who never does anything more interesting than Angry Birds, the CPU doesn't matter either.

RE: Deja vu
By corduroygt on 5/3/2011 10:04:03 PM , Rating: 2
The same casual user might be compiling code, simulating electrical circuits, working on Excel spreadsheets, while running a VM, which can all tax the CPU heavily. Not to mention regular web pages with lots of flash and javascript need CPU power to render. When you remove the HDD bottleneck with a SSD, CPU performance becomes the number one thing that improves general responsiveness and application performance. Unless you're the graphics wh0re kind of gamer (again, a small minority), consoles are better and more cost effective for gaming.

CPUs are used for everything, whereas GPUs are pretty much only used for games. PC gaming is a small niche of customers, and the GPU is only important for them and no one else. That's why CPU performance is still important, and I wouldn't advise anyone to buy a mobile AMD CPU because they suck pretty bad compared to Intel's offerings.

RE: Deja vu
By 3PoL on 5/4/2011 2:24:30 AM , Rating: 2
PC gaming is a small niche of customers and GPU is only important for them and no one else.

Except for perhaps people who use these GPUs for work related stuff, such as database acceleration, scientific/technical calculations, etc.

RE: Deja vu
By jabber on 5/4/2011 6:21:20 AM , Rating: 2
Yet to come across a customer that uses a VM or compiles code etc.

In 5 years of business, the most taxing task is gaming (maybe one customer in 50 is a power gamer) and video editing (one customer in a 100).

The real world of mass computing needs far, far less than you think. A 2GHz dual core, 2GB of RAM and a speedy HDD is all that most need.

In my experience 75% of the world's computer users are still getting by with a single-core 3GHz P4 with 512MB of RAM.

RE: Deja vu
By Motoman on 5/4/2011 9:23:58 AM , Rating: 2
The same casual user might be compiling code, simulating electrical circuits, working on Excel spreadsheets, while running a VM, which can all tax the CPU heavily.


That's the exact definition of someone who categorically IS NOT a casual user. And a segment of the market that I would be willing to wager is seriously smaller than gamers.

RE: Deja vu
By corduroygt on 5/4/2011 11:32:43 AM , Rating: 2
Speed of rendering at heavy javascript/flash pages (such as newspapers) and general system responsiveness also improves with a faster CPU, especially when you already have an SSD. Far more people would notice that than the few who actually play 3D games on the PC instead of a console.

RE: Deja vu
By Motoman on 5/4/2011 8:20:32 PM , Rating: 2
...granted the same system otherwise, the user looking at Flash stuff etc. as you just noted won't be able to tell any subjective difference between a 2.0GHz dual-core CPU and a 3.0GHz quad-core. If you told them the dual-core CPU was the quad-core, they'd probably declare it was faster...

RE: Deja vu
By corduroygt on 5/4/2011 9:57:34 PM , Rating: 2
As I said before, I noticed a significant difference between a C2D at 1.73 GHz vs. a Sandy Bridge dual core at 2.66 GHz. The SB chews through opening applications and rendering web pages (using Firefox 3.6) much faster than the C2D. Both systems had the same exact SSD and a clean install of Windows 7 64-bit.

RE: Deja vu
By someguy123 on 5/3/2011 1:13:59 PM , Rating: 2
I agree with you about intel's business practices, but using video games as a benchmark for CPU performance is silly.

The new phenom really is a poor performer within its context (a CPU). Considering the price, it's an incredibly poor performer. It doesn't make sense to me to discount CPU progression just because current/consumer software is behind in implementation or focused on GPU.

The reality of it is that these CPUs will find their place sooner or later when it comes to average consumers (remember the days of immense hatred against all things multicore?), and they already fit in very well for those of us that use our computers for rendering. It also doesn't make financial sense to just withhold technology until software has caught up, so of course companies will be pumping out these CPUs into the mainstream, even if the mainstream finds them less useful.

RE: Deja vu
By Jeffk464 on 5/3/2011 11:00:21 AM , Rating: 2
Yup, it's basically a generation behind Sandy Bridge. We will have to wait for Bulldozer to see if AMD will be able to run with Intel. Llano will probably be a good solution for low- to mid-range PCs though, due to its good integrated graphics.

RE: Deja vu
By BSMonitor on 5/3/2011 11:04:59 AM , Rating: 1
It's like P4 vs Athlon XP. In reverse.

Actually Athlon XP was not significantly faster. It wasn't until Athlon 64 that AMD CPUs were actually superior and for a decent period of time.

The XP surpassed the original P4s (Willamette) but was in turn surpassed quite handily by the P4 Northwood CPUs.

It was the Athlon 64 vs Northwood/Prescott where AMD processors were significantly faster across the board.

RE: Deja vu
By bug77 on 5/3/2011 11:19:35 AM , Rating: 1
Actually Athlon XP was not significantly faster.

Indeed, it wasn't. But it used much less power and was clocked about 1GHz lower.
And today Phenom II has the weaker architecture and AMD tries to make up for it by increasing the clock speed.

RE: Deja vu
By twhittet on 5/3/2011 1:52:37 PM , Rating: 2
I would actually be fine with just increasing the clock speed - if they moved to 22nm and made it not be a power hog. 22nm should also make it cheaper to produce.

RE: Deja vu
By theapparition on 5/3/2011 11:57:14 AM , Rating: 1
You are correct. Athlon XPs were actually not a good design; they could easily go into thermal runaway without any sort of protection on them. They ran hot, and people made videos of cooking eggs on them.

Even in the heyday of the Athlon 64, the Pentium 4 Northwoods were very competitive. Only when Intel dropped Prescott on the world did AMD really have the upper hand in MOST benchmarks. Everyone seems to have convenient memory loss; even the Pentium 4 Prescotts beat the latest Athlon 64s in video encoding. I'm not defending Prescott at all, as it was a terrible chip, but it was not quite the lopsided "victory" that some seem to remember.

RE: Deja vu
By cfaalm on 5/3/2011 4:42:04 PM , Rating: 2
The only place where Intel took a real beating was in the multi-socket server benchmarks, where the first incarnations of the Opteron scaled much better than anything Intel could throw at it. Even Intel admitted afterwards (during the Conroe era) that those Opterons were starting to look too interesting to Intel's major customers.

I'd agree that the P4s stayed pretty competitive, albeit at high temperatures. Much more than Phenom I and II. It was sad to see how AMD lost their edge.

RE: Deja vu
By silverblue on 5/3/2011 6:12:05 PM , Rating: 2
The Athlon XP was still faster per-clock than Northwood in most workloads. Northwood just had a much higher clock speed which enabled it to properly compete. You have to remember that, clock-for-clock, P3 was still faster than P4. It took HyperThreading as well as far more memory bandwidth than the XP to actually do something about it, and even then, that wasn't guaranteed.

The Athlon XP couldn't scale much higher than the 3200+, so the Athlon 64 was the natural answer.

uh oh
By Pessimism on 5/3/2011 10:06:20 AM , Rating: 2
AMD needs Bulldozer to be competitive or else they may as well give up and put all their efforts into facing NVIDIA on the GPU front and abandon x86 altogether. Their other option would be to ride the coming ARM wave and produce something innovative there.

RE: uh oh
By Motoman on 5/3/2011 10:20:22 AM , Rating: 2
AMD has competed on price quite successfully during the times when Intel had technically better products. They're in no trouble.

RE: uh oh
By JasonMick on 5/3/11, Rating: 0
RE: uh oh
By Motoman on 5/3/2011 10:43:10 AM , Rating: 2
Yeah, the original K6 was right before I started building computers. The general impression was that it wasn't real great.

I had really good results building machines with K6-2 and K6-3 though - and for a while, even had fun building Cyrix machines. Ah, poor Cyrix, I knew you well...

Then when the original Athlon hit the market, it was like somebody sucker-punched the entire PC industry. Good times!

RE: uh oh
By StevoLincolnite on 5/3/2011 11:38:40 AM , Rating: 2
Ah, poor Cyrix, I knew you well...

When I had my Cyrix PR300 rig... I couldn't wait to see the ass-end of that chip! The floating-point performance was incredibly abysmal!
Probably didn't help matters that it was paired up with only an S3 Virge DX/XG 4MB card either... The end result was that 3D gaming was not fun on that rig.

I stuck it out as long as I could; the original StarCraft was all the game I needed for a couple of years after 1998, then I went with an 800MHz Duron + GeForce 2 GTS system.

Basically, Intel dominated right from the first Pentium.
NEC, IBM, Cyrix, and AMD were all beaten by Intel, and it seemed that would continue with the Pentium 3... until the Athlon came along.
As Intel and AMD raced to the 1GHz mark, Intel introduced the Coppermine core, which integrated the L2 cache on-die and placed Intel at a distinct performance advantage.

Took AMD a little while to do the same with the Athlon, but when they did, they regained the performance crown, then beat Intel to the "magical" 1GHz barrier.
Intel then faltered with Coppermine around the 1.13GHz mark; the Coppermine core was quickly running out of steam... So the Tualatin Pentium 3 arrived, essentially a die-shrink test with some improvements which would later be used in the Northwood chip - and with some enhancements became the basis of the Pentium M, which would go on to influence Conroe's design.

I try to pretend the NetBurst architecture never happened though... It was all just a bad dream... The Stars architecture isn't a replica of that ill-fated mistake in my opinion, not with the great pricing anyway.

RE: uh oh
By Gondor on 5/3/2011 1:33:22 PM , Rating: 2
The Stars architecture isn't a replica of that ill fated mistake in my opinion, not with the great pricing anyway.

But we have no idea what the pricing of Llano is going to be. What we do know so far is that it will be a partially crippled Phenom II (aka Athlon II), coupled to a rather modest GPU (something HD5550ish, methinks) which will be eating into the CPU's thermal budget in games, so we can most likely kiss easy 3.5+ GHz overclocking goodbye, and this combination will probably get stuck with 128-bit system memory bandwidth at best, requiring a new motherboard on top of everything (ergo: the combination is unlikely to get the same great pricing Phenom II did).

AMD was able to milk the Phenom II cow for so long because it ran on existing motherboards and replacement boards were available in every single price segment, catering to all profiles of users from the bottom up.

This advantage is gone with the new socket, so they better bring something really impressive to the table price-wise; otherwise people aren't very likely to want to downgrade their existing quad-core Athlons/Phenoms II with HD5670 or whatever graphics and actually have to pay for that.

AMD is going to have to compete with their own products with Llano, and they've left themselves very little wiggle room in the past few years. The best 1st-generation Llano will be competing against what?

+ Athlon II x4 ($85 for 2.8 GHz model, $100 for the 3.0 GHz one)
+ HD5570 ($60 for 1 GB DDR3 model which is the closest thing I can find to Llano's 400 SP GPU)
- 1 GB of DDR3 taken away from the system (comes onboard with the HD5570)

Is their top-of-the-line Llano likely to cost less than ~$140?

As the release day moves closer I'm more and more convinced that first generation of Llano will not be gamers' product at all so Bulldozer better be just as good as they say it is (= 50% faster than Deneb - hopefully they meant that per core rather than for the whole 8-core CPU). Llano will be great for OEM systems and laptops.

RE: uh oh
By Da W on 5/3/2011 2:21:55 PM , Rating: 2
I saw somewhere that the entry model will be 2GHz (with all your specs), drawing merely 45W of power. This will be a laptop chip, so it's not all bad. I don't see Llano as a chip that I personally want, but it's the first time that AMD will have a competitive laptop offering.

RE: uh oh
By Motoman on 5/3/2011 5:37:46 PM , Rating: 2
Um, at no point would I have suggested a Cyrix anything for someone who wanted to play games.

If you were typical ma & pa and just needed interwebz and email, Cyrix stuff was fine.

Even the C3 stuff after Via bought them really wasn't applicable to anything other than casual use. The thing is, the vast majority of users are casual users...and don't benefit from jawsome hardware.

RE: uh oh
By 91TTZ on 5/3/2011 1:56:16 PM , Rating: 2
By the time the K6 came out I'd been building computers for several years. The K6 was a big hit since it fit in the Socket 7 motherboards that were popular at the time and they outperformed all of Intel's socket 7 chips, and even the Pentium Pro. The K6 really was the chip that put AMD on the map as a serious contender.

Also, sites like Anandtech were abuzz with people who were surprised by the power of the chip.

RE: uh oh
By 91TTZ on 5/3/2011 1:51:35 PM , Rating: 2
The K6 was a very good product from AMD. It was probably the first product they had that outperformed anything that Intel had at the time. While Intel quickly countered with the P2, the K6 established AMD as a company that didn't just make clones of Intel products; they could actually make top-notch CPUs.

RE: uh oh
By silverblue on 5/3/2011 6:16:19 PM , Rating: 2
The K6-III was extremely good. However, it was also extremely expensive to manufacture.

RE: uh oh
By jabber on 5/4/2011 7:32:33 AM , Rating: 2
I loved my K6-III 400MHz. Served me very well and was the first CPU I bought for my first full PC build. It was still cheaper than a 300-400MHz Pentium chip.

It ended up in a laptop that had a P200MMX in it (OC'd to 233MHz). I put it in there; it only ran at 233MHz, but it was faster than the Pentium.

RE: uh oh
By BSMonitor on 5/3/11, Rating: 0
RE: uh oh
By silverblue on 5/3/2011 6:43:47 PM , Rating: 2
Llano is a 32nm product with power gating which means it's already going to be cooler and far more frugal than a Phenom II as far as the CPU is concerned. The X4's problem is that AMD never brought out Zosma, which would've meant a power gated Deneb which utilised Turbo CORE. Had they actually done so, I think they'd be in a slightly better position now with the Phenom II line. They can partly rectify that with Llano.

I don't see how you can call K10 a failure. Yes, the original Phenom had the TLB bug, but to say it was bested by the Duos is a bit unfair. For a start, it was lower clocked, and being triple and quad core, it was meant for multitasking workloads. It goes without saying that software is generally optimised for Intel CPUs anyway (*cough* compiler naughtiness *cough*). In any case, all CPUs have bugs; Core 2 had loads of them. Phenom II was in a way rescuing a flawed design, but it was hardly unsuccessful. AMD's been very good at getting rid of its less than perfect stock, it's not as if they've gone to waste.

I think Core 2's speed owed a lot to the amount of fast L2 cache that they had. As stated before, Phenom II's L3 cache isn't full speed, and being exclusive isn't a major bonus.

Most people will find that their graphics cards (even if they have graphics cards) will be weaker than Llano. Remember that the vast majority of graphics card sales are NOT performance models, and that until recently, Intel had about half of the total graphics market with products as unremarkable as the GMA 500.

I'm not 100% sure we're getting exactly the same performance from Llano as we did from Deneb or even Propus. For a start, each core will have 1MB of L2 cache and not the 512KB seen with Deneb and Propus. AMD has been using Turbo CORE for a while now and it could very well make it onto Llano, meaning faster performance. The memory controller is likely to have been worked upon, especially considering Intel's lead in this traditionally strong area for AMD. Basically, some of Phenom II's deficiencies are likely to have been addressed, though knowing AMD, they likely don't have the resources to go mad with K10.5, either that or the time to have brought it out well before now. I don't think there's a "maybe" that Llano's GPU will outpace Sandy Bridge's, but you also have to consider that a low-to-mid-range discrete GPU at 32nm isn't going to be very expensive or power-hungry. There's no PCB to worry about, for a start.

I want to see this thing in Hybrid Crossfire, and the potential benefit of Sideport memory if they choose to implement it (512MB+ please AMD).

RE: uh oh
By yomamafor1 on 5/3/2011 11:46:51 PM , Rating: 2
I don't see how you can call K10 a failure. Yes, the original Phenom had the TLB bug, but to say it was bested by the Duos is a bit unfair. For a start, it was lower clocked, and being triple and quad core, it was meant for multitasking workloads.

I think he was referring to the fact that Barcelona was supposed to outperform C2Qs by 40% across the board. But we all know that the first K10s were not only very late to the market, they also delivered only 80~90% of their competitor's performance with generally 10~20% higher power consumption.

I wouldn't necessarily say the K10 family was a failure, but it is difficult to argue that Barcelona, at least the versions designated for consumers, wasn't a failure. Deneb (K10.5) was what K10 should have been when AMD launched it.

RE: uh oh
By silverblue on 5/4/2011 7:11:28 AM , Rating: 2
Good points. The 40% boast is rather laughable. I only mentioned Duos because he did, though. :)

RE: uh oh
By bug77 on 5/3/2011 11:00:18 AM , Rating: 2
I'm not holding my breath for Bulldozer either.

The last time AMD was completely mum about an upcoming product was before they launched Phenom. And we all know how that turned out.

RE: uh oh
By BSMonitor on 5/3/2011 11:34:03 AM , Rating: 2
Great point. K10 is a power hog now, and they are going to add up to 400 ATI SPs to the die.

Clock speeds of ~ 2GHz on the CPU cores?

What did Phenom launch at, 1.8GHz?

Forget the names of the companies. The engineers at Intel are simply kicking ass right now.

RE: uh oh
By StevoLincolnite on 5/3/2011 11:47:18 AM , Rating: 2
They get a die-shrink to 32nm which will assist in reducing power consumption, not to mention additional Turbo and Cool'N'Quiet enhancements.

The turning point for "Fusion" will be when AMD can offload some x86 processing to the GPU portion of the CPU.
Stuff like floating-point calculation is actually well suited to being processed on the GPU; that could massively increase performance and hopefully reduce power consumption.
Such a thing won't be implemented just yet, however.
At the moment we're just seeing an evolution of IGPs, in that they're being moved from the chipset onto the CPU, but AMD's vision is for the CPU and GPU to work together on tasks seamlessly.
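The workload StevoLincolnite describes is the classic data-parallel case. As a hypothetical sketch (plain NumPy standing in for the GPU's parallel lanes, since Llano-era tooling isn't assumed here), SAXPY is the textbook floating-point operation that maps well onto GPU stream processors because every element is independent:

```python
import numpy as np

def saxpy(a, x, y):
    # SAXPY (y = a*x + y): each output element depends only on the
    # corresponding inputs, so all elements can be computed in parallel.
    # This vectorized form is the shape of work a fused CPU+GPU part
    # could dispatch to its on-die stream processors.
    return a * x + y

x = np.arange(4, dtype=np.float32)   # [0., 1., 2., 3.]
y = np.ones(4, dtype=np.float32)     # [1., 1., 1., 1.]
print(saxpy(2.0, x, y))              # [1. 3. 5. 7.]
```

The same loop written element-by-element on the CPU does identical arithmetic; the win comes purely from how many lanes execute it at once.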

RE: uh oh
By Jeffk464 on 5/3/2011 11:57:13 AM , Rating: 1
K10 power usage should get a lot better with Llano. It's dropping to 32nm and is supposed to have good power gating. I think it will be a good solution for casual users - productivity and multimedia. The fact of the matter is that for most users, CPU power has outpaced their needs, especially since MS stopped making each version of Windows a bigger hog than the last.

RE: uh oh
By cfaalm on 5/3/2011 5:05:04 PM , Rating: 2
Your name says it all :) I don't think that AMD is going to leave x86 anytime soon and call it quits. If Bulldozer is all they claim it is, the heat is on once more in PC-land.

If AMD had any substantial ARM efforts going, they wouldn't have fired Dirk Meyer, obviously.

While AMD getting into ARM isn't unthinkable, I believe they'd be best served by a good focus on their CPU/GPU stuff right now. So much ground to gain still. Intel will someday produce a better (on-die) graphics chip, no doubt. They'd better be ahead of that game. If they do, AMD can get into a position to acquire an ARM designer like they did with ATI. It may be interesting to see how, and if, ARM fuses with x86.

There is no "bad taste in consumer's mouths"
By Beenthere on 5/3/2011 10:31:30 AM , Rating: 1
AMD provides excellent products at competitive prices. Few people need the latest and greatest over-priced toy-of-the-week PC hardware. The 3.7 GHz Phenom II X4 is for those who want to play with OC'ing. There isn't a snowball's chance in Hell of any InHell product ever being purchased by me.

RE: There is no "bad taste in consumer's mouths"
By AssBall on 5/3/2011 12:41:37 PM , Rating: 2
The overclocking headroom isn't even there.

InHell? Sounds like there is a bad taste in your mouth.

RE: There is no "bad taste in consumer's mouths"
By Beenthere on 5/3/2011 1:43:01 PM , Rating: 2
InHell has been convicted numerous times of violating anti-trust laws, and numerous times of tax evasion. IMO they are an evil, greedy company that uses its ill-gotten monopoly to exploit consumers. No ethical person could support a company that operates like this.

By silverblue on 5/3/2011 6:50:37 PM , Rating: 2
Unfortunately, they aren't the only company ever whose behaviour has been called into question and they certainly won't be the last. Most companies will try something if they think they can get away with it.

By AssBall on 5/3/2011 10:10:33 AM , Rating: 2
It also gets blown away by Intel's lowest end Nehalem... release September 2009

I might be wrong, but I thought the first Nehalems were out in '08.

RE: Nehalem
By JasonMick on 5/3/11, Rating: 0
RE: Nehalem
By MrTeal on 5/3/2011 10:43:47 AM , Rating: 2
Intel had Lynnfield and Bloomfield in the market concurrently. Bloomfield and LGA1366 were the higher-end socket, and it was replaced by Gulftown. Lynnfield was replaced by Clarkdale.

Lynnfield wasn't a refresh; it was extending the first generation of Nehalem into the mainstream marketplace with fewer memory channels. All the Nehalem chips have the memory controller on die.

RE: Nehalem
By MrTeal on 5/3/2011 10:35:47 AM , Rating: 2
They were.
It also gets blown away by Intel Corp.'s (INTC) lowest end Nehalem (45 nm) processor, the Core i7-860. Let that sink in for a minute -- Intel's slowest quad-core i7 processor, released in September 2009, can still beat AMD's fastest quad-core processor, released in May 2011.

The i7-860 isn't Intel's lowest end Nehalem chip. There's plenty of i5s and i3s that are slower. It's not even the lowest end i7, the i7-620UM would be cheaper and considerably slower than it.

Good for upgraders
By Slaimus on 5/3/2011 11:59:28 AM , Rating: 2
There are plenty of people with AM2/AM3 boards with dual cores that would get a very nice upgrade with this.

My same-era Core 2 Duo had very limited upgrade options, and I ended up paying a lot for a slower Core 2 Quad. The Q9550 is still $289 at Newegg!

RE: Good for upgraders
By MrTeal on 5/3/2011 12:57:21 PM , Rating: 2
Why would you buy a Q9550 for $289 when for $325 you could get an i5-2500K and a $100 P67 mobo that would blow it out of the water? If you don't care about overclocking, you could even get an H67 mobo + i5-2400 for $265, and it would still be faster than the Q9550 on every benchmark on AT Bench.

RE: Good for upgraders
By Slaimus on 5/3/2011 1:37:46 PM , Rating: 2
That was 8 months ago. I didn't want to get a new CPU/RAM/mobo and reinstall the OS.

i5 is not cheaper
By Shadowmaster625 on 5/4/2011 10:20:34 AM , Rating: 2
... if you have to buy a new motherboard

Shark attack
By smilingcrow on 5/4/2011 6:22:47 PM , Rating: 2
Hey, the Fonz only drives the fastest: Intel CPUs.
They give me enough speed to jump sharks.

And so it goes
By YashBudini on 5/3/11, Rating: 0