


A wounded AMD looks to release its first 12-core processor in Q1 2010

Many fondly recall the megahertz race -- the '90s phenomenon in which Advanced Micro Devices and Intel raced to ship the highest-clocked processor.  Over time, designers realized such a blind race was foolish, sacrificing far too much efficiency and generating far too much heat.  Now a similar race is heating up over the number of cores in a desktop processor, but only time will tell whether it is the path of good design, or another blind charge.

Intel already has a four-core 45 nm desktop processor (Nehalem/i7) and a six-core server processor (Xeon) on the market.  It plans to roll out an eight-core server processor (Xeon) in Q4 2009. 

However, it may fall behind in the core race (though still presumably ahead in die shrinks) if AMD is able to deliver on its planned release schedule.  AMD plans to release its six-core 45 nm processor, codenamed Istanbul, in June.  The chip, like Intel's six-core beast, is geared toward the server market. 

But that's far from AMD's biggest news.  AMD has announced plans to beat Intel to 12 cores, releasing both 8- and 12-core processors, codenamed Magny-Cours, in Q1 2010.  It has also announced that in 2011 it will roll out its 32 nm Bulldozer core, which will feature up to 16 cores and run on the new Sandtiger architecture.  In short -- AMD plans to beat Intel in the core race.

Patrick Patla, an AMD vice president and general manager of its server unit, states, "We are not ducking performance.  We want to do top-line performance with bottom-line efficiency."

Intel, meanwhile, remains confident that it can deliver equivalent performance with fewer cores via Hyper-Threading.  Like NVIDIA, Intel is pursuing a slightly more monolithic design with fewer but stronger processor cores.  Intel spokesman Nick Knupffer states, "We are confident we will stay far ahead on performance--and with fewer cores--do so in a more cost-effective, manufacturing-friendly manner.  This will be the first time in history where less is more."

Even if AMD can beat Intel in performance, it will still be in dire financial straits until it can translate that performance into sales.  AMD took another big loss in its recently reported fiscal quarter, the latest in several years spent mostly in the red. 






RE: Just as pointless as the mhz war
By therealnickdanger on 4/23/2009 9:48:31 AM , Rating: 5
The way I see it, there is no "magic bullet" and each pursuit of such a bullet brings about great innovation, even if it is not the intended one. The MHz-Warz brought about amazing per-clock efficiencies as people pulled away from the space-heater mentality. This birthed the multi-core fad. The current Core-War will likely bring about another innovation that will steer the next fad. As long as we keep seeing 40%+ performance improvements with each new generation of chips, I don't really care what they do!


RE: Just as pointless as the mhz war
By DeepBlue1975 on 4/23/2009 10:24:09 AM , Rating: 3
I don't see a practical application of that many cores in desktop environments RIGHT NOW.

But some years into the future, natural-like speech recognition and truly complex AI algorithms making use of all those cores as part of a built-in neural net can pave the road to really cool applications.

The term "supercomputer on a chip" comes to mind. :D


RE: Just as pointless as the mhz war
By TomZ on 4/23/2009 10:39:53 AM , Rating: 3
quote:
I don't see a practical application of that many cores in desktop environments RIGHT NOW.
These processors will target server applications, where they will find synergy with trends in that market towards virtualization and energy (operating cost) reduction.

I agree with you - on the desktop these will probably only find use with pretty specialized applications for the first few years.


RE: Just as pointless as the mhz war
By MrPeabody on 4/23/2009 11:07:12 AM , Rating: 5
quote:
These processors will target server applications, where they will find synergy with trends in that market towards virtualization and energy (operating cost) reduction.


Gaah! The buzzwords! I see your point, but I just can't take the buzzwords!


RE: Just as pointless as the mhz war
By TomZ on 4/23/2009 11:13:00 AM , Rating: 3
Possible future career in marketing?


By Silver2k7 on 4/24/2009 10:57:22 AM , Rating: 2
DirectX 11 will bring better multi-core support for gaming... eventually the 6- and 8-core designs will trickle down to desktops.


By DeepBlue1975 on 4/23/2009 4:17:45 PM , Rating: 2
Of course these are server parts, but I think it's not gonna take so much longer till 6 or 8 cores come to desktop once those 12 core monsters are introduced to the server market.

Neural net on a chip... Sounds really exciting as a possible future. CPUs would have to start being called e-brains instead :D


RE: Just as pointless as the mhz war
By croc on 4/23/2009 7:26:23 PM , Rating: 2
I'm not sure how you are defining 'synergy'. What makes multi-core CPUs beneficial in the server space is reduced overhead. This comes in several flavours: reduced costs in energy, reduced footprint, and last but not least, reduced costs in licensing. A per-core Oracle license is not cheap, but Oracle gives a discount on CPUs with multiple cores per socket. VMware used to license by the socket, but I am sure that has changed. Citrix also licenses per application seat; how you set up your servers is up to you.

So if you add up all of the cost savings, it's huge to a company's OPEX accounting.
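For a rough sense of how those savings add up, here is a minimal back-of-the-envelope sketch; every dollar figure and server count below is a hypothetical placeholder, not a real Oracle, VMware, or Citrix price.

```python
# Hypothetical OPEX comparison: 12 older single-socket boxes vs. 2 multi-core boxes.
# All figures are illustrative assumptions only.
def annual_opex(servers, energy, rack_space, licenses):
    """Yearly cost of a fleet: per-server energy + footprint + licensing."""
    return servers * (energy + rack_space + licenses)

before = annual_opex(servers=12, energy=1200, rack_space=800, licenses=10000)
after = annual_opex(servers=2, energy=1800, rack_space=800, licenses=10000)  # bigger boxes draw more power
print(f"hypothetical yearly savings: ${before - after:,}")
```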


By atlmann10 on 4/24/2009 4:07:50 PM , Rating: 1
The thing that is holding the computer market back performance-wise now is not the hardware, which is way, way ahead of the software. Say Microsoft coded Windows to use a specific core for all background ops and another for all OS graphics ops, etc. Then they programmed Office to use different CPU cores for different things, and game programmers, network apps, net browsing apps and everything else were coded to make use of CPUs in such a manner as well. You could run Windows on 1/4 (or however many cores you had) of the CPU's ability. This would enable a 4-core 1 GHz chip to outrun a 4 GHz CPU today at 1/4 of the energy usage and heat production, thereby needing less energy to cool it as well. We would also gain 4 times the CPU capability on a 4-core processor, right?
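The core-per-task idea above can already be approximated today with CPU affinity. A minimal sketch, assuming Linux and Python 3; the core numbers are arbitrary examples, and real OS schedulers balance work far more intelligently than this:

```python
import multiprocessing as mp
import os

def background_work():
    os.sched_setaffinity(0, {1})          # pin this worker to core 1 (arbitrary choice)
    print("worker runs on cores:", os.sched_getaffinity(0))

if __name__ == "__main__":
    os.sched_setaffinity(0, {0})          # keep the main process on core 0
    worker = mp.Process(target=background_work)
    worker.start()
    worker.join()
    print("main runs on cores:", os.sched_getaffinity(0))
```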


By CityZen on 4/26/2009 11:12:01 PM , Rating: 2
quote:
I don't see a practical application of that many cores in desktop environments RIGHT NOW.


Real time ray tracing, anyone?


RE: Just as pointless as the mhz war
By DCstewieG on 4/23/2009 10:54:08 AM , Rating: 5
quote:
The MHz-Warz brought about amazing per-clock efficiencies as people pulled away from the space-heater mentality.

Actually I'd argue the MHz war delayed efficiency. It wasn't until AMD started killing the P4 with the Athlon 64 (even with the XP) that Intel moved away from the relatively inefficient NetBurst architecture, which seemed designed purely for higher clock speeds. It was practically a wasted generation for Intel in terms of efficiency. Luckily for them they were able to roar back, ironically enough by working from their previous generation architecture!


RE: Just as pointless as the mhz war
By kkwst2 on 4/23/2009 1:30:49 PM , Rating: 2
Maybe. I think the real lesson is that it's competition that spurs innovation in the marketplace.

But it probably wasn't wasted. We learned a lot about scaling frequencies and what doesn't work. Hyper-Threading was developed then, and even if its current implementation is not exactly the same, I'm sure they're leveraging that experience.

Going back to things that worked well in the past is not that ironic, in fact it's pretty expected. It's part of the learning process. It's hard to learn without making mistakes and having to back up sometimes.

Now, certainly Intel stayed with the NetBurst architecture too long, probably fueled by non-technological forces. But that doesn't make the period wasted.

Let's just hope that AMD stays competitive enough to keep Intel pushing the envelope. For my modeling applications, the difference in performance between the previous generation Xeon and Nehalem is quite remarkable.


RE: Just as pointless as the mhz war
By 16nm on 4/25/2009 10:11:22 AM , Rating: 2
No, I think the lesson is that when AMD had the opportunity to increase capacity to meet the high demand for their chips and to better compete against Intel, they should have taken it instead of buying a low profit graphics hardware company. The ATI purchase could have waited. Intel obviously couldn't have.


RE: Just as pointless as the mhz war
By kkwst2 on 4/26/2009 8:07:46 PM , Rating: 2
Except that has nothing to do with the conversation, which was on innovation and scaling, not on business decisions.

That being said, their window was pretty tight. You can't ramp capacity overnight. By the time they ramped up, the window would have almost been closed and they would have been sitting on even more overhead than they have.


By lagitup on 4/23/2009 5:18:49 PM , Rating: 2
Hang on. Think about a virtualized server environment, which is likely what this chip is going to be targeted at. You could give each of 12 virtual machines its own private execution core (that's a bit perfect-worldish, but the point remains) and replace the 24 PSUs (redundancy ftw) with 2, cutting down power costs; replace the 12 sets of hard drives with 1, reducing overhead thanks to the need for only one set of mechanical parts; etc., etc. At least in the enterprise/server space, this is probably a good thing.

Imagine how much cheaper game hosting would be if they could run 3n instances of the game (where n is however many they can run on a quad-core now) on a single physical machine, as opposed to just n.


RE: Just as pointless as the mhz war
By omnicronx on 4/23/2009 1:05:25 PM , Rating: 2
quote:
As long as we keep seeing 40%+ performance improvements with each new generation of chips, I don't really care what they do!
But that's exactly his point: with current tech this just is not possible. In fact we will reach a point where, with current tech, too many cores will actually have negative effects.

Just go Google it: with current memory technology, more than 8 cores can actually result in a negative impact. The memory bandwidth just is not there, especially in the desktop environment. It will be interesting to see the results of these 12-core variants, as my guess is they will be limited to the server market. Desktop users just won't see an improvement.
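One way to see the bandwidth ceiling described above is to run the same memory-heavy task on more and more worker processes and watch throughput flatten. A minimal sketch; the buffer size and worker counts are arbitrary assumptions, and the results depend entirely on the machine:

```python
import multiprocessing as mp
import time

def memory_bound_task(_):
    # Allocate and stride through a large buffer so memory, not arithmetic, is the bottleneck.
    data = bytearray(64 * 1024 * 1024)   # 64 MB per worker (assumed size)
    return sum(data[::4096])

if __name__ == "__main__":
    for workers in (1, 2, 4, 8, 12):
        start = time.time()
        with mp.Pool(workers) as pool:
            pool.map(memory_bound_task, range(workers))
        print(f"{workers:2d} workers: {workers / (time.time() - start):.2f} tasks/sec")
```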


RE: Just as pointless as the mhz war
By kkwst2 on 4/23/2009 1:43:04 PM , Rating: 3
But that's exactly HIS point. He's saying that once we hit the core wall it will force some other innovation in how to scale performance and then we'll beat that horse to death and continue to improve performance.

That's how it works. We keep pushing current technology until there's a need to do something differently. It's only when we hit the wall that people really start trying to think outside the box and more importantly that the people making the architecture decisions (which usually aren't the same as the people innovating) start listening.

They've been saying for years that we're going to hit a wall, but each time someone comes up with a way around it. Now maybe we will hit one eventually, but I'm betting it won't be next year.

Memory bandwidth has been dramatically improved with Nehalem. It should scale pretty well to 12 cores. My applications scale pretty well to 8 cores on Nehalem (Xeon) - much better than the previous architecture. It's certainly not like they don't understand the problems of core scaling and they're doing things to try to solve those issues.
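The scaling limit both posters are circling is captured by Amdahl's law: the serial fraction of a workload caps the speedup no matter how many cores you add. A small illustration of my own; the parallel fractions below are assumed values, not measurements:

```python
def amdahl_speedup(cores, parallel_fraction):
    """Ideal speedup when only parallel_fraction of the work can be spread across cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for p in (0.80, 0.95, 0.99):
    line = ", ".join(f"{n} cores -> {amdahl_speedup(n, p):.1f}x" for n in (4, 8, 12, 16))
    print(f"parallel fraction {p:.2f}: {line}")
```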


RE: Just as pointless as the mhz war
By omnicronx on 4/23/2009 1:48:23 PM , Rating: 2
And I agree, but we won't see the benefit in the desktop world for some time to come. We've already had the MHz war, then the war of the cores; now both Intel and AMD have on-die memory controllers, and they are kind of running out of options. My guess is an on-die memory controller per core, not shared per chip, but that won't reach the desktop market for some time - it's just too expensive (nor is it currently needed).
quote:
Memory bandwidth has been dramatically improved with Nehalem. It should scale pretty well to 12 cores.
How many times must I say it's not merely the bandwidth; latency gets disgustingly high the more cores you add. Just think about how complex it is to split a job between four cores using a shared memory controller, let alone 12.


RE: Just as pointless as the mhz war
By kkwst2 on 4/26/2009 8:44:57 PM , Rating: 2
quote:
How many times must I say it's not merely the bandwidth; latency gets disgustingly high the more cores you add.


How many times are you going to be wrong? That's a generalization. Sometimes it's true. Obviously in general it's better to have low latency, but how important memory latency is depends on the task.

And Nehalem has significantly lower latency (under most situations) and higher bandwidth, so my statement holds.

It also depends on the task how complex it is to split work between cores. My application scales quite well to 50-100 cores.

People keep talking about either desktop or server applications. There are a LOT of technical people utilizing 8+ cores all the time in both "desktop" and cluster computing. It's certainly a niche market compared to servers, but I guarantee you it's still a lot of money. These Nehalem chips are revolutionizing how much FPU power you can get out of a small cluster.


"I f***ing cannot play Halo 2 multiplayer. I cannot do it." -- Bungie Technical Lead Chris Butcher













