A wounded AMD looks to release its first 12-core processor in Q1 2010

Many fondly recall the megahertz race -- the 90s phenomenon in which Advanced Micro Devices and Intel raced to have the highest-clocked processor.  Over time, designers realized such a blind race was foolish, and that it was conceding far too much in efficiency and heat.  Now a similar race is heating up over the number of cores in a desktop processor, but only time will tell whether this race is the path of good design or another blind charge.

Intel already has a four-core 45 nm desktop processor (Nehalem/i7) and a six-core server processor (Xeon) on the market.  It plans to roll out an eight-core server processor (Xeon) in Q4 2009. 

However, it may fall behind in the core race (though still presumably ahead in die shrinks) if AMD delivers on its planned release schedule.  AMD plans to release its six-core 45 nm processor, codenamed Istanbul, in June.  The chip, like Intel's six-core beast, is geared for the server market.

But that's far from AMD's biggest news.  AMD has announced plans to beat Intel to 12 cores, releasing both 8- and 12-core processors, codenamed Magny-Cours, in Q1 2010.  It has also announced that in 2011 it will roll out its 32 nm Bulldozer core, featuring up to 16 cores and running on the new Sandtiger architecture.  In short -- AMD plans to beat Intel in the core race.

Patrick Patla, an AMD vice president and general manager of its server unit, states, "We are not ducking performance.  We want to do top-line performance with bottom-line efficiency."

Intel, meanwhile, remains confident that it can deliver equivalent performance with fewer cores via Hyper-Threading.  Like NVIDIA, Intel is pursuing a slightly more monolithic design with fewer but stronger processor cores.  Intel spokesman Nick Knupffer states, "We are confident we will stay far ahead on performance--and with fewer cores--do so in a more cost-effective, manufacturing-friendly manner.  This will be the first time in history where less is more."

Even if AMD can beat Intel in performance, it will still be in dire financial straits until it can translate that performance into sales.  AMD took another big loss in its most recently reported fiscal quarter, the latest in several years spent mostly in the red.




Just as pointless as the mhz war
By SublimeSimplicity on 4/23/09, Rating: 0
RE: Just as pointless as the mhz war
By therealnickdanger on 4/23/2009 9:48:31 AM , Rating: 5
The way I see it, there is no "magic bullet" and each pursuit of such a bullet brings about great innovation, even if it is not the intended one. The MHz-Warz brought about amazing per-clock efficiencies as people pulled away from the space-heater mentality. This birthed the multi-core fad. The current Core-War will likely bring about another innovation that will steer the next fad. As long as we keep seeing 40%+ performance improvements with each new generation of chips, I don't really care what they do!


RE: Just as pointless as the mhz war
By DeepBlue1975 on 4/23/2009 10:24:09 AM , Rating: 3
I don't see a practical application of that many cores in desktop environments RIGHT NOW.

But some years into the future, natural-like speech recognition and truly complex AI algorithms making use of all those cores as part of a built-in neural net can pave the road to really cool applications.

The term "supercomputer on a chip" comes to mind. :D


RE: Just as pointless as the mhz war
By TomZ on 4/23/2009 10:39:53 AM , Rating: 3
quote:
I don't see a practical application of that many cores in desktop environments RIGHT NOW.
These processors will target server applications, where they will find synergy with trends in that market towards virtualization and energy (operating cost) reduction.

I agree with you - desktop usage will probably be limited to pretty specialized applications for the first few years.


RE: Just as pointless as the mhz war
By MrPeabody on 4/23/2009 11:07:12 AM , Rating: 5
quote:
These processors will target server applications, where they will find synergy with trends in that market towards virtualization and energy (operating cost) reduction.


Gaah! The buzzwords! I see your point, but I just can't take the buzzwords!


RE: Just as pointless as the mhz war
By TomZ on 4/23/2009 11:13:00 AM , Rating: 3
Possible future career in marketing?


By Silver2k7 on 4/24/2009 10:57:22 AM , Rating: 2
DirectX 11 will bring better multi-core support for gaming.. eventually the 6- and 8-core designs will trickle down to desktops.


By DeepBlue1975 on 4/23/2009 4:17:45 PM , Rating: 2
Of course these are server parts, but I don't think it's gonna take much longer for 6 or 8 cores to come to the desktop once those 12-core monsters are introduced to the server market.

Neural net on a chip... Sounds really exciting as a possible future. CPUs would have to start being called e-brains instead :D


RE: Just as pointless as the mhz war
By croc on 4/23/2009 7:26:23 PM , Rating: 2
I'm not sure how you are defining 'synergy'. What makes multi-core CPUs beneficial in the server space is reduced overhead. This comes in several flavours: reduced energy costs, reduced footprint, and last but not least, reduced licensing costs. A per-core Oracle license is not cheap, but Oracle gives a discount on CPUs with multiple cores per socket. VMWare used to license by the socket, but I am sure that has changed. Citrix also licenses per application-seat; how you set up your servers is up to you.

So if you add up all of the cost savings, it's huge to a company's OPEX accounting.


By atlmann10 on 4/24/2009 4:07:50 PM , Rating: 1
The thing that is holding the computer market back performance-wise now is not the hardware, which is way, way ahead of the software. Say Microsoft coded Windows to use a specific core for all background ops, another for all OS graphics ops, and so on, and then Office, games, network apps, and browsers were coded to spread their work across CPU cores in the same manner. You could then run Windows itself on a quarter (or 1/n, for however many cores you had) of the CPU's ability. This would enable a 4-core 1 GHz chip to outrun a 4 GHz CPU of today at a quarter of the energy usage and heat production, thereby needing less energy to cool it as well. We would also gain four times the CPU capability on a 4-core processor, right?
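
A minimal sketch of that core-dedication idea, assuming Linux (os.sched_setaffinity exists only there) and an arbitrary choice of core 0:

```python
import os

# Pin the current process (pid 0 = ourselves) to core 0, leaving the
# remaining cores free for other workloads. Linux-only; on Windows you
# would need SetProcessAffinityMask via ctypes or pywin32 instead.
os.sched_setaffinity(0, {0})

print("now restricted to cores:", os.sched_getaffinity(0))
```

In practice the OS scheduler already balances threads across cores, so pinning like this only pays off for carefully partitioned workloads.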


By CityZen on 4/26/2009 11:12:01 PM , Rating: 2
quote:
I don't see a practical application of that many cores in desktop environments RIGHT NOW.


Real-time ray tracing, anyone?


RE: Just as pointless as the mhz war
By DCstewieG on 4/23/2009 10:54:08 AM , Rating: 5
quote:
The MHz-Warz brought about amazing per-clock efficiencies as people pulled away from the space-heater mentality.

Actually I'd argue the MHz war delayed efficiency. It wasn't until AMD started killing the P4 with the Athlon 64 (and even the XP) that Intel moved away from the relatively inefficient NetBurst architecture, which seemed designed purely for higher clock speeds. It was practically a wasted generation for Intel in terms of efficiency. Luckily for them they were able to roar back, ironically enough by working from their previous-generation architecture!


RE: Just as pointless as the mhz war
By kkwst2 on 4/23/2009 1:30:49 PM , Rating: 2
Maybe. I think the real lesson is that it's competition that spurs innovation in the marketplace.

But it probably wasn't wasted. We learned a lot about scaling frequencies and what doesn't work. Hyperthreading was developed then, and even if its current implementation is not exactly the same, I'm sure they're leveraging that experience.

Going back to things that worked well in the past is not that ironic; in fact, it's pretty expected. It's part of the learning process. It's hard to learn without making mistakes and having to back up sometimes.

Now, certainly Intel stayed with the NetBurst architecture too long, probably fueled by non-technological forces. But that doesn't make the period wasted.

Let's just hope that AMD stays competitive enough to keep Intel pushing the envelope. For my modeling applications, the difference in performance between the previous generation Xeon and Nehalem is quite remarkable.


RE: Just as pointless as the mhz war
By 16nm on 4/25/2009 10:11:22 AM , Rating: 2
No, I think the lesson is that when AMD had the opportunity to increase capacity, to meet the high demand for their chips and better compete against Intel, they should have taken it instead of buying a low-profit graphics hardware company. The ATI purchase could have waited. Intel obviously couldn't have.


RE: Just as pointless as the mhz war
By kkwst2 on 4/26/2009 8:07:46 PM , Rating: 2
Except that has nothing to do with the conversation, which was on innovation and scaling, not on business decisions.

That being said, their window was pretty tight. You can't ramp capacity overnight. By the time they ramped up, the window would have almost been closed and they would have been sitting on even more overhead than they have.


By lagitup on 4/23/2009 5:18:49 PM , Rating: 2
Hang on. Think about a virtualized server environment, which is likely what this chip is targeted at. You could give each of 12 virtual machines its own private execution core (that's a bit perfect-worldish, but the point remains) and replace the 24 PSUs (redundancy ftw) with 2, cutting down power costs; replace the 12 sets of hard drives with 1, reducing overhead thanks to needing only one set of mechanical parts; etc. At least in the enterprise/server space, this is probably a good thing.

Imagine how much cheaper game hosting would be if they could run 3n instances of the game on a single physical machine (where n is however many they can run on a quad-core now), as opposed to just n?


RE: Just as pointless as the mhz war
By omnicronx on 4/23/2009 1:05:25 PM , Rating: 2
quote:
As long as we keep seeing 40%+ performance improvements with each new generation of chips, I don't really care what they do!
But that's exactly his point: with current tech this just is not possible. In fact, we will reach a point where, with current tech, too many cores will actually have negative effects.

Just go Google it: with current memory technology, over 8 cores can actually result in a negative impact. The memory bandwidth just is not there, especially in the desktop environment. It will be interesting to see the results of these 12-core variants; my guess is they will be limited to the server market. Desktop users just won't see an improvement.


RE: Just as pointless as the mhz war
By kkwst2 on 4/23/2009 1:43:04 PM , Rating: 3
But that's exactly HIS point. He's saying that once we hit the core wall it will force some other innovation in how to scale performance and then we'll beat that horse to death and continue to improve performance.

That's how it works. We keep pushing current technology until there's a need to do something differently. It's only when we hit the wall that people really start trying to think outside the box and more importantly that the people making the architecture decisions (which usually aren't the same as the people innovating) start listening.

They've been saying for years that we're going to hit a wall, but each time someone comes up with a way around it. Now maybe we will hit one eventually, but I'm betting it won't be next year.

Memory bandwidth has been dramatically improved with Nehalem. It should scale pretty well to 12 cores. My applications scale pretty well to 8 cores on Nehalem (Xeon) - much better than the previous architecture. It's certainly not like they don't understand the problems of core scaling and they're doing things to try to solve those issues.


RE: Just as pointless as the mhz war
By omnicronx on 4/23/2009 1:48:23 PM , Rating: 2
And I agree, but we won't see the benefit in the desktop world for some time to come. We've already had the MHz war, then the war of the cores; now both Intel and AMD have on-die memory controllers, so they are kind of running out of options. My guess is an on-die memory controller per core, not shared per chip, but that won't reach the desktop market for some time; it's just too expensive (nor is it currently needed).
quote:
Memory bandwidth has been dramatically improved with Nehalem. It should scale pretty well to 12 cores.
How many times must I say it's not merely the bandwidth; latency gets disgustingly high the more cores you add. Just think about how complex it is to split a job between four cores using a shared memory controller, let alone 12.


RE: Just as pointless as the mhz war
By kkwst2 on 4/26/2009 8:44:57 PM , Rating: 2
quote:
How many times must I say it's not merely the bandwidth; latency gets disgustingly high the more cores you add.


How many times are you going to be wrong? That's a generalization. Sometimes it's true. Obviously, in general it's better to have low latency, but how important memory latency is depends on the task.

And Nehalem has significantly lower latency (under most situations) and higher bandwidth, so my statement holds.

It also depends on the task how complex it is to split work between cores. My application scales quite well to 50-100 cores.

People keep talking about either desktop or server applications. There are a LOT of technical people utilizing 8+ cores all the time in both "desktop" and cluster computing. It's certainly a niche market compared to servers, but I guarantee you it's still a lot of money. These Nehalem chips are revolutionizing how much FPU power you can get out of a small cluster.


By 457R4LDR34DKN07 on 4/23/2009 10:06:48 AM , Rating: 3
Once they hit the performance wall they will likely be working toward photonic computing, and ditching silicon for graphene.


RE: Just as pointless as the mhz war
By marvdmartian on 4/23/2009 10:08:34 AM , Rating: 1
So does this mean that 5 blade razors really aren't more efficient or better performing than the old 2, 3 or 4 blade varieties??

Who'da thunk? ;)


RE: Just as pointless as the mhz war
By kkwst2 on 4/23/2009 1:46:14 PM , Rating: 2
F*** everything, we're doing five blades!

Classic.


RE: Just as pointless as the mhz war
By parge on 4/23/2009 10:16:23 AM , Rating: 2
Most developers aren't even optimizing games for 4 cores, let alone 16, and I can't see that changing in a year. Adding another 12 cores that do nothing doesn't seem to be the way to go unless they are going to work hand in hand with developers to really start using them more efficiently.


By Exedore on 4/23/2009 10:25:22 AM , Rating: 2
These are targeted at the server platform, not for home users and games. Servers can very well use that many cores, and packing that many cores on a single motherboard can save a lot of space.


RE: Just as pointless as the mhz war
By fishbits on 4/23/2009 10:40:42 AM , Rating: 5
There are more uses for a PC than gaming. I'm a gamer and welcome any advances on that front, but there's far more to computing than the next version of Quake.

Besides, what do you think will drive the optimization of multi-core gaming if not the increase in typically available cores on customers' systems? It's like you're saying there should be no advances in gaming graphics because many games don't currently require top-of-the-line video cards in SLI before they will launch.

Instead, the more penetration increasingly powerful hardware has in homes, the more incentive there is for developers to code for it. But, the good news is that you're able to sit these hardware advances out if you're not interested.


RE: Just as pointless as the mhz war
By parge on 4/23/2009 11:09:00 AM , Rating: 2
I understand, and I'm not against progress, but what I am saying is that for a majority of users 2 or 4 cores is more than enough, and games are probably the most common thing people use that really pushes the hardware. Not many people are doing video encoding on an everyday basis, compared to those playing video games. Remember the second part of my post: if they are going to do this, they need to work hard with developers to make sure games can be coded to take advantage of it. As much flak as Nvidia takes for its PhysX implementation in games, you can't fault their efforts to work with developers to optimize for their drivers and CUDA to make it happen. It just seems disappointing that we have had quad-core processors for so long and Anandtech is still recommending dual cores for mainstream gaming rigs, simply because, bang for buck, you're not going to get much extra from your extra 2 cores.

Like you say though, if these CPUs are server-oriented then fair enough, but I would still like to make the above point.


RE: Just as pointless as the mhz war
By TSS on 4/23/2009 11:01:26 AM , Rating: 2
my friend still plays WoW on an Athlon 3500+ and an ATI 9700 card. and WoW is played by millions, most of whom probably don't have much different specs. also, that same argument has been put forth since the Athlon X2.

i really don't think games are the driving force behind this. so they'll just have to play catch-up, sooner or later. and they will, or the market dies.

the way the gaming market is now, it'll probably be later. let's just hope for the best.


By gamerk2 on 4/23/2009 11:56:54 AM , Rating: 2
The problem the devs made was that they coded assuming a maximum of two cores. Now that it's clear that multicore is the way of the future, engines will be developed that scale according to the number of CPU cores.

Such programs have existed for servers for ages, scaling with upwards of 80% efficiency. PCs will catch up within the next 2-4 years.


RE: Just as pointless as the mhz war
By nafhan on 4/23/2009 10:58:09 AM , Rating: 2
The MHz war stopped when there were diminishing returns. The current "processor cores per package" war will stop when there are diminishing returns. At that point, we will move on to something else. Until then, there's no reason to stop!
That said, these chips will be targeted at markets where a large number of cores makes sense (i.e. not your typical desktop).


By SublimeSimplicity on 4/23/2009 12:05:53 PM , Rating: 2
Correction: the MHz war stopped when the marketing for more clock cycles hit diminishing returns.

As long as people (and there are many posting comments here) believe that a core for each thread is not a waste, the marketing returns for more cores will continue.

The reality is that because of memory latency, a large portion of the clock cycles go to waste because the core is stalled waiting on memory to be fetched. People don't realize this because Task Manager shows the CPU utilization at 100% even if 70% of that time was spent with the CPU stalled.
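
A rough way to see that stall effect for yourself; a sketch in Python, with the caveat that CPython's interpreter overhead hides much of the gap (it is far starker in compiled code). Both loops peg a core at "100%" in Task Manager, yet the dependent, cache-missing chase gets much less done per second:

```python
import random
import time

N = 1 << 22  # ~4M entries, far larger than any CPU cache

def single_cycle_permutation(n):
    """Sattolo's algorithm: a permutation that forms one big cycle,
    so chasing it visits every slot before repeating."""
    p = list(range(n))
    for i in range(n - 1, 0, -1):
        j = random.randrange(i)   # j < i guarantees a single cycle
        p[i], p[j] = p[j], p[i]
    return p

perm = single_cycle_permutation(N)

def chase(steps):
    i = 0
    for _ in range(steps):
        i = perm[i]               # each load depends on the last: stalls
    return i

def hot_loop(steps):
    i = 0
    for k in range(steps):
        i = perm[k & 1023]        # stays in cache: almost no stalls
    return i

for fn in (hot_loop, chase):
    t0 = time.perf_counter()
    fn(2_000_000)
    print(f"{fn.__name__}: {time.perf_counter() - t0:.2f}s "
          "(CPU reads ~100% busy either way)")
```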


RE: Just as pointless as the mhz war
By omnicronx on 4/23/2009 12:58:55 PM , Rating: 2
The OP is 100% correct, anything over 8 cores right now is pretty much useless in the desktop world. It is just too hard to deal with latency and the cores will be constantly waiting to perform actions because of the memory bottleneck.

If anything we should be shifting back to the MHz wars, or perhaps it's time to go back to thinking about efficiency. They are also quickly approaching the wall of how much smaller we can make our CPUs using current technology. After 28 nm, things will start to become interesting, that is for sure.

I would also like to point out that two cores is more than enough for 95% of the population. Heck, an in-order-processing Atom netbook is more than enough for many people (10+ year old tech), so why on earth you think ramping up core counts is a good idea is beyond me. It's just not cost-efficient, especially for a company like AMD, which has constantly been in the red for the past year or two.


RE: Just as pointless as the mhz war
By TomZ on 4/23/2009 1:53:26 PM , Rating: 3
quote:
The OP is 100% correct, anything over 8 cores right now is pretty much useless in the desktop world.
It's been posted a dozen times already - this is not a desktop processor; it's a server processor. Can we stop beating that dead horse?


By omnicronx on 4/23/2009 2:12:39 PM , Rating: 2
But that's not what this discussion is about. What happens in the server world very much affects the desktop world.

The technology behind these chips will go into the next desktop variant, so discussion about how these cores will scale is very much worth having.


By SublimeSimplicity on 4/23/2009 2:33:44 PM , Rating: 1
Oh, I was under the impression that servers used RAM just like desktop machines do. I didn't realize there was a completely different utopian form of DDR, with no latency, that only goes in servers.

My apologies for wasting everyone's time.


RE: Just as pointless as the mhz war
By knutjb on 4/23/2009 7:22:44 PM , Rating: 2
Could be AMD has figured out how to make 12 cores work effectively with memory. After all, Intel followed AMD with the integrated memory controller, and look what AMD did with graphics cards. Some of those ideas may be transferable; perhaps they are sorting out a leap in technology. AMD is smaller than Intel, but that doesn't make them stupid or inept; time will tell on that. All of this is speculation. As usual, I'll believe the tech when I can buy it.


you know...
By meepstone on 4/23/2009 10:49:59 AM , Rating: 4
What's funny is people are saying here that it's useless to have that many cores.

Always coming from the people who know nothing about it and talk like they are some know-it-all genius, yet they have no answers themselves. Congratulations to these people and their pointless posts.




RE: you know...
By xti on 4/23/2009 11:15:03 AM , Rating: 5
Translation: these aren't supposed to be $100 chips to play Crysis. They're supposed to target the server market, where volume is lower but prices are huge in comparison.

If AMD/Intel just made chips for WoW players, we would barely be seeing dual-core chips today.


RE: you know...
By meepstone on 4/23/09, Rating: -1
RE: you know...
By SlyNine on 4/24/2009 12:26:27 AM , Rating: 2
I think if you took another look, and were not so afraid of getting attacked, you would have seen that what he says is in parallel with what you're trying to say.

What I got out of it: these chips do have a place. The fact that Crysis doesn't get a frame-rate boost doesn't mean IT professionals won't have use for them.


RE: you know...
By omnicronx on 4/23/2009 1:44:21 PM , Rating: 1
quote:
Always coming from the people who know nothing about it and talk like they are some know-it-all genius, yet they have no answers themselves.
No, what's funny is the people saying "the more cores the better" without backing up their statements.

With current CPUs, DDR3 can't handle 12 cores and DDR4 won't handle 12 cores; the bandwidth just is not there. So unless we have on-die memory controllers for every single core, and memory fast enough to handle it, 8+ cores will be useless in the desktop environment for some time to come. Furthermore, Intel has only just made the move to on-die memory controllers (shared between cores), and AMD does not really have the capital right now to raise the price of its chips substantially.

This is not the old argument of 'hardly any programs are multithreaded, thus multicore is not needed'; this is actually a hardware issue.


RE: you know...
By TomZ on 4/23/2009 1:58:15 PM , Rating: 2
Somehow I think the architects of these chips will already be aware of these types of issues.


RE: you know...
By omnicronx on 4/23/2009 2:07:36 PM , Rating: 2
Then why are there no desktop variants on the horizon?

In the past when either AMD or Intel released a server chip, a desktop variant was already on their roadmap.

This is just not the case this time around.


RE: you know...
By TomZ on 4/23/2009 2:33:51 PM , Rating: 1
quote:
Then why are there no desktop variants on the horizon?
Because 99% of desktop computer users have no use for so many cores, no market exists yet for that on the desktop. No market means no product.

Anyway, if a memory or related bottleneck exists, then it will show up in server applications right away.


RE: you know...
By tmouse on 4/23/2009 2:06:42 PM , Rating: 2
I agree. Another issue is that businesses do not keep upgrading every time something new comes out, so this will be far less effective than the megahertz wars. Growth requiring additional server capacity is FAR slower than desktop turnover (which can be driven by bloated software). In the current economy, which I doubt will get much better within at least the next 5 years, most companies will not be making major hardware purchases. Lower costs and greater energy efficiency (to save dollars) will be a far greater selling point than who has the most cores. I do not know if AMD could survive the wait for that kind of war to pay off.


that's why...
By swizeus on 4/24/2009 3:33:24 AM , Rating: 2
Intel really needed to revive Hyper-Threading, even though the technology failed in the P4... it's real cost-cutting (judging by the trend now, Intel can beat a 12-core AMD part with just an 8-core Hyper-Threaded processor, 4 cores fewer). Intel just needs to make each core reach maximum efficiency (which I think it has achieved in the i7 for now). At least for some span of time they can hold a financial advantage over AMD. I just hope AMD can really come up with something so they can once again beat Intel at the top... I am an AMD fan anyway, but more of a realist.




RE: that's why...
By TomZ on 4/24/2009 9:16:21 AM , Rating: 2
Huh - how was HyperThreading a failure in the P4? When it arrived, it improved OS responsiveness and increased overall performance. And now it is doing the same in the i7, at the expense of just a tiny bit more logic.


RE: that's why...
By SublimeSimplicity on 4/24/2009 9:43:10 AM , Rating: 2
Tom knows what he's talking about. HyperThreading was and is a brilliant concept. It "failed" in the P4 days because it increased logic-core utilization by increasing instructions per clock. During the MHz wars, Intel and AMD relied on low efficiency in this regard to keep the chip cool; when they enabled HyperThreading, the cores got hotter faster because they were doing more computations per MHz. So even though the chips were faster in the real world, Intel had to lower the clock speed it sold them at, which was terrible for marketing.

Now, with memory latencies much higher than they were then, it's even more useful. Depending on an application's memory access rate, you can get anywhere from 50% to 100% of the gains of a full second core with HyperThreading.
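
A toy model of that claim (my own simplification, not Intel's numbers): if a single thread stalls on memory a fraction s of the time, a second hardware thread can execute during those stalls.

```python
# Toy SMT model: a thread that stalls on memory a fraction `s` of the
# time uses (1 - s) of the core's peak; two SMT threads can overlap
# each other's stalls, together reaching up to min(1, 2 * (1 - s)).
def smt_speedup(s):
    one_thread = 1.0 - s
    two_threads = min(1.0, 2.0 * (1.0 - s))
    return two_threads / one_thread - 1.0  # gain over a single thread

for s in (0.1, 0.3, 0.5, 0.7):
    print(f"stall fraction {s:.0%}: SMT adds ~{smt_speedup(s):.0%}")
```

Under this model, memory-heavy code (stalled half the time or more) sees the full 100% of a second core, while compute-bound code sees much less.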


Reality
By fishbits on 4/23/2009 10:26:36 AM , Rating: 2
"Over time, designers realized such a blind [megahertz] race was foolish, and that it was conceding far too much in efficiency and heat."
Reality: CPU designers hit a wall with how far they could ramp up clock speeds, making them look elsewhere for gains.

At any rate, bring the cores. I can use them, personally. Cheaper multi-socket solutions too, while you're at it. There is still a lot of low-hanging fruit in software/OS design that could capitalize on systems with more cores available.




RE: Reality
By murphyslabrat on 4/23/2009 7:29:26 PM , Rating: 3
Actual reality: AMD started being a major threat performance-wise, and Intel realized that people weren't buying Megahertz anymore.


Arms race = good
By Finnkc on 4/23/2009 10:47:17 AM , Rating: 2
However you see it, it is an arms race, and in the end that's good for us.




RE: Arms race = good
By TomZ on 4/23/2009 11:14:03 AM , Rating: 2
Agreed, with the caveat being that we want both sides standing in the end. As the cliche goes, we don't want Intel to have 100% of the market.


Amdahl's Law
By TeXWiller on 4/23/2009 2:35:13 PM , Rating: 2
You can have some quality time at http://www.cs.wisc.edu/multifacet/amdahl/ . Input a log function as the performance function ("log(x)"). Input 16 for the BCEs. Press Draw. No wonder six-core parts are the next step for AMD and Intel.
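
For anyone who would rather not use the web form, here is a sketch of the symmetric-multicore speedup formula from that Hill/Marty page. The sqrt(r) performance function is their usual example; swap in math.log to mimic the "log(x)" suggestion above (noting log(1) = 0, which the web tool presumably special-cases):

```python
import math

def perf(r):
    # Single-core performance built from r BCEs (base core equivalents).
    return math.sqrt(r)

def symmetric_speedup(f, n, r):
    """Hill-Marty symmetric multicore: n BCEs total, each core built
    from r BCEs (so n/r cores); f is the parallelizable fraction."""
    return 1.0 / ((1.0 - f) / perf(r) + f * r / (perf(r) * n))

n = 16  # the 16 BCEs suggested above
for cores in (1, 2, 4, 8, 16):
    r = n // cores
    print(f"{cores:2d} cores: speedup {symmetric_speedup(0.9, n, r):.2f}")
```

With f = 0.9 the curve peaks around 8 moderately strong cores rather than 16 weak ones, which is roughly why mid-range core counts look like the sweet spot.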




RE: Amdahl's Law
By whydoyoucare on 4/23/2009 2:46:24 PM , Rating: 2
Thanks, mate!
Not so much for the link itself, but for the link I found once I was there; the Google tech talk looks really good!


Geez Mick, get a dictionary already!
By C'DaleRider on 4/23/2009 4:39:15 PM , Rating: 1
quote:
....the 90s phenomena in which.....


The word you were looking to use is phenomenon, the singular "version" of the word, which would describe a speed race quite well.

Phenomena is plural.

Get a dictionary already.




By SlyNine on 4/24/2009 12:34:48 AM , Rating: 2
After all that, I could swear I heard crickets. The grammar police strike again.

I'm a repeat offender.



How many cores are needed?
By m1j on 4/23/2009 11:37:02 AM , Rating: 2
For anyone who thinks having more cores is a waste, let me explain. Every program running on a computer runs in at least one thread of its own, and each thread needs CPU time. If you have only 4 cores, then the many programs running in your task tray all have to share those 4 cores: the antivirus program, email, Explorer...
And 16 cores is still not enough, because some programs are multithreaded and already use more than one core.

Even if all you do is gaming, this is how it would work: your game would get to use a complete core without having to share it, because there would be 11 to 15 other cores to handle all those little programs that are always running.

I wish they (Intel/AMD) would stop playing around and start looking at 256 or more cores. Tech companies never push innovation, just marketing.
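
For the curious, a Linux-only sketch that counts OS threads versus logical cores (Windows users can get the same totals from Process Explorer). The raw count runs into the hundreds, though to be fair most of those threads are blocked rather than waiting for a core:

```python
import glob
import os

# Sum the Threads: field from every /proc/<pid>/status (Linux-only).
threads = 0
for status_file in glob.glob("/proc/[0-9]*/status"):
    try:
        with open(status_file) as f:
            for line in f:
                if line.startswith("Threads:"):
                    threads += int(line.split()[1])
                    break
    except OSError:
        pass  # the process exited while we were reading

print(f"{threads} threads sharing {os.cpu_count()} logical cores")
```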




Can AMD Deliver?
By jcbond on 4/23/2009 11:59:38 AM , Rating: 2
I'm glad they have a roadmap, because I have just about always used AMD. But they don't have the capital to keep up in R&D, and Intel sets a fast pace. They've been bleeding money, especially since they badly overpaid for ATI, and Intel has righted its processor design direction (which AMD must have been able to see coming). I question whether they will be around, and whether they will have the wherewithal to execute their roadmap. And I bet IT departments the world over are looking at AMD's future with a skeptical eye.




I hope
By Chernobyl68 on 4/23/2009 12:11:47 PM , Rating: 2
I hope this announcement isn't just a paper announcement to prop up the stock value.




First time...
By jemix on 4/23/2009 1:01:13 PM , Rating: 2
quote:
This will be the first time in history where less is more.


Interesting... Doesn't less/smaller die size = greater performance?




Lost its way
By JosefTor on 4/23/2009 2:13:24 PM , Rating: 2
I have always been a fan of AMD from when they were up-and-coming and gaining market share against Intel. Once it seemed like they had the superior processor designs (I believe it was the K6 and maybe the K7), they started acting a lot more like Intel, which to me is why they can't drive sales.

I liked AMD because they would use the same socket and chipset designs for very long periods, and even when their next-generation chip came out it would still be backward compatible, so I could buy a motherboard and a set of RAM and just keep buying new AMD processors. I really want a new processor in my computer because it is quite old (socket 939), but I don't want to buy a new motherboard and new RAM and go through that whole process again, because I only like the top equipment and great motherboards cost $200+. So I would have bought a new processor or two, because I'm excited about these new technologies, but I haven't, due to the large up-front cost (well... at least for a college student). And since AMD no longer has long backward compatibility going for it, they either have to be faster or cheaper. Kind of a hard place for a company going up against a bigger and meaner company.




By whydoyoucare on 4/23/2009 2:29:10 PM , Rating: 2
To everyone who keeps saying memory bandwidth won't keep up with the increased core count:

stop parroting what you've read before and read the specs!!!

This is TWO Istanbul chips with TWO dual-channel memory controllers... the per-core memory bandwidth bottleneck on the 12-core Magny-Cours chip will be IDENTICAL to that on the 6-core Istanbul chip.




Core race 2.0
By PKmjolnir on 4/23/2009 3:07:56 PM , Rating: 2
From looking at both AMD and Intel roadmaps, I'd say the core race is quite close to gaining a new dimension: specialized cores.

The GPU core is right around the corner, and once the first generation of CPU+GPU parts flex their benchmark scores, I suspect we might see a broad selection of more-or-less-useful benchmark-scorefarming cores added, until we have about as many cores in our future processors as we have registers in the current ones.




Why...
By cscpianoman on 4/23/2009 4:26:09 PM , Rating: 2
I don't know why AMD keeps trying to one-up Intel; it will never work for an extended amount of time.

AMD has a poor mindset with this. Intel right now has a huge advantage, with a large R&D budget and time on its side. Intel probably has a "killer" chip waiting in the wings for if and when AMD releases its next-generation chip. In other words, Intel is not going to be trumped again. AMD should just accept that fact and work to reduce costs and stay within a competitive margin. That probably won't happen, because both have the mentality of holding onto the "crown." But one can hope that AMD pulls through on this.




bah
By MrPoletski on 4/24/2009 7:05:03 AM , Rating: 2
Bring out consumer quantum computers already!



