
New Phenom triple-core processor coming in 2008

AMD today updated its roadmap with another multi-core processor, slotting between its dual and quad-core processors – Phenom triple-core processors. The new Phenom triple-core processors feature three processing cores on a single die and are based on AMD’s Barcelona architecture, which launched last week in Opteron form.

The new triple-core processors will feature specifications similar to those of their upcoming Phenom X2 and X4 brethren. The Socket AM2+ processors feature 512KB of L2 cache for each core and a shared pool of L3 cache. Essentially, the Phenom triple-core processors are quad-core variants with one core disabled. This allows AMD to put quad-core dies with one defective core to use, maximizing the usable yield of each wafer.

AMD claims to be the only company to offer tri-core processors, which the company claims to bring “true multi-core technology to a broader audience.” AMD has not given the Phenom triple-core processors an official name yet. However, it wouldn’t be too surprising if the tri-core processors followed the current Phenom naming scheme and received the Phenom X3 name.

“With our advanced multi-core architecture, AMD is in a unique position to enable a wider range of premium desktop solutions, providing a smarter choice for customers and end users,” said Greg White, vice president and general manager, Desktop Division, AMD. “As a customer-centric company, AMD is committed to working with our OEMs to deliver compelling value propositions across their multi-core product families with capabilities that address their requirements and aspirations.”

Features unique to AMD’s Barcelona and Stars architectures such as split power planes and dynamic independent core speed adjustments remain supported on triple-core processors. Additionally, AMD Phenom triple-core processors support HyperTransport 3.0 for up to 16GB/second of I/O bandwidth.

AMD claims significant performance gains over dual-core processors with its triple-core processors in benchmarks such as SYSmark 2007 and 3DMark06, where gaming and digital content creation performance is key.

“A continued commitment to elegant design and innovative processor architecture is instrumental to revolutionizing the technology industry,” said Richard Shim, research manager for IDC's Personal Computing program. “The advent of triple-core processors is a valuable market opportunity for customers to deliver end users compelling solutions and further differentiate on the desktop.”

Expect AMD to launch its Phenom triple-core processors in Q1 2008. AMD plans to launch its quad-core Phenom X4 next quarter.


Comments



What market segment is this aimed at?
By jak3676 on 9/17/2007 11:22:55 PM , Rating: 4
This seems to be a really odd marketing move.

Performance-wise, there is still limited use for mainstream multi-core CPUs. We are starting to see some good multi-threaded apps, but they're still a minority. Just about everyone will see a boost in moving to a dual-core CPU, as this allows one core to focus on a single main thread (games, video editing, etc.) while the other core handles all the background tasks, but many users may not even notice the difference between a single and a dual-core CPU. For this mainstream market there is little use for any more than 2 cores.

For the performance crowd (and I think there are a growing number of us), the tri-core system (gee, that sounds like a Zelda reference) will probably not gain any huge following unless AMD prices them significantly cheaper than the quad-core CPUs. It will cost AMD the same amount to manufacture a tri-core CPU as a quad-core CPU, so it seems like AMD will lose potential revenue with every tri-core CPU sold.

The only way I see this making sense is if AMD is having some production issues where some quad-core CPUs develop a glitch that leaves only three functioning cores. Intel did have some similar issues with their original Core CPUs. But if this is the case, I'm guessing you won't see many of the tri-core CPUs being implemented.




RE: What market segment is this aimed at?
By jak3676 on 9/17/2007 11:35:43 PM , Rating: 2
Added thoughts

Disabling one core won't save you much in terms of wattage on AMD's current architecture. All of the cores run at the same voltage, although they can clock themselves lower for some saved wattage. At first I was thinking that this may allow you to cut the TDP to 3/4 of the original quad-core, but I'm guessing it would be more like 7/8 or 9/10 of the original TDP.
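
To put a rough number on that guess, here's a toy Python sketch; the per-core power share is an assumed figure for illustration, not a measured one:

    # Toy sanity check: if each core draws roughly 12% of package power
    # (an assumption, not a measurement) and the shared L3/IMC/I/O draws
    # the rest, power-gating one core leaves most of the TDP intact.
    core_share = 0.12
    print(f"remaining TDP fraction: {1 - core_share:.2f}")  # -> 0.88, i.e. ~7/8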

I don't see this being an added benefit for overclocking either. With the majority of your chip still running at full power, you'd end up with some odd heating patterns, but probably only a very limited advantage in terms of overclocking. You may be able to squeeze a few more MHz out of it, but I think the dual-core versions will probably prove better for O/C than the tris. We'll definitely have to wait for some tests on this point.

The only advantage I can see would be that you end up with more effective L3 cache per core (same amount, just shared between fewer cores). I kinda doubt that there will be programs to take advantage of this, but it may have some use in programs that need to share large amounts of data between cores. I think I'm reaching too far on this idea - maybe I'm just not seeing it.


RE: What market segment is this aimed at?
By sackland on 9/17/2007 11:53:56 PM , Rating: 2
Key wording here being "current architecture": if these CPUs were used on one of the new motherboards with split power planes, I bet you would see a power improvement over quad-core. In addition, if they can do like the GPU guys already do and have fusible logic, they could fully disable a core at packaging and keep it from using power.

(nVidia and ATI can sometimes fuse or turn off unusable shaders and sections of the cores which is how we sometimes end up with lower SKUs of the products with lesser power and performance yet same clocks)


RE: What market segment is this aimed at?
By jak3676 on 9/18/2007 12:23:50 AM , Rating: 2
In terms of "current architecture": the split power plane is only a split between the memory controller and the cores - it is not a split between the cores.

From AnandTech's article:
quote:
Power

AMD has made numerous improvements compared to the K8 core:
- The FPU unit can be turned off when not needed
- Clock gating is implemented much better
- Each core can run at its own frequency (but the voltage is the highest needed by either core)
- Power for the core and memory controller are split


I do agree that there would be some power savings here, but we're not talking about disabling 1/4 of the chip. With ~460 million transistors for the entire CPU, the number of non-cache transistors in Barcelona is ~250 million total, or ~62 million per core. So disabling one core cuts out just over 13% of the total transistors.
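
The same arithmetic as a quick Python sketch (the transistor counts are the approximate figures quoted above, not official numbers):

    # ~460M transistors for the whole die; ~250M of those are non-cache
    # logic spread across the four cores (approximate figures from above).
    total = 460e6
    per_core_logic = 250e6 / 4  # ~62.5M non-cache transistors per core
    print(f"one core is ~{per_core_logic / total:.1%} of the die")  # -> ~13.6%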


RE: What market segment is this aimed at?
By jak3676 on 9/18/2007 2:06:44 PM , Rating: 4
I don't really mind getting rated up or down for my usual rambling - but can someone explain why this post got rated down? It's straight facts and numbers. If I quoted a number wrong, please let us all know.


By deeznuts on 9/18/2007 11:08:17 PM , Rating: 2
Fanboys, just ignore it. They sometimes act irrationally, so asking or thinking about it rationally isn't going to get you anywhere.


By mars777 on 9/19/2007 3:23:16 PM , Rating: 3
Maybe because individual cores can already save power and clock down by themselves with Cool'n'Quiet.
This split is only there to let the IMC clock differently, saving power when it's not 100% loaded with memory requests.


By jak3676 on 9/17/2007 11:57:31 PM , Rating: 2
I agree that it doesn't necessarily have to be a matter of having a core that won't work at all. If one of them won't clock as high as the other three it may make more sense to just disable it.

e.g. 3 cores are stable at 2.5GHz, but one of the cores isn't stable past 2.0GHz - so do you sell it as a 2.5GHz tri-core CPU or a 2.0GHz quad-core?

I would think that AMD would be more interested in fixing the root-level manufacturing problem, though.

Another way to think about it may be that AMD has been able to boost the clock speed across the CPU, but only by cutting power to one of the cores (in other words, there is no problem with any of the cores, but if you disable any one of them, you can clock the rest of the CPU higher). If this is the case, though, it would seem to suggest an architectural problem with heat dissipation.


By praeses on 9/18/2007 12:53:13 PM , Rating: 2
As jak3676 said:
quote:
you end up with more effective L3 cache per core (same amount, just shared between fewer cores)


If this is marketed correctly it could be quite significant for price/performance. Approach-wise, it's not much different from the Radeon 9500 of yesteryear, although we probably are not going to be able to re-enable the core on select CPUs. I do not recall many people complaining about the 9500, even if they weren't planning on unlocking those pipelines.


RE: What market segment is this aimed at?
By Treckin on 9/17/2007 11:36:43 PM , Rating: 4
The tri-core I would guess is simply an attempt to avoid a 'core war'...

If AMD suddenly offered two Barcelona dies on one chip, people would bash them. If they offer only 3 cores in one die, people bash them.

If they offer the first x86 quad core, people bash them...

I think I just died a little on the inside...


RE: What market segment is this aimed at?
By afkrotch on 9/18/07, Rating: 0
RE: What market segment is this aimed at?
By shiznit on 9/18/2007 2:26:50 AM , Rating: 3
You are correct in saying that Intel offered the first x86 quad-core, but Yorkfield is NOT native. It is 45nm but still two dies, like the current quads.


By retrospooty on 9/18/2007 10:23:29 AM , Rating: 2
Yup... Nehalem, in late 2008, will be Intel's first native quad.


RE: What market segment is this aimed at?
By z3R0C00L on 9/18/2007 11:57:44 AM , Rating: 2
And this matters because......?????!!!!!


RE: What market segment is this aimed at?
By Alpha4 on 9/18/2007 2:14:32 PM , Rating: 2
He was just correcting AFKrotch.


RE: What market segment is this aimed at?
By deeznuts on 9/18/2007 11:10:46 PM , Rating: 2
I wouldn't say correcting AFKrotch, but clarifying.

The two-die approach Intel has taken has not been proven, on the consumer desktop side, to be a hindrance.


By mars777 on 9/19/2007 3:32:42 PM , Rating: 2
Yeah, I'm wondering too... why hasn't AMD offered processors with two dies in one package?

For Example:

They will use Barcelonas with one core damaged for X3 Phenoms...

And they should use two Barcelonas with one damaged core each for X6 chips with two dies in one package!

With the current state of TDP they could manage it while staying within a reasonable watt limit. With a six-core processor they could raise their share price, no matter how it performs, because it has 6 "threads".


By Targon on 9/18/2007 6:30:53 AM , Rating: 2
With Vista doing a bit more behind the scenes, dual-core is already seen as very important, if not essential. Now, with this in mind, adding just one more core over the current dual-core processors MAY help a lot of end-users in terms of overall system performance, even if current single-threaded applications see no significant improvement.

In addition to this, as multi-threaded applications finally begin to show up in the next year or two, having the extra core will give a bit of a boost without having to pay a price premium for a quad core processor.

So, the market segment this is aimed at is for those who would be talked into getting a quad-core system but looking at the price, would want something a little cheaper.

It may also be that AMD has found that their fab process yields a good number of processors that have failed due to one bad core, and this would be a way to sell the "defective" processors. There really is no way to know at this point if this is the case or not (if the three-core fades from the roadmap quickly, we can speculate that it was fab problems that inspired it).


By encryptkeeper on 9/18/2007 9:37:25 AM , Rating: 4
quote:
It will cost AMD the same amount to manufacture a tri-core CPU as a quad-core CPU, so it seems like AMD will lose potential revenue with every tri-core CPU sold.

Sure it costs the same thing to manufacture a triple core as it does a quad core. But selling those triple cores is better than throwing them away.

When cores are manufactured, they are placed into CPU packages depending on several variables, like processor supply and demand, cost, and most importantly, which CPUs pass certain QC tests. Take Intel, for example. The core in a Celeron 400 series and a C2D Extreme are manufactured the same way, but the core that eventually became a Celeron passed only a few of the QC tests, while the core that became an Extreme passed several high-stress QC tests. That's basically what it means when you see "such and such processor with this much cache disabled". Why throw the chips away if they can still be used, even if it's not at their top efficiency? That's why Intel pushed to get rid of single-core P4s, 800 and 900 series processors, and 300 series Celerons. Their fab process is way more efficient now that they basically manufacture the same chips over and over.

If bench tests are good for the tri-cores, you'll probably see Intel put them out too.


By jay2o01 on 9/18/2007 10:23:02 AM , Rating: 3
This is a well-aimed move for AMD. After initial Barcelona numbers surfaced it was clear AMD couldn't compete clock for clock. Think for a second how throwing another core into the mix (hopefully at prices similar to the dual-core) changes this picture.

It's also a smart move going forward, especially as they begin to think about making eight-core chips. Yields at the 45nm node will probably not be good for AMD; 5-, 6-, and 7-core AMD chips coming in 2H 2008... lol


By imperator3733 on 9/18/2007 11:10:25 AM , Rating: 3
quote:
It will cost AMD the same amount to manufacture a tri-core CPU as a quad-core CPU, so it seems like AMD will lose potential revenue with every tri-core CPU sold.


It seems to me that these tri-cores would be quad-cores that have one defective core. For chips like that, AMD has two choices.

1) Disable the defective core and one other core and sell it as a pseudo-Kuma dual-core at the dual-core price

--or--

2) Disable only the defective core and sell it as a tri-core at a price somewhere between similarly clocked dual and quad-cores.

By releasing tri-cores, AMD can choose option 2, getting more revenue due to the slightly higher price.


The INQ was right after all...
By MDme on 9/18/2007 1:01:31 AM , Rating: 1
Just had to get it off my chest....




RE: The INQ was right after all...
By Justin Case on 9/18/2007 2:11:49 AM , Rating: 3
That might have something to do with the fact that they actually go hunting for news instead of sitting at home waiting for the manufacturers' press releases to arrive in their inbox. ;)


RE: The INQ was right after all...
By James Holden on 9/18/2007 2:43:12 AM , Rating: 3
Nobody remembers when you're right. People remember when you're wrong.

Justin, if you aren't living proof of that, I don't know who is.

http://www.dailytech.com/article.aspx?newsid=3400


RE: The INQ was right after all...
By johnsonx on 9/18/2007 12:24:27 PM , Rating: 2
Yep, Rydermark is the first thing that comes to mind every time I see Justin Case post anything. Thanks to that, and other.... episodes...., my BS detector starts to beep quietly as soon as the name comes up.


RE: The INQ was right after all...
By Justin Case on 9/18/2007 6:38:19 PM , Rating: 1
I absolutely stand by my comments regarding Rydermark: if the author had said "this is a real-time render of a 3D scene", I would have called bullshit. Without knowing what that image was supposed to be, you simply cannot conclude that the author's claims (which were related to the precision of some nVidia shaders) were true or false.

In fact, even if the author had claimed that image was a real-time 3D render (which, as I said then and repeat now, would be complete bullshit), that still would not tell us anything about the fundamental claim about the precision of the shaders.

But since the benchmark's author did not say what the image was supposed to be, for all you or I know, it could simply be the result of running a pre-rendered bitmap through some shader code, using a manually defined mask.

Unfortunately it seems a lot of people like to comment on issues related to pixel shaders without even understanding what a pixel shader is. Pixel shaders operate in pixel space, not on vertices. In games they are typically applied during the rendering process, but there's no reason why they can't be applied to a pre-existing bitmap.

Saying that "it looks just like a filter was applied in Photoshop, which proves it wasn't a pixel shader" is a demonstration of ignorance; a "filter" applied to a group of pixels (i.e., a bitmap) is exactly what a pixel shader does. In fact, a lot of (2D, pixel) filters used by high-end compositing applications (Shake, AFX, etc.) are implemented as... you guessed it, GPU pixel shaders.

Of course, it's easier to make a lot of noise and vote down posts that point out your ignorance than it is to actually go learn something about it. But the problem with that approach is that you continue to be ignorant.


RE: The INQ was right after all...
By CyborgTMT on 9/18/2007 8:48:47 PM , Rating: 2
quote:
Without knowing what that image was supposed to be, you simply cannot conclude that the author's claims (which were related to the precision of some nVidia shaders) were true or false.

Fudo:
quote:
SOME TWO weeks back we promised you screenshots to back up a story about a fudge on Nvidia benchmarks.


Seems pretty clear to me they are supposed to be screenshots of the benchmark.

quote:
But since the benchmark's author did not say what the image was supposed to be, for all you or I know, it could simply be the result of running a pre-rendered bitmap through some shader code, using a manually defined mask.

Taking an image and applying a manual mask... Yah, that's called using Photoshop.

quote:
Unfortunately it seems a lot of people like to comment on issues related to pixel shaders without even understanding what a pixel shader is. Pixel shaders operate in pixel space

Where the pixel fairies have their kingdom and all is good in the universe.

quote:
Saying that "it looks just like a filter was applied in Photoshop, which proves it wasn't a pixel shader" is a demonstration of ignorance

Actually the fact that most of the stock images that they used to create those 'screenshots' were found on the web demonstrates your continued ignorance.

quote:
Of course, it's easier to make a lot of noise and vote down posts that point out your ignorance than it is to actually go learn something about it. But the problem with that approach is that you continue to be ignorant.


I know a significant amount of information when it comes to pixel shaders considering I've worked with quite a few over the years. I've had my hands in more game code than you could possibly imagine.


RE: The INQ was right after all...
By Justin Case on 9/19/2007 2:01:22 AM , Rating: 2
First, Fudo (who will believe just about anything, and misunderstands half of what he's told) is not the author of the benchmark. I was under the impression that the DailyTech article was about the benchmark (and specifically about the author's claims that the nVidia shader pipeline did not have the same precision as the ATI one), not about Fudo. What you're trying to do isn't even proper ad hominem (which would be to attack the benchmark's author); you're attacking the guy (Fudo) who said the other guy (Rydermark author) had said that nVidia's shaders had reduced precision, as a way to dismiss (or avoid addressing) that claim.

Second, tons of benchmarks use stock / public domain images that can be found on the web (does the name "Lenna" ring any bells?). Your point was...?

Third, "screenshot" is not the same as saying "a 3D render". Lots of benchmarks include 2D tests, and there is no fundamental difference between running a pixel shader on a 2D bitmap or on a 3D scene. In fact, it's perfectly possible to create a pixel shader benchmark that doesn't work on or display any images at all. A "pixel shader" is just a program meant to be applied to pixel values. The "pixels" are simply (sets of) numbers. You can run it, time it, compare the result of the calculations to the expected value, and you have all the data you need, even without displaying any pictures (let alone 3D scenes).

If, as you claim, you've "had your hands on tons of game code" (more than I can possibly imagine - whoah! you must be John Carmack's smarter brother), you must know that.

In fact (as I mentioned in the original thread), if anyone at DT was really interested in Rydermark's claims (instead of using that just as an excuse to attack the Inquirer), all they had to do was write such a program and check the results. I'm sure they had access to the cards mentioned by the Rydermark guys. But it was pretty obvious they couldn't care less about that (besides, nVidia probably advertises on DT).


RE: The INQ was right after all...
By CyborgTMT on 9/19/2007 2:36:19 PM , Rating: 2
quote:
First, Fudo (who will believe just about anything, and misunderstands half of what he's told) is not the author of the benchmark ..(blah blah blah).. as a way to dismiss (or avoid addressing) that claim.

1 - I was not attacking the benchmark or the creators. I was attacking the INQ and you for believing that crap.
2 - The creators of the benchmark themselves have stated that they never sent those pictures to Fudo and they see no difference between the two companies in their benchmark.
quote:
Second, tons of benchmarks use stock / public domain images that can be found on the web (does the name "Lenna" ring any bells?). Your point was...?

I'll get to that with my answer to your next point.
quote:
Third, "screenshot" is not the same as saying "a 3D render". Lots of benchmarks include 2D tests, and there is no fundamental difference between running a pixel shader on a 2D bitmap or on a 3D scene.

My 'point' is THIS ISN'T A 2D RENDERING APP. It's a 3D graphics benchmark. And there is a very fundamental difference between running on a 2D bitmap and a 3D scene, as they use completely different shader calculations.
At this point I'm really tempted to write out exactly how pixel shaders - which aren't programs but series of instruction sets - work, but this off-topic conversation has wasted too much space already. Besides, if you don't get it by now it's not worth the time.
quote:
If, as you claim, you've "had your hands on tons of game code" (more than I can possibly image - whoah! you must be John Carmack's smarter brother), you must know that.

No, but coincidentally the first gaming software I ever worked with was the Quake engine.


By Justin Case on 10/1/2007 11:13:03 PM , Rating: 2
A shader isn't an "instruction set". An "instruction set" is a language. A shader is a sequence of instructions, also known as... a program. Pixel shaders have been around for a long, long time (much longer than you probably think).

If you never worked on a game before the Quake engine was available, then I've probably "had my hands on more game code" than you could possibly imagine.

Oh, and your #2 statement is simply false.


RE: The INQ was right after all...
By johnsonx on 9/19/2007 4:02:12 PM , Rating: 2
beep....beep...BEEP! BEEP! BEEP! BEEP! BEEEEEEEEEEEEEEEEEEPPPPPPPPPPPPPPP................. ..


RE: The INQ was right after all...
By TomZ on 9/18/2007 10:10:26 AM , Rating: 2
Justin, if you are affiliated with Inq, you should disclose it here.


RE: The INQ was right after all...
By crystal clear on 9/18/2007 11:26:28 AM , Rating: 2
Just in case - if you, Justin, are as Tom says, then admit it if it is true - NO CRIME committed.

Tom has the unique ability of exposing people correctly - like the guy using DUAL USER NAMES..... remember?

Great work TOM.


RE: The INQ was right after all...
By TomZ on 9/18/2007 1:15:00 PM , Rating: 2
I can't take credit for this, someone else made the possible association in the thread linked above.


By CyborgTMT on 9/18/2007 2:08:38 PM , Rating: 2
Hehe, I love a good INQ bashing. That thread brought back some fun memories :).

Justin - you're just a lap-dog so don't feel bad when people with real knowledge around here hit you with a rolled up newspaper.


RE: The INQ was right after all...
By Justin Case on 9/18/2007 6:54:17 PM , Rating: 2
No, Tom, as I've told you before, I'm not. I just happen to like news sites that actually publish news, as opposed to sites that have slowly become "press-release replicators".

If all I want is to read AMD's official press releases, I can go directly to AMD's site. The point of IT journalism is to go looking for the news and bring it to end users before it is made publicly available by the industry. Knowing about a new product 24 hours in advance can be the difference between making a lot of money or losing it. And, in that aspect, the Inquirer (and the Register, and a couple of other real news sites - including Anandtech, in the old days) has been quite good to me over the years. And, also from that point of view, the more people ignore it or fail to understand its articles, the better. You continue to get your "news after the fact", and I'll continue to get my "unfounded and premature speculation"... that turns out to be right 8 times out of 10.

P.S. - Why on Earth would I want to use "dual user names", as the numbskull below suggests? Do I look like I have any problem speaking my mind with this one...? Feel free to check my IP. Check your own, while you're at it, maybe you'll find that you're suffering from schizophrenia.


RE: The INQ was right after all...
By TomZ on 9/18/2007 8:44:10 PM , Rating: 2
Well, thanks for clarifying. I don't think anybody is saying they think you have two user accounts. That was referring to somebody else who got caught in the act. And for the record, no, I only have one user account here, like anybody cares anyway.


RE: The INQ was right after all...
By crystal clear on 9/19/2007 3:28:37 AM , Rating: 1
Tom, you put forward a question:

Justin, if you are affiliated with Inq, you should disclose it here.

A response of YES or NO is missing; rather, you get a vague response, evasive in nature.


By rdeegvainl on 9/19/2007 5:29:11 AM , Rating: 2
Not siding with anyone here,
but did you read the first word of his post, the one that says "NO"?


By crystal clear on 9/19/2007 3:23:12 AM , Rating: 1
Nobody accused you of using 2 usernames! - "while you're at it, maybe you'll find that you're suffering from schizophrenia."

Numbskull should be your username!


bad quad core = tri-core
By michal1980 on 9/17/2007 11:08:02 PM , Rating: 2
Could this be marketing spin saying: hey, our 4-core CPUs sometimes have a dead core, let's sell them as 3-core?




RE: bad quad core = tri-core
By ghostbuster on 9/17/2007 11:38:08 PM , Rating: 2
For AMD's sake I hope that their yields are good enough so that they don't get that many chips with a dead core to justify launching a new product line. I think it's more likely that by turning off the slowest core they can hit higher clocks and to AMD that might make more business sense than selling sub-2GHz quad-cores.


RE: bad quad core = tri-core
By z3R0C00L on 9/18/2007 11:59:40 AM , Rating: 2
If they're releasing a whole SKU that depends on one core being defective, then you've got your answer. Many cores are probably defective.


RE: bad quad core = tri-core
By Goty on 9/18/2007 12:26:13 AM , Rating: 4
That same game plan has worked for the GPU industry for the last five years or so; why not apply it to the CPU market?


RE: bad quad core = tri-core
By Alpha4 on 9/18/2007 2:14:02 AM , Rating: 3
Sony & IBM have already implemented that practice with the Cell. They stated expected yields of around 20% loss with 8-core chips so they marketed the Cell as a 7 core Processor.


RE: bad quad core = tri-core
By imperator3733 on 9/18/2007 11:12:54 AM , Rating: 2
You mean 7 SPE processor, right? (1 PPE + 8 SPEs as designed vs. 1 PPE + 7 SPEs in PS3)


RE: bad quad core = tri-core
By Alpha4 on 9/18/2007 2:33:18 PM , Rating: 2
Definitely yes, you're right. 7 Synergistic processor elements or whichever as opposed to 8.


RE: bad quad core = tri-core
By FITCamaro on 9/18/2007 6:50:53 AM , Rating: 3
Pretty much. It's not a bad move either. I mean, with them having a native quad-core architecture, they're going to have chips where one core is bad. So instead of just throwing those chips out or disabling another core to make a dual-core, they're going to sell them as tri-core processors. Gives them another product to sell, and they can sell it for slightly more than a dual-core.


RE: bad quad core = tri-core
By FastLaneTX on 9/18/2007 11:01:43 AM , Rating: 2
Those who do not remember history...

Intel made gobs of processors that were supposed to be 486DXs. Some had a bad FPU and were sold as 486SX; others had a bad CPU and were sold as 487SX. Intel and AMD have refined that practice to making all their processors with the maximum amount of cache and selling the ones that partially failed validation as reduced-cache variants. Same thing they do with speed-binning.

So apparently AMD's model of a "native" quad-core chip has resulted in some that only have three working cores. Better to sell them than throw them away, right? They can price them low and gain market share, because it's basically free money. It's not as big a deal for Intel today since they only make two cores per die, but I bet you'll see the same from them once they go native. We'll be seeing 5- to 7-core chips from AMD soon after they ship a native 8-core.


RE: bad quad core = tri-core
By johnsonx on 9/18/2007 12:09:26 PM , Rating: 3
quote:
others had a bad CPU and were sold as 487SX.


Sorry, but that's wrong. The 487 'math co-processor' was a complete logical replacement for the on-board 486SX; for all intents and purposes, a 487 was actually a 486DX. Once you installed it, the 486SX became dormant. So naturally, there couldn't be any defects in the 487 core or it wouldn't work.

quote:
Intel and AMD have refined that practice to making all their processors with the maximum amount of cache and selling the ones that partially failed validation as reduced-cache variants.


Also incorrect. While that does occur, it is completely incorrect to say they make 'all' their processors that way. In reality very few are made that way, as it would be extremely wasteful of wafer space, and therefore money.


Only company with Triple core processor? Yeah right
By Azsen on 9/18/2007 7:11:48 AM , Rating: 2
"AMD claims to be the only company to offer tri-core processors"

http://en.wikipedia.org/wiki/Xenon_%28processor%29

The IBM processor in the Xbox 360 is triple core.




By FITCamaro on 9/18/2007 9:34:35 AM , Rating: 2
If you're even comparing the two chips you need to have your head examined.

And last I checked, you can't stick that piece of crap (I love my 360 btw) in a PC and run an OS on it. It's an extremely simple in-order PowerPC processor with horrible branch prediction. Same thing as the Cell's PowerPC core.


By TomZ on 9/18/2007 10:29:02 AM , Rating: 2
I think you missed the point. The article said,

AMD claims to be the only company to offer tri-core processors, which the company claims to bring “true multi-core technology to a broader audience.”

Notice this is not qualified as x86-only. Sure, this tri-core might be orders of magnitude better/faster/whatever, but AMD is clearly wrong to claim theirs is the only tri-core processor.


By siberus on 9/19/2007 12:30:11 AM , Rating: 2
"AMD claims to be the only company to offer tri-core processors, which the company claims to bring “true multi-core technology to a broader audience.”

AMD is not claiming to have the only 3-core processor; they're claiming to have the only one slated for sale to users/system builders. IBM isn't offering their 3-core processor to the masses. As far as I know, IBM's 3-core is only in the 360. So the claim is correct at this point in time.


By TomZ on 9/19/2007 9:17:40 AM , Rating: 2
Huh? In your own quote "AMD claims to be the only company to offer tri-core processors" - where is that statement qualified to apply to users/system builders?!?


By siberus on 9/19/2007 10:33:51 AM , Rating: 2
The point of my quote was to pinpoint the word offer; guess I should have underlined it. To offer and to have are two totally different things. Tell me which computer store I can buy a 3-core IBM chip and mobo at, and I will say that AMD isn't the only one planning to offer 3-core chips. If they said they were the only one to have a 3-core, people would cry blasphemy on them in a heartbeat.

“With our advanced multi-core architecture, AMD is in a unique position to enable a wider range of premium desktop solutions, providing a smarter choice for customers and end users,”

"AMD is committed to working with our OEMs to deliver compelling value propositions across their multi-core product families with capabilities that address their requirements and aspirations.”

I'm a casual poster, so by nature that must make me a lazy one. So maybe I should have thrown those quotes in the first time to cover my bases? It wasn't the crux of my argument though, so I guess I didn't feel the need.


By TomZ on 9/19/2007 12:25:44 PM , Rating: 2
I don't know the part number of the IBM processor, but you can buy these all day long at electronics distributors:

http://www.infineon.com/tricore/

I also wanted to point out that "offer" doesn't mean "available at a retail level." Just because IBM doesn't sell the processor retail and there are no PC motherboards for it doesn't mean that IBM doesn't offer a 3-core processor.

Over and out...


By siberus on 9/19/2007 3:34:14 PM , Rating: 2
quote:
The TriCore® is the first unified, single-core, 32-bit microcontroller-DSP architecture optimized for real-time embedded systems. The TriCore® Instruction Set Architecture (ISA) combines the real-time capability of a microcontroller, the computational power of a DSP, and the high performance/price features of a RISC load/store architecture, in a compact re-programmable core.


LOL, it has a misleading name - it's still single-core :S


By Flunk on 9/18/2007 10:37:36 AM , Rating: 1
"And last I checked, you can't stick that piece of crap (I love my 360 btw) in a PC and run an OS on it."

Of course not; it uses a different instruction set. That doesn't make the chip useless. This statement is as stupid as saying the Core 2 is useless because it won't run programs written for Sun SPARC.

"It's an extremely simple in-order PowerPC processor with horrible branch prediction. Same thing as the Cell's PowerPC core."

Now you are comparing the Cell and Xenon processors? Using similar sets of instructions does not make the architectures the same. Xenon is a symmetric multiprocessing system with 3 cores, while the Cell is an asymmetric multiprocessing system with 1 PPE (somewhat similar to the cores in Xenon) and 8 simplified SPEs with a limited set of capabilities.

I think that's enough; if I keep typing there will be an entire article here. Do you even know anything about how microprocessors work?


By deeznuts on 9/18/2007 11:17:28 PM , Rating: 2
Azsen, apart from ripping the CPU out of the 360 (preferably a dead modded one that has no warranty, so no 360 has to die!), where can one purchase one of these 360 CPUs? If you can't, then AMD is still technically correct in saying they are the only ones to offer it, yeah?


Simple questions & economics
By crystal clear on 9/18/2007 10:15:09 AM , Rating: 2
AMD, in a big hurry to get their products out into the market, should ask themselves these questions.

"From a product offering, a company can offer three cores, but the question is, Where will the vendors put it, what will the price from the whole system be, and what markets will the vendors target?" Shim said. "Differentiation never hurts in the PC industry, but it comes down to economics and what the ultimate offering will be when the product is offered in a system."



http://www.eweek.com/article2/0,1895,2184441,00.as...

Is there a sufficient market for this type of CPU?

What does AMD gain by introducing such a processor?

Will prospective buyers be impressed enough to buy it?

A lot of questions for a company that cannot afford to make more mistakes - a big RISK at a bad time.........




RE: Simple questions & economics
By TomZ on 9/18/07, Rating: -1
RE: Simple questions & economics
By crystal clear on 9/18/2007 11:01:28 AM , Rating: 2
Hi there - very happy to hear from you - always a pleasure.

Hectic period here in Israel - Jewish festivals.

From Rosh Hashana to Yom Kippur (this Friday) to Succoth, week after week.

"Your analysis is correct"


RE: Simple questions & economics
By BitJunkie on 9/18/2007 12:54:21 PM , Rating: 2
...Other than the fact that if they are astute in their pricing, they will be only marginally more costly than dual-core CPUs, meaning they then control that market segment. Intel can't compete because of their technology and is forced to fight it out in the quad-core market, thus reducing their advantage. So ultimately AMD stands a chance of fragging the dual-core market and forcing Intel to make their money off of quad-cores.

It may be debatable technically - but by the time the marketing monkeys have done their thing, AMD is potentially going to be in a superior position at the price point we have associated with dual-core CPUs. If they can sell tri-core (aka crap quad-core) CPUs at a healthy margin, then they are going to be looking good. It's about the best move they could make - I'm well impressed.


By crystal clear on 9/18/2007 1:12:10 PM , Rating: 2
quote:
I'm well impressed.


Yes, but as I said earlier -

Will prospective buyers be impressed enough to buy it?



Anyway, nice reading your comments.


RE: Simple questions & economics
By TomZ on 9/18/07, Rating: 0
RE: Simple questions & economics
By BitJunkie on 9/18/2007 4:12:59 PM , Rating: 2
I guess the way I see it is that 3-core AMD processors will be competitive in terms of price and performance with Intel 2-core processors. With the right marketing spin they may gain some advantage and dominate the high-volume, low-margin sales using silicon that would otherwise have been thrown away. "Three for the price of Two", anyone?

The premise being that "Next year 2-core will be only for cheapie PCs" becomes "Next year 3-core will be only for cheapie PCs". As for what happens to Intel 2-core CPUs at that point, who knows, maybe they'll be used to make recycled building materials? :)


RE: Simple questions & economics
By TomZ on 9/18/2007 4:49:23 PM , Rating: 2
I agree, and I see what you're saying. I guess a lot depends on how things price out in the end. I just think that since Intel doesn't have a 3-core (yet?), they may peg their 4-core processor at that same price point. I guess we'll see.


RE: Simple questions & economics
By CyborgTMT on 9/18/2007 5:11:51 PM , Rating: 2
The 3-core offering can also be competitive with high-end quads if AMD handles this the right way. I'll use some simplified numbers since this is all speculation at this point anyway.

Let's say a 3.0GHz quad-core sells for $1000 while a step-down quad at 2.8GHz sells for $900. With the 2.8GHz part, 2.8 is the highest stable speed across all 4 cores - but what if 3 of those cores can run at 3.4GHz? AMD can cut off the slow core and sell the chip as a tri-core running at 3.4GHz, which should match or beat the 3.0GHz quad in a lot of apps. This lets AMD sell what was originally a $900 part for $1000 or more.
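
The same logic as a toy Python sketch (the clocks and prices are this post's hypothetical numbers, not AMD's actual pricing):

    def best_sku(core_clocks):
        """Pick the higher-revenue bin for a die, given each core's maximum
        stable clock (toy model using the example prices from this post)."""
        quad_clock = min(core_clocks)             # quad binned at slowest core
        tri_clock = min(sorted(core_clocks)[1:])  # drop the slowest, re-bin
        quad_price = 900 if quad_clock >= 2.8 else 600
        tri_price = 1000 if tri_clock >= 3.4 else 500
        return ("quad", quad_price) if quad_price >= tri_price else ("tri", tri_price)

    print(best_sku([3.4, 3.5, 3.4, 2.8]))  # -> ('tri', 1000)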


RE: Simple questions & economics
By CyborgTMT on 9/18/2007 5:24:07 PM , Rating: 2
Another thought I just had... if AMD 'kills' the cores in a way that the end user can change, this will really be interesting. On one end of the spectrum, if you can reactivate a 'dead' core, you can get a quad version cheaper - granted, that's only if the last core wasn't cut off because it's completely dead. I would be more willing to bet that the 'bad' core can't hit the target GHz within the right thermal envelope; nothing a good 3rd-party cooling solution can't fix. Or you can take it to the other extreme, where a fully functional high-end quad can have its slowest core disabled and be overclocked much higher.


RE: Simple questions & economics
By zornundo on 9/20/2007 10:22:03 AM , Rating: 2
I wonder if it's not a case of capacity increasing with every die shrink. As dies shrink and yields go up, Intel may be stuck with so many processors that they only want to offer quad-core goodies, or they will oversupply the market with dual-cores.


RE: Simple questions & economics
By CyborgTMT on 9/18/2007 1:38:19 PM , Rating: 2
quote:
Is there sufficient market for such type CPUs ?

That depends on the pricing. If AMD keeps dual-cores cheap and prices the tri-cores slightly higher, there is definitely room in the market for them.
quote:
What does AMD gain by introducing such a processors ?

Increased revenue from what would otherwise be trash. Let's say only 70% of each quad wafer yields fully working quads, but another 25% of the wafer will run as tri-cores; then you're only losing 5% of the wafer to defects.
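
In die counts (same made-up 70%/25% split, purely illustrative):

    # Hypothetical yield split from this post, applied to an example wafer.
    dies = 200
    quads = int(0.70 * dies)     # all four cores good -> 140
    tris = int(0.25 * dies)      # exactly one bad core, sold as X3 -> 50
    scrap = dies - quads - tris  # unsellable -> 10
    print(f"scrapped: {scrap / dies:.0%} of the wafer")  # -> 5%
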
quote:
Will the prospective buyers be impressed enough to buy it ?

If a tri-core can be binned at 2.8-3.0GHz because 3 of the cores can run at that speed while the fourth can only hit 2.0GHz, and the price point is below that of a quad of the same performance, many people will buy it. In reality, very few programs will take advantage of a quad over a dual-core right now anyway. I can see these being marketed towards gamers, where higher core speed is more important than the number of cores. Take this example: if a tri-core running at 3GHz matches up in (synthetic) benchmarks to a quad running at 2.6GHz and is at a cheaper or equal price point, the gamer is going to buy the tri.
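
Putting toy numbers on that comparison (assuming performance scales linearly with clock and usable cores, which real apps only approximate):

    tri_cores, tri_ghz = 3, 3.0
    quad_cores, quad_ghz = 4, 2.6
    # A game using <= 3 threads sees per-core clock; heavy threading sees totals.
    print(f"<= 3 threads: {tri_ghz / quad_ghz - 1:+.0%} for the tri")  # -> +15%
    print(f"fully loaded: {tri_cores * tri_ghz} vs {quad_cores * quad_ghz} GHz total")  # 9.0 vs 10.4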


Interesting
By Nubsicles on 9/17/2007 11:01:48 PM , Rating: 2
While at first glance I took this to be a bad marketing scheme on AMD's part, this could add another great performance/price bracket to the market.




RE: Interesting
By BWAnaheim on 9/17/2007 11:09:52 PM , Rating: 5
To me, it sounds more as if all of the quads are not passing verification testing, so AMD is taking the chips with three good cores and calling them "tri-core". It would make sense if they could use this extra product to increase overall yield.


RE: Interesting
By kamel5547 on 9/18/2007 12:29:29 AM , Rating: 2
Agreed, this smacks of defective quad-cores needing to be sold at a higher price point than simple dual-cores. They probably could have just dumped them as dual-cores by disabling another core, but this gives better pricing.


RE: Interesting
By SlyNine on 9/19/2007 4:41:33 AM , Rating: 2
Or the same pricing with a competitive advantage. I can see the tri-core being a high-end budget CPU, which would probably sell very well if priced right.


RE: Interesting
By FITCamaro on 9/18/2007 9:20:51 AM , Rating: 2
You think Intel doesn't have some cores of its dual core dies go bad? There will always be defects in production.

It's good business to sell as much of your produced product as possible. By selling these as triple-core processors instead of throwing them out, they can sell all the different Barcelona chips at a lower price.


RE: Interesting
By docmilo on 9/19/2007 2:08:43 AM , Rating: 3
http://www.wired.com/techbiz/it/news/2007/09/trico...
This interview states they are bad quads.

It makes sense, even though I was hoping they were prepping a CPU for Fusion - make a good 3-core CPU with room for a GPU once the bugs are all worked out.


RE: Interesting
By RW on 9/18/07, Rating: -1
RE: Interesting
By TomZ on 9/18/2007 10:12:04 AM , Rating: 1
I think the CPU market right now already has too many options. I mean, geez, do we really need 50 different x86 compatible CPU models from only 2 different manufacturers?


RE: Interesting
By SlyNine on 9/19/2007 4:43:44 AM , Rating: 2
But if a better option comes out and has the right marketing, it can generate market share. And since the mid-to-high-end budget CPU sector is the bread and butter, I can see this doing very well.


name for 3-core
By kintoy on 9/18/2007 4:09:31 AM , Rating: 5

AMD should name the 3-core proc the Tri-Athlon.

AMD is selling a 3-core chip because Intel can't. Intel's "quads" are actually dual dual-cores. And they don't have Direct Connect.




RE: name for 3-core
By GeorgeOrwell on 9/18/07, Rating: -1
RE: name for 3-core
By kintoy on 9/18/2007 4:16:58 AM , Rating: 2

vaporware? it's not Q1 of 2008 yet. geez. r u an Intel hack?


RE: name for 3-core
By TomZ on 9/18/2007 10:04:40 AM , Rating: 2
"Vaporware" is a product that's been talked about but not released yet, especially applied to products that are far behind schedule. So it's pretty fair to characterize Phenom as vaporware since it meets all these criteria.

geez. r u an AMD hack? :o)


RE: name for 3-core
By z3R0C00L on 9/18/2007 12:14:50 PM , Rating: 2
nice reply..:)

He sounds like the typical partisan hack making assumptions without verifying the facts. In other words a partisan Democrat or Republican hack..:p

Different subject, same tactics.


RE: name for 3-core
By kintoy on 9/19/2007 2:55:27 AM , Rating: 2

Phenom will only become vaporware if it isn't available in the 4th quarter, Intel hacks.


RE: name for 3-core
By SmokeRngs on 9/21/2007 5:46:58 PM , Rating: 2
quote:
a product that's been talked about but not released yet


That's not vaporware. That is an announcement of an upcoming product.

quote:
"Vaporware" is a product that's been talked about but not released yet, especially applied to products that are far behind schedule. So it's pretty fair to characterize Phenom as vaporware since it meets all these criteria.


Actually, Phenom doesn't fit the requirements for vaporware. The release of Barcelona proves that the product is real. Considering the major architecture of Barcelona and Phenom is the same (as it was with the A64 and Opteron), Phenom is an actual physical, working product. However, it has not been released yet.

Phenom, while having its release date pushed back, is not way behind schedule. Late launches are nothing new in the CPU industry. While pushing back the launch and availability is regrettable, it's not a criterion of vaporware.

If you want examples of vaporware, look up Bit Boys and Duke Nukem Forever.


Masterstroke
By ctoit on 9/17/2007 11:51:44 PM , Rating: 5
Like most, I thought this was an odd move on the part of AMD at first glance. But after thinking it through, I think it's a masterstroke.

For all of you who keep saying that there are limited mainstream applications out there for multi-core CPUs: you obviously do not use spreadsheets like I use spreadsheets, and you obviously have not felt the power of Excel 2007 running multi-core. For people like me with Excel workbooks running into the upper 100MBs on a routine basis, multi-core and Excel 2007 are a godsend.

So back to AMD. They are the only kid on the block with a native quad-core. And yes, maybe many of them fail some test, and instead of junking them, why not market them as tri-cores? Also, the battle for quad-core is only beginning and Intel already has a long head start, so AMD will find the battle for quad-core supremacy tough to win anyway.

But the genius stroke is actually this: AMD has obviously lost ground to the Core2Duo with the Athlon X2s and is unlikely to win that race anytime soon. But hey, if they can push out an X3 for the same price (or not much more) as a Core2Duo, then it's a no-brainer as far as what I will choose. One more core will get my Excel 2007 running another 30% faster, period.

And likewise in the mass market, with mum and pop shopping for their next family PC or the kiddies' back-to-school PC: geez, this costs $199 and has 2 cores, and this costs $200 and has 3 cores, mmmm, what should I choose?

So kudos to AMD. Hope you start gaining some traction back from Intel. An AMD fan since the K6, I was finally converted by Core2Duo. But I look forward to AMD retaking the crown.




RE: Masterstroke
By z3R0C00L on 9/18/2007 12:17:05 PM , Rating: 2
And a Native Quad Core is important because...... like what actual advantage does it bring to the consumer?


RE: Masterstroke
By TomZ on 9/18/2007 1:21:02 PM , Rating: 2
None, it is primarily an AMD marketing distinction at this point.

I think it is more sensible to switch over to native quad core once yields are high enough to support them. Otherwise MCM-style quad cores are fine. Therefore it becomes a question of which is cheaper to produce.


RE: Masterstroke
By theapparition on 9/18/2007 1:07:18 PM , Rating: 2
quote:
And likewise on the mass-market, with mum and pops shopping for their next family PC or the kiddies summer back to school PC. Geez, this cost $199 and have 2 cores and this cost $200 and have 3 cores, mmmm, what should I choose?

Sorry, but the mass market doesn't buy boxed processors. And AMD's price structure for these X3's is pure speculation at this point.
For the mass-market lemmings, that little "Intel Inside" sticker goes a long way. People are even willing to pay more for it. That explains why Intel still dominated during the Athlon64 years.


Why AMD is doing this
By Justin Case on 9/18/2007 2:08:49 AM , Rating: 5
Why is AMD selling 3-core CPUs?

The short answer: because they can.

The slightly longer answer: because they can and Intel can't.

AMD's Barcelona / Phenom has trouble scaling beyond a certain speed, but yields are actually pretty good at the speeds that work. Sure, if a 4-core CPU has one dead core, they can sell it as a 3-core CPU, but most 3-core Phenoms will be perfectly functional 4-core models with one core deliberately disabled.

This lets AMD sell a product that Intel can't (Intel's design can do 2 or 4 cores, but doing 3 would require a lot of changes), at a price point somewhere between dual and quad-core CPUs.

If a 3 GHz dual-core costs $200 and a 3 GHz quad-core costs $1000, anyone with, say, $500 to spend on a CPU is going to buy the $200 model (and either keep the rest or spend it on other components). That's $300 less in profit for the CPU manufacturer. So the smart move is to add a $500 model to your lineup. That can be a very slow quad-core model (which will only perform well in servers, so home users won't buy it), a very fast dual-core model (which they can't make) or... a model with 3 cores.
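
A toy sketch of that price-ladder logic in Python (prices are this post's examples; "each buyer takes the priciest CPU within budget" is the simplifying assumption):

    def revenue(lineup, budgets):
        """Each buyer picks the most expensive CPU at or under their budget."""
        return sum(max((p for p in lineup if p <= b), default=0) for b in budgets)

    budgets = [200, 500, 1000]                 # one buyer at each budget
    print(revenue([200, 1000], budgets))       # dual + quad only -> 1400
    print(revenue([200, 500, 1000], budgets))  # add a $500 tri-core -> 1700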

This is something that GPU makers have figured out a long time ago: the more models you can put out without increasing production costs, the higher your profits will be.




RE: Why AMD is doing this
By GeorgeOrwell on 9/18/07, Rating: 0
RE: Why AMD is doing this
By Justin Case on 9/18/2007 6:08:20 PM , Rating: 2
Why do you hate our troops?


RE: Why AMD is doing this
By TomZ on 9/18/2007 8:41:24 PM , Rating: 2
quote:
Why do you hate our troops?

So this is the kind of PC BS rhetoric we have to hear any time anybody mentions something even remotely related to the military? I got an earful of this kind of crap in another thread from someone else, and it's way off-base.

OK, let me explain to you what the OP meant, although I think it is pretty obvious. The Vietnam War was considered by most to be a political and/or military failure for America. It went on for years and cost many lives and much money. The analogy is to some of AMD's products that are running late and putting AMD in the hole to develop, and the OP is expressing some concern that the performance may not be worth what has been spent, possibly relating to recent mediocre benchmarks.

Nobody is saying they "hate our troops." Clear?


RE: Why AMD is doing this
By Justin Case on 9/18/2007 10:53:32 PM , Rating: 1
Are you just being an asshole as usual, or do you really have no concept of sarcasm? Why don't you go back to your hobby of rating down people's posts on unrelated threads? It's also lame, but at least it generates less HTML.


conflicting information
By phatboye on 9/18/2007 12:05:26 AM , Rating: 4
OK, according to X-bit Labs, the "X3" is not an X4 with one core disabled

http://www.xbitlabs.com/news/cpu/display/200709142...
quote:
The new triple-core microprocessors will feature its own design and will not be quad-core chips with one core disabled, according to the web-site.


But according to this site, it is. Or am I reading this wrong, because it says "essentially"?
quote:
Essentially, the Phenom triple-core processors are quad-core variants with one core disabled.




RE: conflicting information
By jak3676 on 9/18/2007 12:28:54 AM , Rating: 2
Interesting - the article said it may not be a quad-core with a core disabled, but its own design that could be turned into a dual-core if one of the three cores had issues.

Everything I read on this seems to somehow tie it back to a fail-safe in case there are issues with one of the cores.


RE: conflicting information
By Justin Case on 9/18/2007 11:03:54 PM , Rating: 2
While AMD may have some plans for "native" 3-core chips (or chips with 3 x86 cores plus an on-die GPU), at this moment that would not make any commercial sense. Producing two different designs would be far more expensive than simply fusing off one of the cores in a quad-core chip.

That X-bit Labs article simply quotes from "Hard Tecs 4U", which I had never heard about until now, and the "Hard Tecs" article doesn't mention any sources.


Traditional AMD
By profounder on 9/18/2007 3:09:04 AM , Rating: 1
Really not stunned at the news, since AMD was born with the ability to boast around the planet about every small thing they have made. And this time, it's a really strange thing they've made.

AMD is really great at making things happen. HA. If they can create a triple-core processor today, no one dares guarantee they won't bring stranger ones to the already confusing market. Maybe when AMD deploys Ect-core the same way as Intel's combination, they will be much more creative with this crap.

It would be a really amusing thing to watch the market then, when AMD brings on the Sect-core, the Cinq-core, and, c'mon, the thirteen-core. YAHA! Maybe the grandest thing would be the future ultimate "UNSURE" processors, which would contain "not sure how many" cores. Imagine, someday in the future, we go to the computer store and buy one of AMD's unsure CPUs.

We will pray before we open the wrapper. After installation, we boot the system and BANG!.. we cry out "WOW" at our great luck in getting a fortunate cinq-core AMD CPU whose name should be "BLACK LISTED", 'cause that is the description they truly deserve. Or, another day, we go to the same store again to shop for another AMD to install on the same board, just hoping we can "WOW" again, only to be disappointed to discover we've got a mere triple-core AMD.
Yep! In the future, AMD could become a company selling CPUs as GAMBLING. People would be greatly interested in buying AMD stuff then, enjoying the thrill of feeling "lucky or unlucky."

We have no idea what Intel would have to do then... They might go crazy while crying inside.

AMD = A Marketing Devil
AMD = A Manufacturing Dull

I kinda hope this is not the route they will follow in the days ahead. I hope it is just me joking...




RE: Traditional AMD
By Justin Case on 9/18/2007 6:35:43 PM , Rating: 2
Actually AMD's manufacturing is pretty good these days (Intel is about 6 months ahead, but that's not new; the fact that AMD has managed to maintain - and even slightly narrow - that gap, despite a much smaller budget, is a testament to the quality of their engineers).

Their marketing, on the other hand, still sucks. Did you know AMD's new chips support SSE5, which includes several 128-bit instructions? If their marketing department had a clue, they'd be screaming about how they have SSE5 while "the competition only has SSE4", and how they support 128-bit instructions while "the competition has barely caught up with AMD64".

SSE5 is completely useless outside of some very specialized situations, of course (same goes for SSE4, for that matter). But in the hands of a good marketing department it could be a great tool.

Instead, what do they do? First they announce 4-core chips and then... 3-core ones. At least do it the other way around and it'll look like you're moving forwards, not backwards...


RE: Traditional AMD
By murphyslabrat on 9/24/2007 3:05:31 PM , Rating: 2
quote:
Did you know AMD's new chips support SSE5, which includes several 128-bit instructions?

Actually, SSE5 was announced for Bulldozer. And while some of those instructions might be available earlier, they are probably not going to see use for a little while.


The 3!
By Etern205 on 9/17/2007 11:01:12 PM , Rating: 1
AMD should also introduce a CPU with a speed of at least 3GHz!




RE: The 3!
By pattycake0147 on 9/18/2007 3:11:45 AM , Rating: 2
In the AnandTech preview here http://www.anandtech.com/cpuchipsets/showdoc.aspx?... they said
quote:
With 2.5GHz in hand today, we'd expect Phenom to be at or below 2.6GHz by the end of the year, with 3.0GHz coming sometime in 2008.

As great as 3.0GHz would be for performance with this architecture, what they need to do is get the new lineup out, whether it's at 2.0, 2.5, or whatever. They just need something competitive to challenge Intel.


RE: The 3!
By GeorgeOrwell on 9/18/07, Rating: 0
RE: The 3!
By murphyslabrat on 9/18/2007 7:07:22 PM , Rating: 1
I found this incredibly funny, and I am an AMD guy. Just because he's criticizing AMD for their horrible customer-service track record in recent times doesn't mean he should be crucified.

Free speech is a fundamental part of our society. Susan B Anthony, Martin Luther King, and Art Bell were all very vocal. The validity of their claims varied, but they had the right to speak.


AMD Phenom triple-core
By Takashoo on 9/18/2007 3:19:52 AM , Rating: 2
Wafers are circular and microprocessor chips are rectangular - except at the edges of the wafer?




RE: AMD Phenom triple-core
By KernD on 9/18/2007 10:37:30 AM , Rating: 2
I don't think so; you would need a redesign of the chip to do this, because the interconnect and memory links sit on the edge of the die. You'd be missing part of the HT link, or part of the DDR2 interface.

Missing an HT link probably isn't a problem, since only multi-processor Opterons use more than one, but I still think it's broken or slow cores that are deactivated to make these.


Nice move
By fikimiki on 9/18/2007 2:09:11 AM , Rating: 1
I would like to move to another world and see the same people commenting on an Intel 3-core introduction.
If this is true, AMD is going to offer another set of nice CPUs.

So what are you going to do? For example:

Buy an AMD X3 2.4GHz for $139
Buy an Intel C2Q 2.4GHz for $189
Buy an Intel C2D 2.4GHz for $129

A price this close to a dual-core's will push some people to buy three cores instead of two. If the multi-core world holds, then
2 is better than 1
3 is better than 2
4 is .....
well?




RE: Nice move
By gregjet on 9/20/2007 5:16:31 AM , Rating: 2
I have a totally sideways view of this. AMD now owns ATI, and ATI has been working with ring architectures for quite a while. Two cores don't need a ring bus, and three cores with duplex links give THE fastest inter-CPU communication possible on a ring architecture: every core is directly adjacent to every other. Beyond three, there is at least one core between the originating core and the farthest core. By optimising the CPU geometry (communicating duplex in 2D only - connection 2D, that is, NOT video 2D), you have the fastest CPU geometry possible. My bizarre 2 cents' worth. It makes the purchase of ATI make more sense, though.
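As a quick illustration of that adjacency claim, here is a minimal sketch in Python. Nothing AMD- or ATI-specific is assumed; it just counts hops on an abstract bidirectional ring:

# On a bidirectional (duplex) ring of n nodes, a message can travel either
# direction, so the worst-case distance between any two nodes is floor(n/2).

def worst_case_hops(n: int) -> int:
    """Maximum ring hops between any pair of n nodes on a duplex ring."""
    return max(
        min(abs(a - b), n - abs(a - b))  # shorter of the two ring paths
        for a in range(n) for b in range(n)
    )

for n in (2, 3, 4, 6, 8):
    print(f"{n} cores on a duplex ring -> worst case {worst_case_hops(n)} hop(s)")

# 3 cores is the largest count where every core is one hop from every
# other; from 4 cores up, some pairs have another core between them.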


AMD says....
By iGo on 9/18/2007 12:45:48 AM , Rating: 2
Hello users... Threesome Anyone?? ;)

No technological marvel here, though it's a very good marketing move.
The fact is, AMD stands apart from the dual-core/quad-core race here: it gets rid of bad quad-core samples in its stock by selling the remaining three cores instead of junking the die. And like someone posted here, it gives users another option if they price it around the same cost as a Core 2 Duo.

Good to see that AMD is upping its game somewhere, technologically or marketing-wise.




Why 3 cores, because they can't get 4 to work.
By ChipDude on 9/18/07, Rating: 0
By Expunged on 9/18/2007 1:24:39 PM , Rating: 3
Sounds like an Intel Celeron to me... It wasn't that long ago when Intel would take a chip with L2 cache errors and disable half the cache to make a Celeron. The same logic would apply to cores.

As sarcastic as your post seems, it actually is a smart move if they can sell them. If it really is just a disabled core (I see there is some arguing over this point), then chips that would otherwise be scrap could be sold to help their bottom line. Right now AMD needs all the help it can get on the bottom line, so this might not be a bad move.


Bone headed move?
By wordsworm on 9/18/2007 2:56:24 AM , Rating: 2
Most of you guys know I'm a supporter of AMD initiatives and have a high opinion of their products and vision. However, this decision really has me scratching my head. I just don't see how they could save money per chip sold on the manufacturing end: if there are four cores on the die, surely it costs the same to make whether three are active or four. By the time they're ready to start shipping Phenom processors, it's likely the Q6600 will be going for less than $250. Maybe someone here can enlighten me.




Second thought
By wordsworm on 9/18/2007 8:00:14 AM , Rating: 2
I can think of one scenario where going for 3 cores on a chip might prove to be a good move. I was thinking about how having the option of working with multiples of 3 or 2 could make for better real estate choices. 9 cores on a chip would probably work out better than 8. Furthermore, 3 cores fit on a circle quite efficiently. I don't know the dimensions of what they have to work with, but it seems a plausible conjecture based on no evidence whatsoever. What I remember from pre-calculus is that a circle is nature's most efficient shape. I wonder if any of this has any bearing on their decision to make the move to a 3 core design or if I'm just off the deep end.




Fusion?
By clovell on 9/18/2007 10:35:18 AM , Rating: 2
Has anyone considered that this may be some kind of move to set the stage for the Fusion platform? The idea just keeps nagging me, but I can't seem to make it fit.




By Roy2001 on 9/18/2007 12:10:07 PM , Rating: 2
If a quad-core CPU has a defect and one core isn't working, then there is a chance to make it work as a triple-core CPU. That's the whole point. A 284 sq mm die is just too large, production-cost-wise, to throw away.
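To put rough numbers on that, here is a minimal sketch of the salvage arithmetic. The defect density and core-area fraction are made-up illustrative values, and the simple Poisson yield model is an assumption; none of this is AMD-published data:

import math

# Hypothetical numbers for illustration only: a ~284 sq mm die and an
# assumed defect density, fed into the classic Poisson yield model
# Y = exp(-A * D). None of this is AMD-published data.
die_area_cm2 = 2.84          # ~284 sq mm Barcelona-class die
defects_per_cm2 = 0.5        # made-up defect density

yield_all_good = math.exp(-die_area_cm2 * defects_per_cm2)

# Crude assumption: defects land uniformly, a die with exactly one defect
# has that defect inside one of the four cores with probability
# core_area_fraction, and fusing off that core salvages the die.
core_area_fraction = 0.5     # assume the four cores cover half the die
expected_defects = die_area_cm2 * defects_per_cm2
p_one_defect = expected_defects * math.exp(-expected_defects)  # Poisson, k=1
salvageable = p_one_defect * core_area_fraction

print(f"Fully working quads:         {yield_all_good:.1%}")
print(f"Salvageable as tri-cores:    {salvageable:.1%}")
print(f"Sellable dies, with salvage: {yield_all_good + salvageable:.1%}")

With these made-up inputs, salvage lifts the share of sellable dies from roughly 24% to roughly 41%, which is the whole economic argument in miniature.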




Same as above, but with...
By bupkus on 9/18/2007 1:06:30 PM , Rating: 2
I agree with those posts about not wasting three good cores when one out of four is bad, with a small augmentation.
Suppose one out of four isn't bad, but just bins at a lower-than-desired frequency. Cutting out the runt of the litter may allow a higher bin and a higher price.
I'm not saying to kill a perfectly good quad, only to operate on a dog dragging a slow leg.
Ok, so the metaphors ARE a little sick. :p
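That binning point is easy to make concrete. A tiny sketch with made-up per-core frequencies, assuming (as is conventional, though not confirmed for Phenom) that a chip ships at the speed of its slowest enabled core:

# Hypothetical per-core maximum stable frequencies (GHz) on one die.
# Assumption: the whole chip is binned at its slowest enabled core,
# so fusing off the runt can raise the sellable speed grade.
core_fmax = [2.6, 2.5, 2.6, 2.2]  # made-up numbers

quad_bin = min(core_fmax)                           # all four cores enabled
tri_bin = min(sorted(core_fmax, reverse=True)[:3])  # drop the slowest core

print(f"As a quad-core: ships at {quad_bin} GHz")
print(f"As a tri-core:  ships at {tri_bin} GHz")
# A 2.2GHz quad vs. a 2.5GHz tri-core: the three-core part may fetch
# the higher price - exactly the "runt of the litter" point above.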




Old hat
By Oregonian2 on 9/18/2007 2:00:22 PM , Rating: 2
Selling parts that are really larger parts with portions disabled, in order to improve production yield, is as ancient as semiconductor manufacturing. It hasn't been as openly popular of late, but it used to be rampant in the olden days, even with non-binary memory sizes. It's not a matter of 'ramping up a new product cycle'; it's a matter of changing the ink marking at the chip testing station and adding a new line to the brochures, etc. Not a big deal.




Typo
By mars777 on 9/19/2007 10:08:02 AM , Rating: 2
processors followed the current *Phenon* naming scheme and received the Phenom X3 name.




Decent enough move.
By gochichi on 9/23/2007 9:43:24 AM , Rating: 1
Developers going through the internal changes necessary to program for two cores are probably looking at the fact that quad-cores are already extremely cheap, and are therefore actually working on increasing performance via 2+ cores, whatever that number may be. Also, if individual apps increasingly start targeting two cores, then having an extra core as a throw-in is quite nice.

I just don't know how the pricing magic is going to work out on this. A 3-core processor sounds like a sub-$175.00 part. These new price points are crazy; even right now, a not-too-shabby dual-core for $90 is just nuts. AMD was pushed into the 3-core space because they have to deliver enough value to stay afloat in this crazy world - a world where I'd much rather be designing aftermarket CPU coolers than be pitted against today's Intel.

When Intel already has a quad-core for $275.00 (I couldn't care less about the details; it performs like a computer with 4 CPUs, and that's all I care about), there's very little room left to work with. This next generation is going to have to be all about value, or not many fish will bite.

A $1000 part? Are you serious? Who's buying those? The $300.00 parts are going to be just crazy if AMD enters that market again (there's essentially no competition right now).

Frankly, I was expecting 8-core processors to be the norm by the end of 2008. I guess I was wrong. I'm still hopeful that 8GB of RAM becomes the norm. I'm sure feature creep can keep up :)




"You can bet that Sony built a long-term business plan about being successful in Japan and that business plan is crumbling." -- Peter Moore, 24 hours before his Microsoft resignation

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki