
Even before AMD's 65nm K10 architecture hits store shelves, the company is talking about the 45nm shrink

This summer, AMD will announce its first major architectural change since the introduction of the K8 architecture in 2003.  This new architecture, dubbed K10, will first make an appearance in the server space, with the introduction of the Barcelona-family processors.

K10 features a native quad-core design that incorporates a shared L3 cache, HyperTransport 3 support and backwards compatibility with AM2 motherboards.  However, the original K10 desktop and server processors will debut on the 65nm process -- a node AMD only started mastering in December 2006 with the launch of the Brisbane desktop CPU family.

In the second half of 2008, AMD will begin to migrate its K10 architecture to the 45nm node.  AMD explicitly mentions that its 45nm process technology utilizes silicon-on-insulator (SOI).  Intel's 45nm process node, slated for introduction later this year, uses conventional bulk CMOS technology. 

The halo AMD 45nm chip, Deneb FX, shares the same functionality as its 65nm counterpart, Agena.  Both families incorporate native quad-core designs and shared-L3 cache support.  Deneb FX goes one step further, adding support for DDR3 on the integrated memory controller.

However, the bulk of AMD's 45nm quad-core offerings will come with the Deneb (non-FX) family.  AMD suggests Deneb will be the first processor on the new AM3 socket.  Previous AMD documentation indicated that AM2 and AM3 would be forward/backward compatible -- yet AMD engineers claim the AM3 alluded to in 2006 is not the same AM3 referenced in the 2008 launch schedule. 

"At the time AM3 was the likely candidate to become AM2+," claimed one field application engineer familiar with AMD's socket migration. "[AMD] wanted to keep the socket name associated with DDR2 memory and backwards compatibility, but AM3 emphasizes DDR3 support."

After Deneb, and closer to 2009, AMD's guidance states that the 45nm Propus and Regor will replace the 65nm Kuma and Rana mid-range products.  Propus is very similar to Deneb: 45nm, shared L3 cache, AM3 package.  However, Propus will only feature two cores.  Regor is identical to Propus, but will not include shared-L3 cache support.

AMD's low-end single-core Athlon 64 and Sempron lines appear consolidated with the introduction of the Sargas family.  Sargas is an optical shrink of the 65nm Spica core, with the addition of DDR3 support and AM3 packaging.  AMD's ultra-low-end Sparta family, slated for introduction this year to replace the Manila-family Semprons, has no successor.

AMD product managers are keeping details of their 45nm technology close.  However, this past January AMD and IBM jointly announced plans for high-k, metal gate transistors on future 45nm and 32nm processors. 

This past February, AMD senior vice president of technology development Douglas Grose said the company is still deciding whether it will use high-k metal gate technology in later 45nm revisions or wait until 32nm.

Intel also announced its intention to debut high-k, metal gate technology on its 45nm node, but the company went one step further to confirm this new transistor technique will appear on the Penryn processor.  Intel guidance suggests Penryn will see its first retail availability late this year -- at least a year before Deneb.

Marty Seyer, AMD senior vice president, recently disclosed AMD's 45nm server offering slated for release in 2008.  Seyer stated that Shanghai, the 45nm successor to Barcelona, would feature additional cache and other performance enhancements. 

Neither Seyer nor Grose would comment on what these performance enhancements are, though features from AMD's server products typically appear in its desktop components as well.


Learn to crawl before you can walk
By frobizzle on 5/2/2007 7:58:58 AM , Rating: 4
Announcements like this, two generations up, are dangerous. If you deliver, great, but if not, you end up with a reputation of pushing vaporware.

Don't get me wrong - I want AMD to succeed. Competition drives advancement and better pricing. This is sounding like a desperate move.

RE: Learn to crawl before you can walk
By James Holden on 5/2/2007 8:10:04 AM , Rating: 5
Not having clear, forward looking architecture paths is what put AMD in a bind with Core in the first place. Kudos to them this time around.

RE: Learn to crawl before you can walk
By Verran on 5/2/2007 9:52:07 AM , Rating: 3
That's a good point. AMD seems to have been caught off-guard by Core2, which would indicate a lack of planning. Given how dominant their A64 arch was prior to Core2, I wouldn't be surprised if that was the case.

The problem, though, is that AMD has really failed to deliver substantially over the last year or so, and they're starting to turn into the boy who cried wolf. Future plan announcements are great, but in the face of a lackluster 65nm "launch", delays on top of delays for K10, and at least another six months (probably much more) before anything new hits the desktop market, it seems like it's time for action, not words.

RE: Learn to crawl before you can walk
By PrezWeezy on 5/2/2007 1:39:22 PM , Rating: 2
AMD wasn't caught off guard. Intel and AMD have a very good idea what the other is up to. Intel announced about a year before C2D hit the shelves what they were up to. AMD just didn't act quickly enough because I think they were expecting a more loyal fan base. They were also very distracted with acquiring ATI. That took a lot out of them, both in time and money. I happen to prefer Intel, but I hope AMD comes out with something huge. I'd like nothing more than to see a see-saw effect with the performance crown. It just means better products for us. That, and I want to see some of the software groups get programs out there that can use the powerhouses we will be having in our living rooms.

RE: Learn to crawl before you can walk
By Oregonian2 on 5/2/2007 2:26:07 PM , Rating: 5
But a lot of Intel's announcements and new processors in the past turned out to be marketing fluff, little more. Useless multithreading and clocks in the sky doing comparatively little. New bus structures that make things go only a tiny bit faster. The difference is that the Conroe chips actually delivered, and at a low price point. Something very different, IMO, from most of Intel's previous processor introductions. AMD may have been surprised by the fluff actually being true for a change.

By JackPack on 5/2/2007 4:40:02 PM , Rating: 3
No need to make excuses for AMD.

The introduction of the Pentium M in 2003 already proved that Intel could deliver. It was something that could perform on-par with Athlon 64 while being extremely frugal with silicon and power. It was obvious that Intel could bring the Pentium M to the desktop. And AMD's reaction was to expect Intel to continue treading the NetBurst path onto Tejas and beyond?

AMD has much bigger problems if they're assessing Intel's capabilities by just looking at Intel's announcements and marketing.

RE: Learn to crawl before you can walk
By Justin Case on 5/2/2007 2:39:14 PM , Rating: 5
Although a "fan base" is important for public perception (well, for the "geek" public, anyway), it's not really where most of the money is. The money is in big OEM deals, like the ones AMD managed to get with Sun, Lenovo, Dell, IBM, HP, etc.. As long as those keep coming in, AMD couldn't care less about having "fans". Fans don't buy a million CPUs six months in advance, which is what a stable business needs.

AMD knew what Intel was up to, just like Intel knew what AMD was up to with the K8 (they didn't react sooner due to internal politics - their engineers tried to warn management). The fact is, except for the very-high-end, AMD is more or less matched with Intel right now (after the latest price cuts).

Remember that during the K8 vs. Netburst days, AMD was charging insane prices for their CPUs. Intel's recovery has merely "corrected" that.

AMD's OEM deals didn't all turn out as well as they hoped, but still, you can find a lot more AMD-based systems from Dell, HP, etc., than you could 3 or 4 years ago, and that's a good sign for competition. Also, AMD took a major blow acquiring ATI, but in the long-term that was really their only option. Intel is probably going to spend more on GPU R&D than AMD paid for ATI.

Some people have blogs dedicated to posting negative propaganda about AMD (usually with very sensationalistic titles), but news of its death is greatly exaggerated.

By Psychless on 5/2/2007 5:02:10 PM , Rating: 2
Yes, fans are only really useful for potato chip companies. Taste unfortunately has nothing to do with AMD's or Intel's chips, and they generally care more about Dell and HP than people who build their own computers.

RE: Learn to crawl before you can walk
By PrezWeezy on 5/3/2007 3:50:17 PM , Rating: 2
You forget that AMD started as a grassroots movement (if you can use that term, considering their "real" start as a second source of Intel chips for IBM) because they had cheap parts that performed well. They survived on the gaming market with 2%, then 6%, then a growing percentage of total sales, until they had a product that gave equal performance for a much cheaper price. THEN they started gaining market share at an incredible rate. Their "fan base" kept them alive. Not to mention almost the entire discrete GPU market is fan based.

I honestly don't remember AMD charging a ton for their products either. I remember that was the huge reason to go AMD: Intel chips were $400 for a decent proc. AMD had almost zero OEM support for a long time. They only recently started getting anything other than special orders from Dell and HP and the like (even though it did backfire on them).

Intel would never have developed a GPU if it wasn't for AMD buying ATI. That and nVIDIA's obvious unwillingness to join Intel in creating a graphics solution.

Overall I agree that actual "fans" might not be a huge percentage, but I believe that over 45% (I think it was actually higher, but I don't remember exactly) comes from channel partners such as myself, selling and recommending systems to customers. Anyone who says that OEMs are their only source of sales hasn't done the research to know that the channel of small builders is a huge stream for them.

I'm no naysayer of AMD. I happen to have a strong preference for Intel, as anyone who has read my comments on this site can tell you, but I would like to see AMD continue on to keep kicking Intel in the pants.

RE: Learn to crawl before you can walk
By derdon on 5/4/2007 2:38:57 AM , Rating: 2
"Intel would never have developed a GPU if it wasn't for AMD buying ATI. That and nVIDIA's obvious unwillingness to join Intel in creating a graphics solution."

You seem to be a bit young. Intel had graphics cards back when nVidia was fighting with 3dfx over the performance crown. That was when ATI shipped unimpressive "Rage" GPUs (back then nobody would have called them GPUs, though) and nobody really expected anything of them.
Intel has always been a player in the graphics field. They're not big among the hardcore graphics crowd, but the majority of PCs are equipped with an Intel graphics chip.

By PrezWeezy on 5/4/2007 6:32:33 PM , Rating: 2
Sorry I misspoke. What I meant was their recent endeavors into what we are assuming will be a discrete graphics solution. I wasn't talking about built in video. We know they have that and it's quite good for its playing field, but I was referring to the announcement they made last month, I believe, when they were talking about a "high FP chip."

By Justin Case on 5/6/2007 11:26:33 PM , Rating: 2
I'm sure that if AMD wanted to keep that great 6% market share, they'd give their "fans" priority over Dell, IBM and HP.

If they wanted to grow beyond that, they had to change the way they did business. Being cheaper and having some geek "fans" will only take you so far. What Ruiz realised was that AMD's sales weren't limited by price; going cheaper wouldn't make them sell more. What AMD needed was a bigger, preferably more stable market. And that meant bigger deals with OEMs. And since AMD is still limited by manufacturing capacity, selling more to OEMs meant delaying things a bit for the channel.

In the long run, that was probably the right choice. Maybe it could have been executed better, but there's no way a manufacturing company can grow more than 30% in a year without some hiccups.

P.S. - Actually AMD was a bit more than a "second supplier". They helped Intel with the (physical) CPU design and manufacturing processes of early x86 CPUs. They didn't really start working on CPU architecture internally until the K5, though.

By MarkHark on 5/2/2007 7:36:59 PM , Rating: 3
AMD just didn't act quickly enough because I think they were expecting a more loyal fan base.

They would have it, had they not screwed their previously loyal customer base by keeping their prices astronomically high for as long as they could. Many of those AMD fans were expecting to jump on the AMD dual-core wagon as soon as they could grab a nicely-performing part for under $200. They waited in vain for over two years while the cheapest X2 parts cost over $300 and refused to drop at all.

Guess who was first willing to offer AMD's customers that magical, long-awaited, sub-$200 dual-core CPU? Yep, you got it: it was not AMD. How could AMD expect their "loyal fans" not to jump on Intel's offer?

AMD conquered their legion of enthusiast customers by offering them good-performing parts at reasonable prices. They lost many of those very same customers by acting exactly like Intel used to do, that is, charging whatever they saw fit just because they could.

Don't get me wrong, if they could charge over $1000 for their top CPUs, that's very good for them, and I don't see a problem with that! What they forgot is that the majority of their customer base had always been on the "budget" side, and for those, $300+ was clearly not something they were happy to spend on an entry-level dual-core CPU. AMD itself neglected its customers by pushing overpriced parts on them. If it were not for Conroe, AMD would still be charging $300+ for a 3800 X2.

By the way, I'm one of the loyal customers AMD just lost, and I just got my first Intel cpu ever.

By flatout on 5/4/2007 8:33:18 AM , Rating: 1
SURE AMD, YOU'RE RIGHT THERE WITH INTEL .. your superior processes, your amazing performance, your quick time to market, and great communication with the public. How could I have ever doubted you ..

RE: Learn to crawl before you can walk
By Pythias on 5/2/2007 8:43:24 AM , Rating: 2
Here's hoping AMD can keep Intel on its toes.....Intel's toes, I mean.

RE: Learn to crawl before you can walk
By Crank the Planet on 5/2/07, Rating: -1
RE: Learn to crawl before you can walk
By Martimus on 5/2/2007 3:08:31 PM , Rating: 2
What amazes me is as AMD is catching up to Intel (gradual increase in market share despite recent events...

According to this article, AMD lost over 20% of their marketshare last quarter.

RE: Learn to crawl before you can walk
By mrkun on 5/3/2007 12:07:06 AM , Rating: 1
Umm, it just says Intel pushed them back under 20% total marketshare.

RE: Learn to crawl before you can walk
By osalcido on 5/3/2007 1:45:07 AM , Rating: 4
Obviously math is not your strong suit.

If AMD's total marketshare was just over 25% for Q4 '06 and they're down to below 20% for Q1 '07, that's a 20% loss of total marketshare.

By Justin Case on 5/6/2007 11:12:31 PM , Rating: 2
It's kind of pointless to compare two quarters. Let's say AMD's deal with Dell for 1H 2007 was sealed on Q4 2006. That means a huge number of CPUs "sold" on Q4 2006, and far less on Q1 2007. But in fact they are still supplying those CPUs.

If you want to have an idea of a company's real market share, you need to take into account longer periods (ex., 1 year), not individual quarters.

Some people like to give their "articles" (or blog entries) very sensationalistic titles but, in a business like this, a 20% difference in sales between two consecutive quarters means very little (one way or the other) in the grand scheme of things.

RE: Learn to crawl before you can walk
By Mitch101 on 5/2/2007 9:56:40 AM , Rating: 3
From what I have heard on the manufacturing side, the shrink doesn't seem to be a problem for any of the fab companies.

It seems the process and materials that work at 65nm are the same at 45nm, with little to no redesign needed. Call them all lucky, as this is how Intel is able to go to 45nm so fast. Almost a free shrink. Overclocks are only slightly higher, roughly 10% more than existing chips. I'm not an engineer, so I can't comment on why. That doesn't mean they won't get faster with time, but out of the gate expect 10% more than existing chips before exotic cooling is applied, and lower temps to start.

However, at 32nm the engineering teams need to come up with some new tech, but from what I am hearing, vertical chip layering might be more important here than shrinks, and AMD partnering with IBM could provide a small advantage for AMD, should they be able to hang in there that long. If vertical chips become the next gen, then Intel might have to play catch-up to IBM/AMD. Not everything has to be resolved by shrinking again if layering can be done effectively.

On another front, the K10 is faster than Penryn in non-SSE4 benchmarks. Go ahead, ask me where I heard this. It's an Intel leak, not an AMD leak. Intel knows this, and that is why they are throwing out SSE4 benchmarks early, but on non-SSE4 benchmarks K10 is faster than Penryn. You heard me. As for why AMD is tight-lipped: it needs to liquidate the excess inventory Dell screwed them with. If AMD pre-announces, then AMD will end up holding excessive inventory of chips they might have a hard time unloading without losing a ton more money. Where the chips really shine is on the multi-CPU server side.

So let's say AMD announces and releases benchmarks; then their current K8 chips still in the channel stop selling altogether. AMD doesn't have the inventory yet to do a full-on release, and the channel trickles in because no one wants current gen, they only want next-gen K10. They will find themselves sitting on a ton of previous gen and barely able to keep up with the channel. Thank you, Dell! Scumbags. So AMD chose to wait until they have a ton of chips in stock to flood the channel and hopefully eliminate the majority of the K8 stock that Dell screwed them on.

As for AMD acquiring ATI being a bad move: doing this allows AMD to certify motherboards for server production, something which would take them a very long time if they didn't own or make a motherboard chipset. NVIDIA was too expensive for AMD; ATI was the right price, despite the R600 not being an NVIDIA killer. AMD will have server-certified chipsets available immediately. This is more important than graphics, because the server market is one of the first markets AMD received most of its profits from. From a gamer's point of view it was a mistake; from a server perspective it was the right choice.

Let's talk R600. It's not a total disaster, as it can certainly compete with the mid-range line, which is where the majority of chips sell, and AMD has already said they will be competitively priced. How can AMD do this? Well, they own chip foundries, and while the R600 will be made off-site, they can certainly make chips in-house cheaper than NVIDIA, giving them a price advantage that NVIDIA can't match because NVIDIA doesn't own chip foundries. As well, ATI's DX10 chipset is DX10 compliant with DRM (puke) and physics capability, which has yet to be exploited. FRAMERATE DOES NOT MEAN SUPERIOR. If it's over 60fps at any resolution, the game plays excellently. If one does 80fps and the next does 60fps, it doesn't matter, because your monitor's refresh rate is still probably 60Hz, as on most LCD panels. Framerates are becoming a thing of the past, and what will make the eventual difference is EYE CANDY and PHYSICS.

Let us not forget ATI makes chips for consoles, and this will be an ongoing business in the future as well. ATI is not a total bust.

Lastly, I will leave you with one last rumor: that at the heart of an Xbox portable gaming machine might beat an ATI chip. ;)

RE: Learn to crawl before you can walk
By mikecel79 on 5/2/2007 11:52:16 AM , Rating: 1
AMD will have server certified chipsets available immediately.

What "server certified chipset" did ATI ever produce? ATI and nvidia may make fine desktop chipsets but they do not belong in any server environment. The only chipsets I would ever trust in a real server environment are made by Intel, Serverworks (Broadcom), or IBM.

Even if ATI were to be building a server chipset it would not be available immediately. It would take months to be certified before any OEM would touch it.

RE: Learn to crawl before you can walk
By Mitch101 on 5/2/2007 12:58:31 PM , Rating: 2
And that process will continue with the addition of ATI providing certified chips as well.

Think about your statement. You trust an Intel CPU with an Intel chipset, but you don't trust AMD with an AMD/ATI chipset?

The R690 is already Vista WHQL certified.
AMD/ATI chipsets will be server certified.

There are over 30 motherboard companies already signed up to use the AMD/ATI chipsets.

It's no different than Intel creating server chipsets for Intel CPUs.

AMD is doing the same and will certify their chipsets, which will cut down on the time for server certification.

From what I hear, a 2.5GHz K10 quad is about 33% faster than Intel's 3.0GHz quad-core. This was in a benchmark the Intel chip usually wins. Now I wonder why Intel is quickly moving up its 45nm processors and releasing SSE4 benchmarks? Hmmm. 4x4 anyone?

There is a market for something that fast. Expect server demand to be high and chipsets to be certified fast.

RE: Learn to crawl before you can walk
By mikecel79 on 5/2/2007 1:40:12 PM , Rating: 2
Think about your statement. You trust an Intel CPU with an Intel chipset, but you don't trust AMD with an AMD/ATI chipset?

I never said I trusted Intel CPUs on only an Intel chipset. We have many servers that run Intel CPUs on Broadcom, Intel, and IBM chipsets. I stated that ATI has no experience making SERVER chipsets. Why should I trust them to make a reliable server chipset?

The R690 is already Vista WHQL certified.
AMD/ATI chipsets will be server certified.

Vista WHQL doesn't mean anything in the server space. All it means is that the drivers function correctly. Vista is not a server OS. There could still be underlying problems with the chipset. High-end server use is very different from a single person using a desktop.

There are over 30 motherboard companies already signed up to use the AMD/ATI chipsets.

Wow that's great. How many of them are creating motherboards for HP, Dell, IBM or Sun servers utilizing chipsets that were designed by ATI? Zero.

It's no different than Intel creating server chipsets for Intel CPUs.

Actually, it is. Intel has been doing this for over a decade now. ATI has only been designing chipsets for a few years. AMD (not ATI) has made server chipsets in the past that have been reliable, but ATI has NO experience in the server space. AMD did NOT purchase ATI for its server chipset experience.

From what I hear, a 2.5GHz K10 quad is about 33% faster than Intel's 3.0GHz quad-core. This was in a benchmark the Intel chip usually wins. Now I wonder why Intel is quickly moving up its 45nm processors and releasing SSE4 benchmarks? Hmmm. 4x4 anyone?

How about a link?

By Mitch101 on 5/2/2007 2:05:06 PM , Rating: 2
As if links on the internet were what separates truth from fiction. Google it; there are a few around.

Obviously nothing I say is going to get a positive remark from you or the Intel crew, judging by the downward votes I am seeing on my comments, so I can only suggest you wait for it.

This is my last post about this, but all I will say is Intel is going to get smacked six ways to Sunday.

Like the way the Opteron smacked the P4, and Conroe smacked the Opteron back, the K10 is going to smack the Xeon/Conroe. That's not fanboyism, that's fact, but you will know that soon enough.

By ObscureCaucasian on 5/2/2007 7:51:47 PM , Rating: 2
This is probably what he was talking about.... it isn't what I'd call the most reliable information, but it's about all we got.

RE: Learn to crawl before you can walk
By defter on 5/2/2007 1:12:21 PM , Rating: 2
You seem to forget that a large part, if not the majority, of Opteron servers are using NVidia chipsets.

You probably haven't heard about this:

By mikecel79 on 5/2/2007 2:16:49 PM , Rating: 2
Actually most of the more popular modern Opteron servers are now using the Broadcom (Serverworks) chipsets.

HP DL365 - No information found
HP DL385 G2 - Serverworks HT-2100 and HT-1000
HP DL585 G2 - nVidia NForce Professional 2200 and 2050 chipsets, and AMD-8132 chipset

IBM x3455 - Serverworks HT-2100 and HT-1000
IBM x3655 - Serverworks HT-2100 and HT-1000
IBM X3755 - Serverworks HT-2100 and HT-1000

Dell PowerEdge 6950 - Serverworks HT-2100 and HT-1000
Dell PowerEdge 2970 - Serverworks HT-2100 and HT-1000
Dell PowerEdge SC1435 - Serverworks HT-2100 and HT-1000

Sun Fire X2100 M2 - nVidia (not sure which one)
Sun Fire X4200 - AMD 8000

By raven3x7 on 5/2/2007 4:51:17 PM , Rating: 2
NVidia is probably the largest provider of workstation chipsets on the AMD platform.

RE: Learn to crawl before you can walk
By ybee on 5/2/2007 12:19:34 PM , Rating: 2
It seems the process and materials that work at 65nm are the same at 45nm, with little to no redesign needed.

Wait, Intel will use high-k/metal gate in its 45nm process. Isn't it supposed to be a big change?

On the other hand, they are not switching to EUV lithography yet, so in this respect the transition is indeed going to be relatively easy. Is this what you mean?

By Mitch101 on 5/2/2007 12:39:25 PM , Rating: 2
Well put. As I said, I am not an engineer; I can only reiterate what someone said about the process of going from 65nm to 45nm requiring minimal changes.

By Phynaz on 5/2/2007 12:21:04 PM , Rating: 1
It seems the process and materials that work at 65nm are the same at 45nm, with little to no redesign needed. Call them all lucky, as this is how Intel is able to go to 45nm so fast.

Intel is doing a complete transistor material change at 45nm.

RE: Learn to crawl before you can walk
By whickywhickyjim on 5/2/2007 12:39:52 PM , Rating: 2
They will find themselves sitting on a ton of previous gen and barely able to keep up with the channel.

That's already happening. Nobody wants the bargain bin shite they're pushing on the market now. If they had only made new 939 chips on the 65nm or even 90nm process, their Q1 sales wouldn't be in the toilet.

By Dactyl on 5/2/2007 1:42:25 PM , Rating: 2
I'm going to upgrade my S939 CPU from 2.4GHz single core to 2.0GHz dual core in the next month. I wouldn't be doing this except for the price drop.

It would be nice if I could upgrade to a dual core K10 in this socket. There's no reason dual channel DDR1 couldn't feed a K10 dual core.

RE: Learn to crawl before you can walk
By FITCamaro on 5/2/07, Rating: -1
By kilkennycat on 5/2/2007 1:39:29 PM , Rating: 2
ATI does not own the Xbox360 chips. They licensed the complete design to Microsoft who handle the complete interface with TSMC. Unlike the original XBox design with nVidia, where nVidia supplied the chips. I believe ATi also gets a royalty fee for each Xbox360 sold. ATi also may get additional design fees for any required participation in a die-shrink. As for the arrangement between nVidia and Sony, I have no details, but since Sony is a perfectly competent hardware manufacturer, I suspect that the arrangement with nVidia was again design fees plus per-shipment royalties. Simplest for all parties involved.

By ObscureCaucasian on 5/2/2007 7:59:37 PM , Rating: 2
ATI has the Wii and 360; nVidia has the PS3. ATI has numbers 1 and 2, nVidia has #3. All the chips are manufactured by other parties, but they still get royalties. The GPU in the original Xbox was the only chip a console manufacturer ever bought outright (rather than licensing the design and contracting the fab itself). I guess nVidia started screwing MS on pricing too, because they wouldn't lower their price as technology progressed.

Basically, ATI is in a much better position console-wise.

AM2+ and AM3
By AnnihilatorX on 5/2/2007 7:56:38 AM , Rating: 3
So is AM2+ socket forward compatible?
And is AM3 socket backward compatible or not?

I seriously don't want another change of socket

RE: AM2+ and AM3
By rqle on 5/2/2007 8:05:05 AM , Rating: 2
AM2+ means new CPUs with DDR2 support. That usually means limited mainboard support.

AM3 is a DDR3 controller altogether. It seems they didn't want to mix up DDR2 and DDR3.

RE: AM2+ and AM3
By AnnihilatorX on 5/2/2007 8:16:04 AM , Rating: 2
Since the memory controller is on CPU now, I see no reason why a socket design cannot include a selector pin which determines the memory type through the motherboard and retain compatibility that way.

RE: AM2+ and AM3
By AnnihilatorX on 5/2/2007 8:19:52 AM , Rating: 2
The link "AM2 and AM3 would be forward/backward compatible" in the article implied AM2+ is AM3, and that AM2+ can support both DDR2 and DDR3.

What happened to that now, and why a new socket again in place of AM2+?

RE: AM2+ and AM3
By Furen on 5/2/2007 9:41:59 AM , Rating: 2
I believe it can support DDR2 and DDR3 CPUs but the memory support depends, obviously, on the DIMM sockets on the motherboard. The differences between DDR2 and DDR3 are not big enough to warrant completely different memory controllers, so memory controllers that can drive both DDR2 and DDR3 are possible.

RE: AM2+ and AM3
By Martimus on 5/2/2007 2:17:35 PM , Rating: 2
AM2+ supports HT 3.0, while AM3 supports DDR3.

RE: AM2+ and AM3
By Targon on 5/2/2007 8:38:09 AM , Rating: 2
AM2+, like the current AM2-based processors, supports DDR2. The differences in sockets are based on HyperTransport 3 and allowing each core to run at a different power state. Socket AM2+ will support both AM2 and AM2+ processors, so I expect a quick transition to socket AM2+ by motherboard manufacturers, since there is no downside.

When it comes to AM3, I suspect that the reason for the change is like the change from socket 939 to AM2, a different socket for a different type of memory supported by the processor. While I suspect that AMD COULD support both types of memory on the same processor, there would be a LOT of complaints if people with an AM2+ processor tried to use their chip in an AM3 motherboard and then found that it wouldn't work. That being the case, AMD would NEED to support DDR3 memory on AM2+ based processors just to avoid those sorts of complaints.

It might be nice if motherboard manufacturers would put in multiple banks of memory, some at DDR2 and some at DDR3, which might allow for this, but there hasn't been a motherboard with support for multiple memory types since the transition from 30 pin memory modules to 72 pin.

RE: AM2+ and AM3
By Rookierookie on 5/2/2007 9:17:43 AM , Rating: 2
but there hasn't been a motherboard with support for multiple memory types since the transition from 30 pin memory modules to 72 pin.

I was under the impression that many i915 boards supported both DDR and DDR2, and that many of the upcoming P35 boards will support both DDR2 and DDR3 as well.

It's a different story with AMD's integrated memory controller of course, but the problem doesn't really rest with the motherboard manufacturer.

RE: AM2+ and AM3
By MartinT on 5/2/2007 9:26:44 AM , Rating: 2
I was under the impression that many i915 boards supported both DDR and DDR2, and that many of the upcoming P35 boards will support both DDR2 and DDR3 as well.

Exactly. Another example, some people may remember it still, is the (in)famous ECS K7S5A board, supporting both SDR and DDR SDRAM back in the K7 era.

RE: AM2+ and AM3
By bubbacub616 on 5/2/2007 12:39:10 PM , Rating: 2
i loved that board - till it died!

RE: AM2+ and AM3
By darkpaw on 5/2/2007 9:20:09 AM , Rating: 2
Um, not true on the motherboard comment. There were plenty of boards that supported both SDR and DDR during that transition. AMD's use of the internal memory controller prevented that strategy with the conversion to DDR2, but there are plenty of Intel boards that support both DDR and DDR2 (usually 1 bank of each in an either/or configuration).

RE: AM2+ and AM3
By Hawkido on 5/2/2007 4:52:28 PM , Rating: 1
Oops, not true!

The use of the internal memory controller on the AMD CPU is optional. If there is a memory controller on the MB, it can override the IMC on the chip. There were some NVIDIA MBs that did so. I think it lost out in popularity because of the increased cost of the IMC in the CPU plus the increased cost of a MC on the MB. As for there being no boards that supported it, I'll take your word on it. But the IMC wasn't preventing it, just making it cost-prohibitive. The newer AMD chips may not give you the option anymore, but one of the selling points of the IMC, as listed by AMD, was that its use was optional, provided your MB had a MC on it.

RE: AM2+ and AM3
By darkpaw on 5/2/2007 5:02:28 PM , Rating: 2
That I did not know, thanks. Doesn't seem to matter if no one supports the option, but it is an interesting piece of information.

RE: AM2+ and AM3
By coldpower27 on 5/2/2007 10:24:57 PM , Rating: 3
I think that would kinda defeat the purpose of having the IMC on AMD's processors.

Could you provide a link to when this occurred? I have no recollection of anywhere in the entire history of the K8 where an off-board MC was used.

RE: AM2+ and AM3
By perrywilson78 on 5/2/2007 9:27:07 AM , Rating: 2
there hasn't been a motherboard with support for multiple memory types since the transition from 30 pin memory modules to 72 pin.

My wife's computer has a VIA KT266 chipset and supports PC133 or DDR266, and it's only 4 years old. Every switch to a new memory type has had a motherboard that supported both the old and the new.

RE: AM2+ and AM3
By Visual on 5/3/2007 4:53:40 AM , Rating: 1
Only four years old? Yeah right... that chipset came out more than 6 years ago.
Oh, you mean the computer, not the chipset... well, it was already ancient when you bought it then.

RE: AM2+ and AM3
By raven3x7 on 5/2/2007 5:00:24 PM , Rating: 2
According to preliminary info from AMD, AM3 processors will be compatible with AM2/AM2+, and the memory controller on those chips will support both DDR2 and DDR3. Which one is utilized will depend on the motherboard used.

RE: AM2+ and AM3
By CrystalBay on 5/2/2007 6:33:50 PM , Rating: 2
Don't worry AssRock will be to the rescue somehow

RE: AM2+ and AM3
By dfifo on 5/3/2007 1:35:40 AM , Rating: 3
Don't worry AssRock will be to the rescue somehow

I have to tip my hat to ASRock just for supporting 3 CPU types on one motherboard. I don't think that's ever happened before. The last upgradeable socket I remember was using a slotket in a Pentium Slot 1 board to adapt to Socket 370. Asus had a Socket 478-to-479 adapter, but that was more of a sidegrade than an upgrade.

For those who don't know, the ASRock 1689 is a Socket 754 board that, through the M2 upgrade socket, also supports 939, AM2 and DDR2 RAM. Granted, the daughterboards are 30 bucks a pop, but that's still a lot of mileage for one motherboard.

I wish $200+ motherboards had this kind of flexibility. How nice would it be for something like the P5B Deluxe WiFi to have that kind of upgradeability?

RE: AM2+ and AM3
By encryptkeeper on 5/2/2007 9:58:08 AM , Rating: 2
So what does this article mean? DT is great, but I hate vague articles. Can AM3 processors be used in AM2 motherboards? Or is AMD planning for AM3 to be a whole new socket, or possibly a 1207-pin socket like the 4x4 platform?

RE: AM2+ and AM3
By bigpow on 5/9/2007 10:15:25 AM , Rating: 2
I've been using AMD for the last 5 years and I'm coming back to Intel.

I'll go for an Intel C2D -
AMD doesn't have a smooth upgrade path that lets me keep the same mobo & RAM. Why would I go AM2? C2D is cheaper, overclocks better, is faster, etc.

-It's sad.
It feels like AMD's attitude today is like Intel's 5 years ago; they just keep coming up with new sockets, platforms, etc.
This AM3-with-DDR3 push feels very familiar. Remember Intel's Hyper-Threading?

-It's all about marketing
Five years ago, AMD was eating Intel's leftovers off the floor, trying to capture as much market share as possible by maintaining backward compatibility with older CPU sockets and RAM. It was GREAT!
I wonder if the old (& successful) AMD marketing team left and joined Intel instead.

-A crazy idea
I'm no expert, but I believe AMD should just make a cheap chipset that allows multiple (dual-core) processors using the same old Socket 939 + DDR and/or AM2 + DDR2.
That would surely put a dent in Intel's armor.

Vote of no confidence
By Rookierookie on 5/2/2007 7:57:04 AM , Rating: 1
How many would be willing to bet against a delay into 2009?

RE: Vote of no confidence
By darkpaw on 5/2/07, Rating: 0
RE: Vote of no confidence
By defter on 5/2/2007 10:08:06 AM , Rating: 3
In my opinion, the most important question is: when will AMD's 45nm parts reach higher clock speeds than their fastest 65nm parts?

For example, AMD's 65nm parts have been available since December, and they are still running below the clock speed of the fastest 90nm parts (3GHz)...

RE: Vote of no confidence
By Master Kenobi on 5/2/2007 11:19:16 AM , Rating: 2
A valid point. It is unlikely AMD will make a 45nm launch in 2008; 1H'09 is much more realistic.

Also, to the clock speed comment, which is another valid point: clock speeds come with mastery of the process node and scaling the chips to take advantage of it. This takes time, and AMD's 65nm process is still in its infancy, while Intel's 65nm process is an adult; it has been in mass production for over a year now. Given the timetables, and provided Intel can stick to its roadmap (so far they have beaten it), they should be gearing up for the 32nm node around the same time AMD is moving 45nm chips in volume.

RE: Vote of no confidence
By ybee on 5/2/2007 12:22:14 PM , Rating: 2
Maybe never. Increasing process variation at smaller feature sizes makes it more and more difficult to reach higher clocks.

RE: Vote of no confidence
By melgross on 5/3/2007 6:12:43 PM , Rating: 2
That's not true. It was leakage and heat-dissipation issues that led to that.

Intel's latest chips will be running at at least 3.2GHz at the highest speeds upon release. They have already shown Penryn running at 3.33GHz. It's just a matter of time before speeds rise above the highest Pentium 4 speeds. Perhaps late in 2008 we will see that.

RE: Vote of no confidence
By coldpower27 on 5/2/2007 10:22:53 PM , Rating: 1
We will have to see. Given AMD's track record so far, higher speeds tend to arrive at least halfway through a process node, so I doubt you will see higher speeds from AMD's 45nm process till early 2009.

It isn't a terrible thing to use the newer process as a cost-savings measure for the time being, though.

RE: Vote of no confidence
By raven3x7 on 5/2/2007 5:06:15 PM , Rating: 2
Exactly how did AMD slip with 65nm? They started production exactly when they had planned to. Whether that was later than it should have been is a different matter. I fully expect the first AMD 45nm CPUs to appear by Q3 2008 at the latest.

RE: Vote of no confidence
By Phynaz on 5/2/07, Rating: 0
RE: Vote of no confidence
By Andrwken on 5/2/2007 1:00:47 PM , Rating: 2
lol. Very funny.

Too little, too late
By 13Gigatons on 5/2/2007 9:02:29 AM , Rating: 1
2008 is just too late for AM3 to show up. It needs to be out in 2007 with backward compatibility with Socket AM2, DDR2 and DDR3. That's a tall order for a company that is fighting off bankruptcy.

I hope AMD can survive but at this point I think it would be better if someone bought them.

RE: Too little, too late
By Regs on 5/2/2007 10:19:24 AM , Rating: 2
I'm more worried about what AMD plans to offer desktop users at 45nm. AMD tried for a long time to win deals with Dell and other retailers and OEMs, so I hope for their sake they do more than just add server technology to it.

Look at how the K8 went from single-channel to dual-channel, 130nm to 90nm, and then single-core to dual-core: a 2-5% difference between all of them for gamers and most other applications. Though media apps, I admit, got a very good boost over time.

RE: Too little, too late
By Dactyl on 5/2/2007 1:46:08 PM , Rating: 2
AM3 is only as useful as DDR3 is.

Right now, DDR2 modules are superior. They can hit almost the same clock speeds (bandwidth) with much better timings (latency).

DDR3 won't become better than DDR2 until 2008, and you won't see significant gains from DDR3 before the middle of 2008.
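The latency point can be made concrete with a quick calculation. A minimal sketch, where the module speeds and CAS values are illustrative examples of typical 2007-era parts, not quoted from the article:

```python
# Absolute CAS latency in nanoseconds:
#   latency_ns = CAS cycles / I/O clock (MHz) * 1000
# For DDR memory the I/O clock is half the effective data rate.

def cas_latency_ns(data_rate_mt_s: float, cas_cycles: int) -> float:
    """Absolute CAS latency for a DDR-family module."""
    io_clock_mhz = data_rate_mt_s / 2  # DDR transfers twice per clock
    return cas_cycles / io_clock_mhz * 1000

# Illustrative parts: early DDR3 ran looser timings than mature DDR2.
ddr2 = cas_latency_ns(800, 4)    # DDR2-800 CL4  -> 10.0 ns
ddr3 = cas_latency_ns(1066, 7)   # DDR3-1066 CL7 -> ~13.1 ns
print(f"DDR2-800 CL4:  {ddr2:.1f} ns")
print(f"DDR3-1066 CL7: {ddr3:.1f} ns")
```

So despite the higher data rate, the early DDR3 module in this example has roughly 30% worse absolute latency, which is the comparison being made above.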

RE: Too little, too late
By coldpower27 on 5/2/2007 10:21:07 PM , Rating: 2
In terms of performance, yes. The good thing about DDR3 is reduced power consumption, thanks to its power-saving features plus the lower voltage.

RE: Too little, too late
By Dactyl on 5/3/2007 12:59:52 AM , Rating: 3
If you're an enthusiast, you're going to want a 2900 or 8800 in your machine and OC your processor by at least 100MHz. Saving a couple of watts on your desktop's RAM is a big yawner. Nobody is going to pay the price premium to get Bearlake within 6 months of its launch just to save a few watts on their RAM.

AMD Barcelona all that great??
By Logikal on 5/2/2007 10:42:00 PM , Rating: 2
Can someone do me a favor and show me a link to Barcelona benchmarks?

RE: AMD Barcelona all that great??
By Dactyl on 5/3/2007 1:02:40 AM , Rating: 2
LOLZ. I like deadpan humor the best. You are joking, right?

There is only one Barcelona benchmark, put out by AMD: Barcelona is 40-50% faster than Kentsfield in SPECfp, a floating-point-heavy benchmark. Barcelona's main strength is its ability to do floating-point calculations.

Other than that, it might be slower than Kentsfield at everything else. It's impossible to say until AMD releases more benchmarks.

RE: AMD Barcelona all that great??
By Shintai on 5/3/2007 4:21:40 AM , Rating: 2
SPECfp_rate is more a memory benchmark than an FP benchmark...

By Dactyl on 5/3/2007 8:44:23 PM , Rating: 1
Thanks. That makes perfect sense.

It explains why, moving from Core 2 Duo to Core 2 Quad, Intel sees little improvement on that benchmark. Adding two cores doesn't change the fact that there's just one memory controller being fed by FB-DIMMs (which together have great bandwidth but poor latency).

AMD's on-die memory controller is really nice. But if that's the only thing Barcelona has going for it...

By MartinT on 5/2/2007 9:23:42 AM , Rating: 1
I believe the chances of AMD having 45nm products available in 2008 are slim to nonexistent. Heck, AMD hasn't even gotten its 65nm process to the point where it can produce its high-end SKUs on it.

45nm has already slipped considerably, from early 2008, as initially claimed, to H2 2008, and it will continue to slip for the next 18+ months. And no surprise there: how are they going to hit their goals when they have to reduce CapEx and are completely reliant on IBM to do the dirty work?

But hey, maybe they'll pull another phantom launch - like they did with 65nm processors in Dec 2006.

By coldpower27 on 5/2/2007 10:19:46 PM , Rating: 2
I had always heard mid-2008, just as I heard mid-2007 for the desktop and server K8L/K10 parts back in May 2006.

There's been a minor slip for the desktop, but Q3, depending on when in the quarter it lands, could still be the middle of the year.

Q3 2008 would be the tail end of mid-2008, but that is to be expected when companies give vague guidance like "mid-year."

That would still mean AMD is at minimum three quarters behind Intel: Q4 2007 is the time for Penryn derivatives on the 45nm node.

By nerdye on 5/3/2007 11:37:53 PM , Rating: 2
Yes, Penryn is coming with a 1333MHz FSB, yet I'm running over that right now at a 1400MHz FSB on my E6300 on a DS3 board (1400/4 = 350; 350 * 7 = 2450MHz). And of course, Penryn will be a beast of an overclocker. If hope comes through, AMD's newly updated integrated memory controller on the K10, plus better branch prediction and efficiency, will compete; man, I hope so, because AMD hasn't quoted very high clock speeds for its new chips. A 2.3GHz native quad-core K10 may be 30% more efficient than Conroe, but Penryn cranking its clocks well above 3.0GHz with a much faster FSB than before is quite an opponent. Only time will tell. One thing's for sure, though: AMD CPUs and AMD-ATI GPUs competing well is only a benefit for us consumers.
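The clock math in that comment can be sketched out explicitly. This is just the standard Core 2 relationship (core clock = base clock × multiplier, with the FSB quad-pumped); the E6300 settings are the ones the commenter describes:

```python
# Core 2 CPUs derive their clock as: core = (effective FSB / 4) * multiplier.
# The "effective" FSB is quad-pumped: 4 transfers per base clock.

def core_clock_mhz(fsb_effective_mhz: float, multiplier: float) -> float:
    """Core clock from the quad-pumped (effective) FSB and the CPU multiplier."""
    base_clock = fsb_effective_mhz / 4
    return base_clock * multiplier

print(core_clock_mhz(1400, 7))   # 2450.0 -> the overclock described above
print(core_clock_mhz(1066, 7))   # 1865.5 -> roughly a stock E6300 (1.86GHz)
```

This is why FSB overclocking scales the core clock linearly: with the multiplier locked at 7, raising the effective FSB from 1066 to 1400MHz takes the chip from about 1.87 to 2.45GHz.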

By fake01 on 5/6/2007 6:55:21 AM , Rating: 1
Umm, I'm pretty sure I read that Penryn will be up to 40% faster than Intel's current fastest processor, and that Barcelona (K10) will be "over" 50% faster, making it "over" 10% faster than Penryn. Not to mention that Agena (the new name for AMD's desktop processors), releasing later this year, was overclocked from 1.9GHz to 3.05GHz without any issues. So I think AMD might take back the lead for a while.

By jp7189 on 5/2/2007 4:38:30 PM , Rating: 2
IBM has had 45nm for quite a while now.

By raven3x7 on 5/2/2007 5:09:35 PM , Rating: 2
BS. AMD had working SRAM 3 months after Intel

By kilkennycat on 5/2/2007 6:30:22 PM , Rating: 2
Thanks for the link. Given the 3-month delay with respect to Intel's SRAM announcement, I expect an announcement of a fully working alpha-phase processor on IBM's 45nm process any day now......

Any links to an announcement from IBM that their April 2006 45nm test SRAM design also implemented new materials and technology to counter transistor leakage? No doubt the Intel SRAM was already using hafnium-gate transistor structures, considering the short time it took to go from the SRAM design to a fully functional processor.

By lamestlamer on 5/3/2007 4:04:34 AM , Rating: 6
Intel's 45nm process node, slated for introduction later this year, uses conventional CMOS process technology.

Conventional CMOS technology uses metal-oxide-semiconductor field-effect transistors (MOSFETs) for logic operations. As process technology shrank, it became necessary to ditch the metal gate and use a purely doped silicon/silicon-oxide transistor instead. SOI simply replaces the bulk silicon crystal used for fabbing with a very thin layer of crystalline silicon supported by an insulating layer. Strictly speaking, neither SOI nor non-SOI modern processes are "CMOS" in the original metal-gate sense, but they still use FETs, unlike TTL or other logic families.

Same old comparisons...
By hoyanf on 5/5/2007 8:14:34 AM , Rating: 2
From what I have been reading in Intel vs. AMD comparisons, I have yet to see anyone ask why you can't upgrade from a Celeron to a Xeon without changing the mainboard... I took my Socket 939 from a mere 3000+ to an Opteron 165. I think most of you have lots of cash to spare and love changing mainboards processor by processor...

I have lost count of how many chipsets Intel has changed since the first release of the P4, not to mention the changes in RAM: from RAMBUS, which was good, to SDRAM, then up to DDR, then DDR2, then within a year to DDR3... hahaha. I've only used SDRAM and DDR to date... I don't think I want to use DDR2 when it's going to be short-lived next to DDR3... Whatever happened to FB-DIMM??? Another RAMBUS-style dead end???

What I can say is that by sticking with AMD I have saved so much that I can have 4 PCs, 2 of which have been converted to file/web/dns/dhcp/ldap/kerberos/nfs servers... I have yet to regret staying with AMD... The best thing is that they stick with what they start, supporting old boards longer than Intel does...

You guys should really think about which is more important: a faster CPU, or getting things done with less upgrade cost. Same as the bloated *ista... I'd prefer to spend the extra on multi-GPU, more RAM and additional HDDs, not just CPU upgrades...

Fact is still fact: what's the use of multicore if the OS doesn't know how to make full use of it??? Why do you need multicore just to be multithreaded and multitasking??? Is it a problem with the OS??? What does the OS do with the extra processing capability, just 3D GUI effects???

By flipsu5 on 5/16/2007 3:11:00 AM , Rating: 2
I think AMD always spends at least a few months studying Intel's process technology after its release, then applies what it learns to developing or fine-tuning its own technology, since Intel is always first to get a new x86 process out there.

This time (45nm) it may take more time, since Intel is departing significantly from its own previous 65nm technology, with new materials being used in the gates. It is taking Intel longer to release 45nm than it took to release 65nm.

By excrucio on 6/1/2007 11:12:32 PM , Rating: 2
The AMD 6000+ X2 takes on the E6600.
The AMD FX-72 takes on the puppy QX6800.

AMD had higher prices when the C2D came out, and C2D captured customers' minds too fast.

AMD was never behind in performance; they are, yes, behind in technology improvements such as 65nm and 45nm and the coming Intel 32nm.

But that's not a big deal; AMD pushes its processors to the max, improving the living crap out of those chips.

If you don't believe me, check Tom's Hardware's CPU charts.

The QX6800 is still a whopping $1K.
The FX-72 is a nice $300 processor.

And the 6000+ X2 can take on the big bang-for-the-buck boy, the E6600, no problem.

Intel just has much more market share and is able to drive its prices down to the ground.

Facts, facts, facts...

AMD processors till the end, baby. Opterons are true performance.
Let the Xeons come; we shall take them all down.
