
Native quad-core en route

Yesterday during AMD's Q2'06 earnings conference call, AMD's President and Chief Operating Officer Dirk Meyer recapped the company's long-term plans.  Although the bulk of his comments had already been stated during the June AMD Analyst Day, Meyer also added the tidbit that the company plans "to demonstrate our next-generation processor core, in a native quad-core implementation, before the end of the year."  Earlier this year, AMD's Executive Vice President Henri Richard claimed this native quad-core processor would be called K8L.

Earlier AMD roadmaps had revealed that production quad-core CPUs would not utilize a native quad-core design until late 2007 or 2008. To put that into perspective, AMD demonstrated the first dual-core Opteron samples in August 2004, with the processor taping out in June 2004.  The official launch of the dual-core Opteron occurred on April 21, 2005.  On the same call, Meyer announced that the native quad-core would launch in the middle of 2007 -- suggesting the non-native quad-core Deerhound designs may come earlier than expected or not at all.

Just this past Wednesday, Intel one-upped AMD's K8L plans by announcing that quad-core Kentsfield and Clovertown will ship this year, ahead of the Q1'07 date originally slated by the company.



Why is everybody killing AMD????
By FirstoneVorlon on 7/22/2006 12:08:11 AM , Rating: 4
So why is it that now, after about 3 years in which AMD's 64-bit architecture destroyed the Pentium IV with a design that is technically still superior to Intel's, everybody wishes to kill AMD??? AMD did something to the market Intel was not doing for AGES!!!!! They reinvented the mainstream microprocessor market; they created a piece of engineering artwork. The AMD64 design is elegant and much more intelligent than Intel's. Of course, Intel's latest seems to be the best thing that ever existed, which I don't believe it is, and AMD will soon prove to us again that it has a better design.

Intel had to resort to TRICKS all the time to make its processors competitive: SSE3, Hyperthreading, gigantic caches, not to mention a lot of cheating in compilers, tests, and applications. I won't even mention Intel's business practices; dishonest is the least I can say about them. AMD64 would have crushed Intel's processors long ago if it wasn't for the people they paid not to sell AMD products, so I don't want to buy stuff from a company that resorts to those methods instead of moving that money into engineering and developing capable products.

The Pentium IV was a flop from the start; Intel's new processor only corrects that. Where's HyperTransport??? Where's the on-die memory controller and so forth??? I see a HUGE cache; now let's move to applications that don't use cache very well. Intel took 3 years to play catch-up with AMD, but AMD did not sit around. Their timing is still a little behind for many reasons, but in less than a year they will demonstrate the K8L, which will surely kick Intel's butt once again. The design is still superior. AMD has given us TRUE DUAL CORE and will give us TRUE QUAD CORE, not "GLUED" processors like Intel's. Stop killing AMD and respect true engineering. If you have doubts about the best processor, go look at Cray's supercomputers; you won't find Intels there.




RE: Why is everybody killing AMD????
By smitty3268 on 7/22/2006 12:35:00 AM , Rating: 2
It was the same after NVIDIA released the 7800 GTX. Everyone was predicting that ATI would go bankrupt within a month and that they would never be able to catch up, blah, blah, blah. Well, it turns out they did catch up, and even recaptured the lead. AMD will be fine as long as they continue working on their next chips. It's not like this is an unusual position for them to be in -- they probably work better as an underdog.


RE: Why is everybody killing AMD????
By dagamer34 on 7/22/2006 1:04:30 AM , Rating: 2
Only problem is that ATi and nVidia's product cycles change once every 6 months compared to Intel and AMD's cycle of 1-2 years.

Oh, and there's this thing called planning for the future. Graphics companies don't have to do that, as they just have to put more transistors in a smaller space, and voila, you've got a better GPU!


RE: Why is everybody killing AMD????
By smitty3268 on 7/22/2006 1:19:22 AM , Rating: 2
quote:
Oh, and there's this thing called planning for the future. Graphics companies don't have to do that, as they just have to put more transistors in a smaller space, and voila, you've got a better GPU!


And how is that any different than on a CPU? I'd say graphics companies have to plan for the future even better because of how much the hardware changes - your basic CPU is still pretty similar to one from several years ago, just with extra cache, better speed, and lots of tricks to get rid of bottlenecks.

quote:
Only problem is that ATi and nVidia's product cycles change once every 6 months compared to Intel and AMD's cycle of 1-2 years.


True, but AMD has a lot more money and resources as well.


RE: Why is everybody killing AMD????
By masher2 (blog) on 7/22/2006 1:36:00 PM , Rating: 3
quote:
> "companies don't have to do that as they just have to put more transistors in a smaller space, and voila, you've got a better GPU!"

> "...And how is that any different than on a CPU?"


Because a GPU runs an innately parallelizable process. Double the transistors on a GPU, and you double the performance -- maybe more, if you make other improvements. But the important point is that, as long as there are fewer pipelines in a GPU than pixels on the screen, there's still a lot of scaling left.

A CPU is different. Single-core, the scaling situation was horrible....which explains monsters like Intel's EE and AMD's FX. Massive increases in transistor counts...but the only place to put them was extra cache, which translated only into minor performance increases.

Multicore chips give CPUs some scaling breathing room. But single-threaded performance is still important, and most desktop applications will never be able to use dozens of cores at once. So long-term, single-threaded performance still needs to improve.


RE: Why is everybody killing AMD????
By Viditor on 7/23/2006 12:31:35 AM , Rating: 2
quote:
most desktop applications will never be able to use dozens of cores at once. So long-term, single-threaded performance still needs to improve


If Intel and AMD have their way, this isn't quite true...
A good example of why not is Intel's "Mitosis" project, which uses speculative threading. For AMD there are only rumours at this point, but I would be shocked if they didn't have their own on-the-fly parallelism for single-threaded apps.


RE: Why is everybody killing AMD????
By Tyler 86 on 7/23/2006 5:48:59 AM , Rating: 2
I can parallelize a very very simple application to insane levels, and if I had a processor with enough cores to run each thread in parallel, it would speed up that simple application.

*Extremely* simple optimizations that I can make in code anywhere can also be made in a compiler, and that is what Intel is aiming at.
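A rough sketch of that kind of trivial split, in Python -- note this only illustrates the decomposition; pure-Python arithmetic under a thread pool won't actually run faster, and you'd need a process pool and truly independent cores for a real speedup:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """Sum one half-open chunk [lo, hi) of the overall range."""
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    """Split sum(range(n)) into `workers` chunks and run them concurrently."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(parallel_sum(1_000_000) == sum(range(1_000_000)))  # True
```

The same shape works with `ProcessPoolExecutor`, and the chunk count can keep growing with the core count -- which is the "insane levels" point above.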

Intel isn't pushing EPIC (the explicitly parallel IA-64 architecture) hard on developers anymore, and is instead working on implicit threading...

Although the EPIC architecture is fantastic, as are IBM's POWER and other natively parallelized processors like nVidia's and ATI's GPUs, they presented a problem at their conception: the original interfaces reflected the low-level assembly.

Now, abstraction has taken root, and we have the 'Pixel Shader 3.0' specification for GPUs, with optimizations applying to every minute function...

GNU/Linux has made huge progress in the area of abstraction, and it's reflected in its application on embedded processors, mainframes, and supercomputers...

However, Linux's goals are quite a bit different from your average graphics-centric development... Low-level optimization is still left solely to the individual developers, and is in no way part of the abstraction.
GCC is good, but it could be much better, and that is obvious from its frequent updates.
The GNU Standard C Library is less frequently updated.
Abstraction optimizations to the standard C libraries and the compiler are key to performance on advancing architectures. With the advent of GCC 4.0, entirely new abstraction capabilities emerged.

The point is, with the progress of abstraction, even 'Hello World' applications will one day use 2 cores efficiently, with a measurable performance advantage over an optimized single-core equivalent... and then 4 cores, and then 8 cores, and so forth...

Microsoft's attempt at abstraction led them toward managed code, asynchronous streams, garbage collection, and JIT (just-in-time) compilation... This resulted in the .NET framework you hear so much about.
Now even 'Managed DirectX' has emerged.

If you're interested in seeing JIT-less abstracted .NET code in action, check out Microsoft's Singularity project.
http://research.microsoft.com/os/singularity/

It has its quirks, but because native code can be compiled in a 'trusted' manner, its performance exceeds that of today's Windows 2003 integrated IIS web server.

Obviously, Singularity's not the best way to go if you're looking for gaming, but it could one day be, just as the Microsoft desktop segment migrated from Windows 9x/ME to Windows NT/2000/XP kernels...

Singularity takes abstraction to an entirely new level, making efficient use of any core, any architecture, and any improvements to come.

Optimizations introduced at the bottom scale all the way up to the top 'Just In Time', and vice-versa - it grows 'Just In Time'.

It gives the perspective on operating systems a new, almost 'organic' approach.


RE: Why is everybody killing AMD????
By Tyler 86 on 7/23/2006 5:57:16 AM , Rating: 2
When I say 'JIT-less', I mean Singularity has a compiled assembly base, but everything on top of it -- 90% to 95% of the entire operating system, even at boot -- is compiled either at or before runtime, core elements being pre-compiled, but compiled to assembly nonetheless, every time...

It's effectively taking the JIT out of a JIT compiler...

... but I guess that's nothing really new, so nevermind.

Sure. It's all JIT.


RE: Why is everybody killing AMD????
By Tyler 86 on 7/23/2006 6:03:28 AM , Rating: 2
quote:
Advances in languages, compilers, and tools open the possibility of significantly improving software. For example, Singularity uses type-safe languages and an abstract instruction set to enable what we call Software Isolated Processes (SIPs). SIPs provide the strong isolation guarantees of OS processes (isolated object space, separate GCs, separate runtimes) without the overhead of hardware-enforced protection domains. In the current Singularity prototype SIPs are extremely cheap; they run in ring 0 in the kernel’s address space.


You're able to effectively run everything you 'trust' as kernel-level ring 0 code. That is as close to the processor as you can get. It boggles the average software developer's mind.


RE: Why is everybody killing AMD????
By Tyler 86 on 7/23/2006 6:13:23 AM , Rating: 2
quote:
Singularity achieves good performance by reinventing the environment in which code executes. In existing systems, safe code is an exotic newcomer who lives in a huge, luxurious home in an elegant, gated community with its own collection of services. Singularity, in contrast, has architected a single world in which everyone can be safe, with performance comparable to the unsafe world of existing systems.


quote:
A key starting point is Singularity processes, which start empty and add features only as required. Modern language runtimes come with huge libraries and expressive, dynamic language features such as reflection. This richness comes at a price. Features such as code access security or reflection incur massive overhead, even when never used.


quote:
A Singularity application specifies which libraries it needs, and the Bartok compiler brings together the code and eliminates unneeded functionality through a process called "tree shaking," which deletes unused classes, methods, and even fields. As a result, a simple C# "Hello World" process in Singularity requires less memory than the equivalent C/C++ program running on most UNIX or Windows® systems. Moreover, Bartok translates from Microsoft® intermediate language (MSIL) into highly optimized x86 code. It performs interprocedural optimization to eliminate redundant run-time safety tests, reducing the cost of language safety.
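The "tree shaking" described in that quote is, at heart, reachability analysis over a call graph; here's a toy sketch (the graph and function names are made up for illustration and have nothing to do with Bartok's actual internal representation):

```python
def tree_shake(call_graph, roots):
    """Return only the functions reachable from the roots; everything
    else can be deleted from the final image (toy 'tree shaking')."""
    keep, stack = set(), list(roots)
    while stack:
        fn = stack.pop()
        if fn not in keep:
            keep.add(fn)
            stack.extend(call_graph.get(fn, ()))
    return keep

# Hypothetical module: main calls greet, which calls format_name;
# log_debug also calls format_name but is itself never called.
graph = {"main": ["greet"], "greet": ["format_name"], "log_debug": ["format_name"]}
print(sorted(tree_shake(graph, ["main"])))  # ['format_name', 'greet', 'main']
```

Bartok applies the same idea at the level of classes, methods, and even fields, which is how a Singularity "Hello World" ends up smaller than its C/C++ counterpart.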


Because the code is an abstract, recompilable element, and the intentions and boundaries are clearly visible to the compiler, it can run natively at ring 0 at full throttle after its initial compilation.

If you've ever played with .NET, you know just how fancy, abstract, intricate, and easy C# is -- and you probably also know how painfully slow it can be due to its 'management'. Even with 'unsafe' tags, things are hairy compared to C++ and native assemblies.
Singularity is effectively nativized C#.


RE: Why is everybody killing AMD????
By Tyler 86 on 7/23/2006 6:17:33 AM , Rating: 2
This is only a glimpse of the future.
Fortunately for desktops, unmanaged, untrusted, abstractable JIT code can still exist -- just not in Singularity, which is intended as a server operating system.

"Yesterday's" JIT applications immediately receive the benefits of "today's" processors: cores, instructions, architectures. It's all in the upgradable compiler and the upgradable libraries it references.


By Tyler 86 on 7/23/2006 6:21:42 AM , Rating: 2
quote:
Aggressive interprocedural optimization is possible because Singularity processes are closed—they do not permit code loading after the process starts executing. This is a dramatic change, since dynamic code loading is a popular, but problematic, mechanism for loading plug-ins. Giving plug-ins access to a program's internals presents serious security and reliability problems [snip]... Dynamic loading frustrates program analysis in compilers or defect-detection tools, which can't see all code that might execute. To be safe, the analysis must be conservative, which precludes many optimizations and dulls the accuracy of defect detection.


http://msdn.microsoft.com/msdnmag/issues/06/06/End...


RE: Why is everybody killing AMD????
By masher2 (blog) on 7/23/2006 10:30:12 AM , Rating: 2
> "A good example of why not is Intel's "Mitosis" project which uses speculative threading..."

"Never" is admittedly too strong a word for any tech subject. I'll substitute "not within the next 25 years" instead.

As for Mitosis, remember that it's still very far over the horizon, as it requires hardware support that doesn't exist yet. Furthermore, the amount of parallelism that can be extracted via Mitosis is rather limited; diminishing returns set in hard on anything over four cores.


By Viditor on 7/23/2006 7:56:07 PM , Rating: 2
quote:
As for Mitosis, remember that its still very far down the horizon, as it requires hardware support that isn't in existence yet

Actually, the hardware can be any multicore system (with some very minor tweaks)...it's really only the compiler that isn't ready yet.
In reality, Mitosis could be out by the end of next year, or it could be many years...it will depend on the compiler team.


RE: Why is everybody killing AMD????
By Viditor on 7/23/2006 8:12:44 PM , Rating: 2
quote:
Furthermore, the amount of parallelism that can be extracted via Mitosis is rather limited. Diminishing returns sets in hard on anything over four cores

Ummm...how could you possibly know about diminishing returns on a system that isn't built yet? In addition, from what I've read on the theory, the more cores you have the BETTER your returns...


RE: Why is everybody killing AMD????
By masher2 (blog) on 7/23/2006 10:45:10 PM , Rating: 2
> "Ummm...how could you possibly know about diminishing returns on a system that isn't built yet? "

The same way one knows the performance of any processor before it's built -- software simulation.

> "the more cores you have the BETTER your returns... "

You don't understand what's meant by diminishing returns. If you add cores, your performance rises... but by an ever-diminishing amount.

The Intel sims showed Mitosis achieving about a 2.5X speedup on a 4-core system, with slightly more than half of that gain due simply to the side effect of the other cores increasing the primary core's cache hits by pre-requesting data. That's pretty good scaling, but at 8 cores the results are less impressive -- about a 3.5X speedup. I didn't see any 16-core sims, but with that type of curve it'd probably work out to just under 4X... which means you're only achieving 25% of theoretical efficiency.
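Those figures happen to track a plain Amdahl's-law curve; a back-of-envelope sketch (the 80% parallel fraction is an assumption fitted to the quoted numbers, not something taken from Intel's paper):

```python
def amdahl(p, n):
    """Amdahl's law: speedup on n cores for a task whose fraction p parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

# an assumed p = 0.8 roughly reproduces the quoted sims
for cores in (4, 8, 16):
    print(f"{cores} cores: {amdahl(0.8, cores):.2f}x")
# 4 cores: 2.50x, 8 cores: 3.33x, 16 cores: 4.00x
```

At p = 0.8 the curve flattens toward 1/(1-p) = 5x no matter how many cores you add, which is exactly the diminishing-returns ceiling being described.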




RE: Why is everybody killing AMD????
By Viditor on 7/24/2006 4:15:36 AM , Rating: 2
quote:
From the same way one knows about the performance of any processor before it's built-- software simulation

How can you do a software simulation when the Mitosis compiler is nowhere near finished? Mitosis is predominantly a software-driven enhancement...
quote:
at 8 cores, the results are less impressive---about a 3.5X speedup

Well, if you're looking at the same data I am (and it sounds like you are), then it's based on an early version of the Mitosis Compiler (Alpha version) from 2005...
Remember that they are still in the "proof of concept" phase for Mitosis, so you shouldn't expect it to look anything like the final product.


By masher2 (blog) on 7/24/2006 10:32:25 AM , Rating: 3
> "How can you do a software simulation when the Mitosis Compiler is nowhere near finished? Mitosis is predominantly a software driven enhancement... "

Software that requires hardware support. Mitosis won't run on current hardware. As for how you sim it, this research paper has the details:

http://portal.acm.org/citation.cfm?doid=1065010.10...

> " it's based on an early version of the Mitosis Compiler (Alpha version) from 2005..."

Not even an "alpha" version... just a research proof of concept. But the point is that their simulations show a definite ceiling on the performance benefits of speculative threading. Of course, you could always postulate a breakthrough in basic theory -- but given what we know today, Mitosis isn't going to utilize more than 4-8 cores for a single-threaded process.




By bfonnes on 7/23/2006 11:59:16 PM , Rating: 2
And when they come out with a chip that can read your mind and know what you want to do before you do it, then it will be even faster, lol...


RE: Why is everybody killing AMD????
By cubby1223 on 7/22/2006 1:03:19 AM , Rating: 2
Umm - you're kidding, right?

Why is everybody killing AMD? We're not killing them. However, this is the first time Intel and AMD have entered a cutthroat price war in which Intel has a triple advantage: the superior processor, lower prices, and a far greater bankroll to support those lower prices.

Sure, many here want to see AMD succeed, but we're not stupid. Fanboys are those who support one company no matter the situation, and guess what, fanboys make up less than 0.01% of the computer-buying market. For the vast majority of people, what matters is the best ratio of speed, power consumption, and price. Face it, that is why most of us buy AMD. But the tables are turning: for the first time ever, Intel will have the speed, power, and price advantage.


RE: Why is everybody killing AMD????
By nerdye on 7/22/2006 1:48:00 AM , Rating: 2
I will always root for the underdog. AMD is fun to root for, and I still use an XP 3200. Yet Conroe is a beast and a testament to Intel's 5-year absence from excellence, and it's a short time before AMD strikes with K8L. Will HyperTransport 3 be enough to counter multiple 1333 MHz front-side buses from Intel in the quad-core market?


RE: Why is everybody killing AMD????
By bob661 on 7/22/2006 2:26:47 AM , Rating: 3
quote:
And guess what, fanboys make up less than 0.01% of the computer-buying market. For the vast majority of people, what matters is the best ratio of speed, power consumption, and price.
And enthusiasts make up 1% of the computer market. J6P doesn't give a rat's ass about which processor is faster, nor do they care about who makes it. All J6P cares about is:
1. How much does it cost?
2. Can I read email and surf the web?
Anything else puts you in the 1% category.


By othercents on 7/22/2006 2:46:00 AM , Rating: 2
quote:
enthusiasts make up 1% of the computer market


This is so true. Dell sells to businesses and end users all the time. They are not selling XPS machines to these people; they are selling low-cost solutions. This is why most of their mid-range computers still use single-core processors and ATI X300 video cards or *shiver* Intel GPUs. For most end users these are perfectly fine machines. Dell didn't even start using dual core in its mid-range machines until April '06.

Processor speed is secondary to cost. Most end users want to know how much it will cost to surf the web. You build a $200 solution and they will buy it. Maybe AMD will start manufacturing web browsing computers to undercut Dell and Intel.

Other


By plewis00 on 7/22/2006 3:16:02 AM , Rating: 2
What a load of crap. Agreed, Intel made some bad moves in their time, but was Pentium 4 crap from the start? It was onto a winner during the Athlon XP days. Tricks?! So making your processors complete ops faster is a trick, is it? News to me; perhaps we should just go back to the old days of simple processors and no optimization at all. All these extensions allow better processing speeds, Hyperthreading gave an advantage in many applications, and caches are used everywhere -- I'd still bet the 2MB cache on the smaller Conroe will do its job well. So by your reckoning AMD's 64-bit extensions (which, I'll add, are less used than SSE et al.) and memory controllers are tricks? Because it would seem that way when a processor (Core 2) without an on-die controller trounces the Athlon 64. No, Conroe doesn't correct Pentium 4; it redresses a lot of things and then some.

You sound just like Sharikou, all mouth and no substance. You know NOTHING about K8L (as do most people, since it's relatively undocumented) and yet you bang on about how great it is. Remember how great Intel said Prescott was on paper, and when it came out it was crap? Maybe K8L won't be, but there is a chance it won't be the saviour you expect. What are you, an AMD shareholder? Conroe isn't a glued processor; it was designed as dual-core from the start, so I don't know where you get 'GLUED' from. Prescott/Cedar Mill cores are 'glued', but then, so are Athlon 64 cores; the only reason they work better is the HT link between them.

As for the best processor, well, for the consumer it's the fastest and cheapest, and that would be Core 2 Duo. If AMD is so fantastic, why are they offering 50% price cuts on their CPUs? Not bad chips by any means, but no longer the best. Maybe they won't be running at a loss, but when was cutting 50% off your sell price good for profits?
I like how you referred to the great conspiracy theory of Intel paying companies not to sell AMD. Care to back that one up, or did you just hear about Dell and decide it was an industry-wide practice?


Quit your crying - you're making me nauseous
By DallasTexas on 7/22/2006 9:27:34 AM , Rating: 2
AMD got their butt kicked. Get over it and quit whining. The better product always wins, and nobody cares about your "gee, pls love AMD anyway" BS. You're an embarrassment.

Netburst? It was probably the most successful product EVER. Selling 500 million of them over 5 years is pretty good to me. Sure, three years later it ran its course, and Intel decided to ride that design for two more years. Guess what, it's what clowns like you wanted -- more megahertz.

Thank Intel for FORCING AMD to innovate when they refused AMD an x86 license if it copied their split-transaction bus. AMD agreed and went their own path to innovation with their own bus. Thank Intel for that. After 25 years of copying Intel, they found out that, gee, copying doesn't get you far. AMD has great designers (they hired 1/3 of them from Intel and acquired the rest from NexGen). Get a clue, junior: Intel is a cornerstone of American technology capability. If it wasn't for Intel, you'd be speaking Chinese right now.


RE: Quit your crying - you're making me nauseous
By Teletran1 on 7/22/2006 11:08:43 AM , Rating: 1
quote:
by DallasTexas on July 22, 2006 at 9:27 AM

AMD got their butt kicked. Get over it and quit whining. The better product always wins, and nobody cares about your "gee, pls love AMD anyway" BS. You're an embarrassment.

Netburst? It was probably the most successful product EVER. Selling 500 million of them over 5 years is pretty good to me. Sure, three years later it ran its course, and Intel decided to ride that design for two more years. Guess what, it's what clowns like you wanted -- more megahertz.


The better product always wins... Netburst was probably the most successful product EVER.

That proves it right there. AMD will be just fine. If Intel could push crappy Netburst (into flames) processors on people for the last couple of years, then AMD can push their now-crappy K8s on people until they get a new competitive product on the shelves. Most people don't know anything about computers, and most of them don't even know what a processor is. Throwing away the Pentium name was a big mistake for Intel. Pentium >>>>>> Intel for brand recognition. Core 2 Duo has zero brand recognition. I mentioned it to 5 people at work, and they had no idea what I was talking about. Mention Pentium and at least some people have a clue.


By masher2 (blog) on 7/22/2006 1:39:08 PM , Rating: 2
> "AMD will be just fine. If Intel can push crappy Netburst (into flames) processors on people for the last couple of years then AMD can push their now-crappy K8s on people..."

Err, while I don't doubt that AMD will survive, your logic is flawed. Intel's "crappy" Netburst not only had the best brand name, marketing, sales, and distribution network, but it also did win the occasional benchmark. You can infer nothing about AMD's future from Intel's past.


RE: Quit your crying - you're making me nauseous
By Teletran1 on 7/23/06, Rating: 0
By masher2 (blog) on 7/23/2006 6:05:05 PM , Rating: 2
> "Intel just threw that "best brand name" out the window."

But they *had* that brand during the period you mention. And they _still_ have the Intel name itself. They could call their next chip 'The Stagnator', and it'd still sell, as long as it had the Intel logo on the box.

A company can succeed if it has the best product, or if it has the best sales/marketing/distribution network. When you have *neither* of those...you're in a bad position.

> "You dont have to agree with people but dont be a pissy jerk."

Please accept my apologies if my reply upset you.





By osalcido on 7/22/2006 4:26:22 PM , Rating: 2
quote:
If it wasn't for Intel, you'd be speaking Chinese right now.


That's the most ridiculous fanboy statement I've ever heard. Congratulations


By Nanobaud on 7/22/2006 5:26:27 PM , Rating: 2
quote:
If it wasn't for Intel, you'd be speaking Chinese right now.


I do speak Chinese right now. Guess I didn't get enough Intel when I was growing up.


RE: Why is everybody killing AMD????
By ecktt on 7/24/2006 2:35:58 AM , Rating: 2
quote:
The Pentium IV was a flop from the start, Intel's new processor only corrects that. Where's HyperTransport??? where's the on-die memory controller and so forth??? I see a HUGE cache, now, let's move to applications that don't use cache very well.

You can't be serious, or you're just a n00b. While the IPC was low on the P4 when it came out, it had the high clock speed to more than make up for it and dominate anything in its time. Not to mention the later release of Hyper-Threading, which made better use of existing silicon and brought smooth desktop performance to the masses that was previously reserved for the rich. Granted, the AMD64 came out hitting hard, but the fact is that ever since the original Athlon came out, both Intel and AMD have been playing leapfrog with each other in terms of performance. FYI, even without an integrated memory controller and a point-to-point interconnect bus, Conroe seems to be performing quite well. If you ask me, since AMD has to use these 2 features to perform as well as it does, it just might suggest that AMD64 wasn't that good of a design in the first place. And as for the cache, you obviously have done very little programming (if any).


Trying to step out of the shadows?
By lemonadesoda on 7/21/2006 7:42:45 PM , Rating: 2
AMD must be really feeling the pressure from Intel. Good luck in keeping up the spirit of competition!




RE: Trying to step out of the shadows?
By bob661 on 7/22/2006 2:15:31 AM , Rating: 4
quote:
AMD must be really feeling the pressure from Intel. Good luck in keeping up the spirit of competition!
Do you really think AMD just came up with this shit because of Conroe? You think someone spent a weekend at work after the Conroe launch and pulled K8L out of their ass on Monday morning?


RE: Trying to step out of the shadows?
By othercents on 7/22/2006 2:38:16 AM , Rating: 4
quote:
Do you really think AMD just came up with this shit because of Conroe? You think someone spent a weekend at work after the Conroe launch and pulled K8L out of their ass on Monday morning?


Actually, I think AMD has always had the designs, but they have been keeping them away from customers until Intel finally caught up with them. Now that Intel has, they are going to pull the designs out like some sort of new invention that is supposed to revolutionize the industry. They will also start producing around the time that Vista comes to market, making K8L, along with the latest ATI video card and chipset, the best solution to buy for Vista.

Other


RE: Trying to step out of the shadows?
By DallasTexas on 7/22/06, Rating: -1
RE: Trying to step out of the shadows?
By stmok on 7/22/2006 10:21:44 AM , Rating: 1
You really have to get off your high horse.

What's this nonsense about "marketing hype" of the integrated memory controller and HyperTransport?

Intel is going to adopt a similar implementation to HyperTransport. It's called CSI.

And what about the integrated memory controller? Intel has done that before, and they screwed up. The project/CPU was called "Timna", and it used an integrated memory controller for RDRAM. ("Timna" was the cancelled predecessor to "Banias".) Both designs were done by the same Israeli Intel team. With "Banias", they used the PIII-S as the basis and expanded on that. If you really think about it, Conroe is a descendant of the P6 (Pentium Pro, PII, PIII, Pentium-M, Core Solo/Duo).

Intel has improved the instructions per clock and the prefetching such that Conroe is the "delaying mechanism" for future Intel CPUs that will integrate CSI and possibly a memory controller. (As in, they've done enough to keep AMD back for now, while their R&D works on something new and experiments with new ideas.)

Even an Intel engineer admitted an integrated controller is a good idea. (They aren't sure if they're really gonna need it yet...But they are considering it in a future design).

AMD's HyperTransport or Intel's CSI is a good idea as we increase the number of cores into the future. If you've bothered to look at the 4-core-per-CPU market, you'll see Intel's current bus design has a bottleneck in this scenario. (And you'll see a situation where Conroe-based solutions will be held back by the bus design.)

That's why the Xeon version of Conroe will have dual independent buses (one for each dual-core CPU)! And it's also why AMD is very likely to hold onto the 4-core-per-CPU market until K8L comes.

You really should get a good feel for CPU history and upcoming ideas before blabbering nonsense to make yourself feel better about Intel. Because right now you sound like a fool with no clue.


By masher2 (blog) on 7/22/2006 1:29:02 PM , Rating: 2
> "Even an Intel engineer admitted an integrated controller is a good idea"

Actually, several Intel engineers have 'admitted' what anyone in the industry already knew: that an integrated memory controller is a classic tradeoff scenario... with good points and bad. Intel has toyed with them several times, but decided the benefits weren't worth the disadvantages. AMD decided the opposite. And-- given the differences in the architecture of each-- it may well be that BOTH companies made the right decision.


RE: Trying to step out of the shadows?
By masher2 (blog) on 7/22/2006 1:26:12 PM , Rating: 1
> "The new "native quad" is exactly that - a marketing stunt, as if anybody cares how the quad sausage is made. Most people don't care if it's 4 sausages linked together or one big turd"

Colorful language... but your metaphor is essentially correct. Intel announces plans to deliver a quad core this year, so AMD has to respond in some fashion.


RE: Trying to step out of the shadows?
By Viditor on 7/23/2006 12:26:29 AM , Rating: 4
I think you guys need to understand what it takes to bring a processor into being...
AMD's Native Quad Core has been in development for almost 4 years now!
It was even talked about in the press before Conroe was in Intel's roadmap at all...


RE: Trying to step out of the shadows?
By masher2 (blog) on 7/23/2006 10:17:36 AM , Rating: 3
> "AMD's Native Quad Core has been in development for almost 4 years now! It was even talked about in the press before Conroe was in Intel's roadmap..."

I know that, but action is what counts in the market, not talk. AMD's 'demonstration' this year is being prompted by Intel moving up its quad core release to Q406.


By Viditor on 7/23/2006 7:52:22 PM , Rating: 2
quote:
AMD's 'demonstration' this year is being prompted by Intel moving up its quad core release to Q406

Why do you say that? The timing seems fairly normal to me...


By MrSmurf on 7/23/2006 5:36:42 PM , Rating: 2
quote:
Actually I think AMD has always had the designs, but they have been keeping them away from the customers until Intel finally caught up with them. Now that Intel has, they are going to pull the designs out like some sort of new invention that is supposed to revolutionize the industry. They will also start producing around the time that Vista will come to the market, making K8L, along with the latest ATI video card and chipset, the best solution to buy for Vista.

And you honestly think Intel did the same thing? AMD knew about Conroe and its theoretical performance. Who is to say this isn't their answer? It sure isn't you.


By Calin on 7/27/2006 4:05:05 AM , Rating: 2
You are so wrong. AMD developed the Athlon 64 line for a looong time. It was first known as Sledgehammer (that would be the server version, I think), then a new line called Clawhammer (desktop); then they had working samples, then samples running at some 800MHz, and in the end they launched the Clawhammer (Athlon 64), then the Sledgehammer.
There is no reason to think K8L will take much less time to develop. If AMD had an answer to Conroe, they would have put it out by now, in order to regain the performance crown or at least to tie at the high end.


By lemonadesoda on 7/22/2006 7:51:57 PM , Rating: 2
Read the article
quote:
Earlier AMD roadmaps have revealed that quad-core production CPUs would not utilize a native quad-core design until late 2007 or 2008
AMD is pulling out EVERY STOP in order to have a competitive product against Intel this year. And I bet they will be working every weekend in order to make that happen.


RE: Trying to step out of the shadows?
By tuteja1986 on 7/23/2006 12:56:31 AM , Rating: 2
I wonder if AMD has any plans for a graphics processor core in its quad core, if they end up buying ATI, that is.


By Zirconium on 7/23/2006 9:17:14 AM , Rating: 2
No. They don't. Not this quad-core processor at least.


By gorka on 7/24/2006 6:00:50 AM , Rating: 2
"Do you really think AMD just came up with this shit because of Conroe? You think someone spent a weekend at work after the Conroe launch and pulled K8L out of their ass on Monday morning?"

hehe...so true! People are making Conroe out like it's the second coming of Christ.

Each stage on a roadmap costs either company billions of dollars. These are hardly 'reactionary' steps chipmakers take merely to offset a single chip offering from the competitor.

...I'm waiting for someone to say the AMD-ATI merger was caused by Conroe as well. :-)


RE: Trying to step out of the shadows?
By JumpingJack on 7/22/2006 3:45:09 AM , Rating: 2
I tend to agree, they need to demonstrate something... Ever since Intel put on the dog and pony show at IDF, people have been searching for what AMD has to counter, and have essentially come up empty-handed. Sure, there have been a lot of leaked roadmaps and a June analyst meeting with pretty PPT slides of future plans, but we haven't really seen a booted and working 65nm chip... They are likely under some pressure (from analysts, perhaps) to start showing some goods and stop with all the talk.

Intel has already demonstrated working quad-core server and DT parts; glued together or not, it has hit because they were real...


RE: Trying to step out of the shadows?
By Viditor on 7/23/2006 12:40:14 AM , Rating: 3
quote:
Intel has already demonstrated working quad-core server and DT parts; glued together or not, it has hit because they were real...


AMD also demonstrated Quad Core at IDF...


By Knish on 7/24/2006 7:20:46 AM , Rating: 2
Link or shens? I must have missed this one


Blah blah blah pressure
By Regs on 7/21/2006 9:18:48 PM , Rating: 3
I hear this so many times: "AMD better have something innovative for Kentsfield, Core Duo, etc., etc." You think they'll be demonstrating a crapshoot later this year? My concern wasn't whether they had something competitive to show, but when they had something competitive to show.




RE: Blah blah blah pressure
By Regs on 7/21/2006 9:20:24 PM , Rating: 3
And that photo kind of reminds me of the guy who starred in the movie "The 40-Year-Old Virgin".


RE: Blah blah blah pressure
By Xenoid on 7/21/2006 9:23:08 PM , Rating: 2
I thought that too! haha


RE: Blah blah blah pressure
By brownba on 7/21/2006 10:03:13 PM , Rating: 2
is it not?


RE: Blah blah blah pressure
By TomZ on 7/22/2006 2:31:18 PM , Rating: 2
I thought he looked like Jim Carrey with a haircut:

http://www.imdb.com/gallery/granitz/4545/Events/45...


That's great and all...
By Engine of End on 7/21/2006 9:38:29 PM , Rating: 3
But we need more multi-threaded software. As much as I would want a native quad-core processor, there has to be a point in getting one. Getting better times in SuperPi is not a reason to purchase a new processor.

The juice has to be worth the squeeze.




RE: That's great and all...
By Exodus220 on 7/22/2006 4:21:10 AM , Rating: 2
I agree with you on this front. Look at the release of 64-bit processors from AMD and how great that was supposed to be. Intel was behind in its release of x64, but eventually it came. However, I have yet to see it prove worthwhile or become widely supported. So big deal if they release quad-core CPUs when there isn't the software to accommodate them. Oh, and who is going to benefit from a quad-core anyway?


RE: That's great and all...
By Targon on 7/23/2006 12:55:41 PM , Rating: 1
It's not only about multi-threaded software at this point. If you look at all the various things that run behind the scenes in Windows XP, and figure it will be WORSE with Vista, dual and quad core systems/processors will almost be needed just to handle all the bloat in MS Windows.



RE: That's great and all...
By masher2 (blog) on 7/23/2006 10:49:42 PM , Rating: 2
> "If you look at all the various things that run behind the scenes in Windows XP...dual and quad core systems/processors will almost be needed just to handle all the bloat in MS Windows."

My dual-core desktop runs, sans an active application, at 2% or less cpu utilization. That's with background virus checking, email, 2 networks, and a whole host of other processes. I don't see Vista being significantly worse. The idea that you'll need to devote an entire core to 'Windows bloat' is sheer nonsense.



RE: That's great and all...
By ecktt on 7/24/2006 2:13:48 AM , Rating: 2
I agree.


Innovation needed for AMD!!!
By kitchme on 7/21/2006 8:48:59 PM , Rating: 1
AMD needs to come up with something very innovative, and very soon, if it wants to keep its share of the market. Intel's Conroe, and quad core later this year, definitely put major pressure on AMD. On AMD's current plans, it will match Intel's chips far too late... By that time, Intel will be too far ahead.




RE: Innovation needed for AMD!!!
By Genetics on 7/21/2006 8:57:14 PM , Rating: 2
Not necessarily... by that theory NVIDIA would be far gone by now. AMD just needs to stay somewhat competitive until its next major design can come out. The A64 series has given AMD fresh life and much-needed financial resources. Here's to competition, though.


RE: Innovation needed for AMD!!!
By cgrecu77 on 7/21/2006 9:15:41 PM , Rating: 4
The corporate world puts little value on absolute performance; it mostly cares about brand and the cost/performance ratio. AMD finally broke into the corporate world these last 2 years, and now they'll be there to stay. As long as they can offer competitive products, they don't need to have the absolute best. The most important thing for AMD is to make the switch to 65nm as soon as possible; until then they will lose money heavily because of the price war (Intel's CPUs probably cost a little less to make, plus they're faster).


Conroe "needs" a memory controller?
By Anemone on 7/23/2006 9:42:36 AM , Rating: 4
Quad core in development for 4 years? Probably so. But K8L in development for that long? Doubtful. If so, they'd have a working sample now. And if they did, they'd have shown everyone who'd listen what it could do and how long until we'd have it.

Now, I do know how long this race has been going on, and yes, I do very much believe that K8L was the fastest answer to Conroe. They've known what Conroe was likely to do for almost 2 years now. Everyone knew it was a leap beyond Yonah, everyone knew Yonah was a mild leap beyond dual P-Ms, and that was a leap beyond the P3. And if you go back far enough, you'll remember that the old P3 vs. Athlon race never really died; it just stepped to the sidelines for a breather.

AMD is no dummy as a group, but they didn't exactly invent the on-board memory controller (thank you, Alpha). There are tradeoffs to the technology. But that's not what I came here to note.

AMD is behind partly because they expected Conroe to be a faster-clocked dual-core P-M, which the high-end A64s could have matched clock for clock. They've been caught resting on their laurels, which is something AMD has done before. They come up with a good, superior level of performance at a given price point, and then suddenly "advances", whether speed upgrades or process technology, slip from 6 months out to 12 months out. Just look back and you'll see that each time they've been ahead it has been followed by a target slip, where sometimes they fall behind and sometimes they don't. It appears they take a breather after each rush to leap forward. That's natural, and certainly something other industries see as well.

However, in this case it came at a bad time, and it is compounded by a misjudgement of what Intel's next generation was likely to be. Intel's chip is better than almost everyone thought it would be, and it clocks higher than almost everyone thought it would. It doesn't trip over high-latency memory accesses, and a quad-core version running in excess of 4GHz is already sitting in testers' hands. Even if a 3.2GHz quad-core K8L comes in reasonably well, the Athlon itself isn't ready to scale past 3.6GHz or so, even at 65nm. They'll manage a few process miracles and perhaps get to 3.8 or 4.0 by the end of 2007 or mid-2008, but that is barely on the edge of what is possible.

Meanwhile, you have chips right now that can reach 4GHz on air. Can you even conceive what these chips might do on a well-underway 45nm process in late 2007? At that point, why bother with CSI? Just tap all 4 cores onto the same 8MB dynamic cache at 45nm and run it at 4.5-5GHz. Need more memory bandwidth? If quad pumping can live that long, just move the server solution downstream and put dual 1333MHz buses onto the chip. Conroe sips memory bandwidth very gently and hasn't been badly starved by the current structure yet. It will need more at quad core, but I think folks vastly overestimate how much more it will need. Dual and quad memory buses would be the answer for a year or a year and a half, and then you could add a built-in controller once you had a mature 45nm process, or as you moved to the next level. At that point you'd be wondering what to do with all the die space.

Is AMD out of the game? No, not really. Did they misjudge how the Intel response was likely to perform? Yes, absolutely. Do they have designs that will help address this? Yes, if nothing more than taking a few of their own and copying some of the ideas that went into Conroe. You think Intel would be the only one to copy AMD's 64-bit and built-in memory controller concepts? Nope! It works both ways for both parties. But don't think you are going to see a sudden miracle from quad core. It's going to be expensive, and right now, at least to the paying customer (even if Intel is underpricing), Conroe is really cheap. So AMD has to be able to offer a 3.4GHz quad core that can compete with a $999 quad-core E6700 in the form of Kentsfield. And it has to be shipping in Q1'07. I don't know about you, but if I believed I was going to ship a product in Q1'07, I would probably have test samples in the field no later than September '06. If you are a wagering person, you can offer up the odds on that happening.

I think we'll just see.




AMD needs to do something
By nerdboy on 7/21/06, Rating: 0
RE: AMD needs to do something
By Nehemoth on 7/21/2006 11:17:55 PM , Rating: 1
Somebody remember this roadmap??
http://www.vr-zone.com/?i=2328&s=1

As you can see, AMD was supposed to release a processor with graphics and also with PCI Express controllers.
Is that the ATI thing?


RE: AMD needs to do something
By bob661 on 7/22/06, Rating: 0
RE: AMD needs to do something
By masher2 (blog) on 7/22/2006 1:30:30 PM , Rating: 2
> "You win the dumb ass of the week award!!! Your prize is a tin foil hat with cologen breasts on it! "

Man...I was really hoping for that :(


better for now?
By rykerabel on 7/24/2006 5:56:16 PM , Rating: 3
Well, for now and for me, the answer is to just upgrade what I have. With these price cuts, I can junk my A64 3000+ and pop in an A64 4800+ for a steal. Huge improvement for only $300. Intel is not compelling for me because I'd have to pay $300 (CPU) + $200 (mboard) + $400 (4GB DDR2) = $900 total just to get the same performance from Intel, or blow $1500 for the top performance available.

My point: for the next 6 months, current AMD users will stay with AMD for upgrades because it is way more "bang for the buck". After that, though, upgrades will call for mboard/memory upgrades to get any higher, and then AMD had better be ready.

In 6 months AMD has gotta do it, or be truly back in the underdog seat.




Learn to Love the Competition
By techhappy on 7/22/2006 3:05:54 AM , Rating: 2
We all need to learn to love the competition. I have zero qualms with Intel or AMD. The ensuing months of competition will only drive prices down and innovation forward as each company plays leapfrog to become the performance-vs.-price champ.

It is true that Intel has been playing unfairly for a long time, and that AMD rightfully earned its lead over the last 3 years. But I'm sure we can agree that Intel is finally doing the right thing by putting the focus on developing better tech, as evidenced by Conroe and other developments in the works.

I personally hope AMD sticks around, because if it weren't for them nipping at Intel's heels, what other catalyst would Intel have to get its butt in gear and make better tech? The AMD64 line has put excitement back into the PC world, pushing forward more momentum for PC gaming and multimedia power-driven applications.

I look at this time as a new renaissance for an industry that was languid not long ago. Bring out the multi-cores and start the pricing battle. In the end, we can only benefit from the fruits of this competition.

Let the battle begin....




Feel the pressure
By MyK Von DyK on 7/22/2006 6:21:45 AM , Rating: 2
Pipeline Question
By Ringold on 7/23/2006 5:05:28 PM , Rating: 2
It seems to me that AMD and Intel, at any given time, have a few products in reserve just in case one competitor has a home run and competitive performance must be maintained. Good business sense, obviously. But given that processors take years upon years to develop, and given that Intel seems to be dumping everything it's got in its product pipeline in the space of about a year, while at the same time throwing its margins out the window in a price war....

Does this mean that a year or two down the road, progress will look downright stalled, aside from minor speed bumps, almost like the last year has been? Or do you think AMD and Intel will either be able to speed projects up, or have enough still near completion, to keep a good stream of next-generation parts and good revisions of current chips coming? Does the loss of profit margins portend bad things for AMD, with higher R&D costs and without Intel's war chest? Or is AMD more resilient than it appears?

Either way, I'm happy. Price wars, as long as they don't kill AMD, are good. :) Bumping up my upgrade cycle with every cost reduction.



