
AMD's tent event was held on the hottest day of 2007 thus far

Two AMD Phenom FX processors in action, coupled with a lone Radeon HD 2900 XT (Source: Corsini)

AMD's "Hammerhead" RD790 reference design (Source: AnandTech/Anand Lal Shimpi)
AMD demonstrates 'Barcelona,' 'RD790' and 'R600' in California

AMD today held a press event in Sunnyvale, California, demonstrating its next-generation platforms and technologies, including upcoming AMD K10 Phenom processors, RD790 chipsets and ATI Radeon HD 2900 XT graphics cards.

AMD demonstrated its upcoming Phenom FX processors in a Quad FX configuration. The Quad FX configuration consisted of an RD790-based motherboard with two quad-core Phenom FX processors, for a total of 8 cores, and an ATI Radeon HD 2900 XT.

Clock frequencies of the Phenom FX processors were not disclosed. "AMD failed to go into more detail regarding the technical specifications of the processors. The company only went as far as to say that the operating frequency of the processors used in the dual-socket system was more than that of the single-socket one," claimed Paolo Corsini, senior editor for Italian publication HWUpgrade.IT.  Recent guidance from AMD claims the Phenom FX processors range from 2.2 GHz to 2.6 GHz.

Industry analyst Anand Lal Shimpi claims, "AMD's reasoning for not disclosing more information today has to do with not wanting to show all of its cards up front, and to give Intel the opportunity to react."  Shimpi continued, "[I] still don't believe it's the right decision, and we can't help but believe that the reason for not disclosing performance today is because performance isn't where it needs to be, but only AMD knows for sure at this point."

The ad hoc demonstration paired Agena FX with ATI's newest flagship, the Radeon HD 2900 XT, codenamed R600. May 14 will mark the actual R600 launch date, though vendors were allowed to show non-CrossFire configurations of the cards.

The motherboard used in the Quad FX demonstration was AMD's own RD790-based Wahoo reference board. Wahoo pairs AMD's upcoming RD790 with the existing SB600 south bridge and features three PCIe 2.0 x16 slots. AMD does not intend to sell Wahoo under its own name; however, the company expects to offer the complete reference design to manufacturers, similar to the Designed by NVIDIA motherboards.

AMD also plans to offer a Socket AM2+ RD790-based Hammerhead reference board. Hammerhead offers HyperTransport 3.0, four PCIe 2.0 x16 slots, one PCIe x1 slot and two PCI slots.

Expect AMD to unveil its ATI Radeon HD 2900 XT next week, while RD790 and Phenom processors may wait until as late as Q4 2007.  AMD's latest guidance suggests RD790 and K10 will launch simultaneously.

Intel expects to unveil its Skulltrail Quad FX-competitor in the second half of 2007.


In the old days.
By Misty Dingos on 5/11/2007 8:17:40 AM , Rating: 5
You had a motherboard, ONE processor, a video card and some other stuff you wanted in the case.

Ten years from now (if you can afford it) a top end PC will have a four or eight socket MB with eight to sixteen cores apiece, four video cards, two physics cards, two HDTV tuner cards, a twenty four channel wireless multi-gigabyte communication port, a Blu-ray drive and an HD-DVD drive (because two standards are really better than one), one terabyte of OctoDDR RAM (on two sticks), four hard drives spinning at 15,000 RPM with a storage capacity of ten thousand terabytes apiece, and 25.1 sound from one sound card from Creative Labs. It will require two dedicated 220V lines directly to the power distribution center and a dedicated HVAC unit to keep it cool. Just so you can play WOW VIII or chat with someone in the house next to yours. It will also cost as much as a new F-22 with pin striping.

OK, perhaps my sarcasm has gotten the better of me. But I cannot be the only one who wants better, not more. A custom made PC can now cost more than $10,000! Of all the choices in your life, you should not include the following. "Will I buy a car or a PC?"

AMD, Intel: I want better, not more. If you can't make the PC better without adding more CPU sockets, multiple graphics cards, hard drives that seem to breed like rabbits, and an ever increasing power demand that may require your own power plant, well, then you need to go back to the drawing board and start the BLEEP over.

RE: In the old days.
By sixth on 5/11/2007 8:56:35 AM , Rating: 2
Couldn't agree more. It's freaking ridiculous that every time I add another server to my rack I have to make sure I have enough juice to run everything: the added heat output, the change in A/C, how much more power is going to be consumed, etc. Heat output has gone UP. I don't care what the Intel/AMD guys say, processors are just as hot if not hotter than in the past, and it isn't getting any better.

RE: In the old days.
By kumquat on 5/12/2007 10:42:21 AM , Rating: 2
The heat thing is just incredibly... -not true-. A Pentium 3 (not even a Xeon with gobs of cache) was in the 25-35W range (after we moved to 0.18 micron, mind you, not with the .25 Katmai/etc stuff that most of the Xeon chips at the time were based on). Now I can go out and buy a -DUAL- 1.9GHz chip that only consumes 65W for the pair, tops. And keep in mind that AMD hasn't even released 65nm HE and EE server chips. Still think things are getting hotter?

RE: In the old days.
By Justin Case on 5/17/2007 5:23:20 PM , Rating: 2
Really? Why do you buy new CPUs then? Keep using the old models, if you think those give you better performance per watt...

RE: In the old days.
By punko on 5/11/2007 9:34:32 AM , Rating: 5
A custom made PC can now cost more than $10,000! Of all the choices in your life, you should not include the following. "Will I buy a car or a PC?"

Just for a point . . .

While I was in 3rd year University, a 386 machine was $10,000.

This was in 1988; using published inflation figures, that is approximately $24,300 in 2007 dollars.

Not only could you buy a car for that in 1987, you could buy two brand new cars. $10k in 2007 gets you a crappy used car or a down payment on a new one.

We've never had it so good.

And don't get me started about programming PDP-11's with punch cards . . . damn kids these days . . . GET OFF THE LAWN AND TURN DOWN THAT MUSIC

RE: In the old days.
By LatinMessiah on 5/11/2007 1:15:24 PM , Rating: 3
. . . damn kids these days . . . GET OFF THE LAWN AND TURN DOWN THAT MUSIC

Make us, Gramps!

Seriously though, I'm glad I live in a time in which everyone can actually own their own PC. My brothers and I had to share one computer back in 1995 and it was anything but a personal computer. As newer technology pushes existing and older technology back, prices drop and availability increases, making it possible for anyone to own one or more personal computers.

Raise your mice and rejoice, fellow PC'ers!

RE: In the old days.
By punko on 5/11/2007 2:12:50 PM , Rating: 4
Make us, Gramps!

Don't make me get out my cane :)

Hopefully not Gramps for another 25 years!

Mental note: Make sure I teach the boys about "free milk and a cow"

RE: In the old days.
By Tsuwamono on 5/11/2007 2:21:53 PM , Rating: 2
lol, don't tell your wife about that one if she's anything like my wife

RE: In the old days.
By ajfink on 5/11/2007 11:49:08 AM , Rating: 3
I want an F-22 with pin striping...

As Anand said, AMD told him they were keeping things toned down so as not to get Intel rolling even harder against them. If indeed AMD has had a recent silicon breakthrough and they can clock their quads (not just the dual-cores) to 2.9GHz effectively and coolly, then I think I'd rather have a dual-core somewhere at 3.3GHz until software catches up with the threaded world.

RE: In the old days.
By lumbergeek on 5/11/2007 3:21:37 PM , Rating: 4
I want an F-35 with pin-striping. The F-22 Raptor does not have VTOL, so I would need a runway at home whereas with a VTOL F-35 I could fly to and from work and not need to worry about traffic congestion.....

RE: In the old days.
By bubbacub616 on 5/11/2007 5:23:56 PM , Rating: 2
ahh god i love being a geek

RE: In the old days.
By Natfly on 5/11/2007 12:06:07 PM , Rating: 5
In the old days you had:

Video Card,
Sound Card,
Network Card,
IDE Controller Card,
maybe an addon USB card.

Hell, it used to take two video cards to get dual monitors, but now all mid-to-high-end cards have two DVI outputs and S-Video as well.

Now almost all of that can be had on a single motherboard. The only card I have in my PCs is a video card (with a TV tuner in just one of them). More and more technologies come out, but at the same time they integrate stuff together.

RE: In the old days.
By Armorize on 5/11/2007 5:25:58 PM , Rating: 2
Enough said there. Congrats, you are now the voice of about 1 in 5 people on the interweb.

Except they won't be hard drives anymore; they will be SSD or holographic/light-based drives. Which will add another $2,000 for 10 Gb, yes Gb not GB.

Then of course there's the car PC, which is slowly but surely sneaking its way into cars, so you might be paying about $50k more to have your Geo Metro with quad-CPU-GPU-PPU and about 10 other acronyms!!1

Something wrong?
By david1983xtc on 5/11/2007 6:37:07 AM , Rating: 3
How can that 2900 XT work if it only has one 6-pin power connector attached? And it seems that the 6-pin connector is plugged into the 8-pin socket...?

RE: Something wrong?
By KristopherKubicki on 5/11/2007 6:42:02 AM , Rating: 2
LOL nice catch.

RE: Something wrong?
By James Holden on 5/11/2007 7:09:18 AM , Rating: 2
I'm told the 8-pin connector is backwards compatible with 6-pin connectors... but you still need 2 power rails into the card. The image only shows one.

RE: Something wrong?
By Mitch101 on 5/11/2007 9:24:38 AM , Rating: 2
It will work with the 6-pin power connection but won't allow for overclocking unless the 8-pin power is connected.

RE: Something wrong?
By Habu on 5/11/07, Rating: -1
RE: Something wrong?
By Griswold on 5/11/2007 7:40:33 AM , Rating: 2
So you're saying the DT crew was lying to us with fakeware benchmarks the other week? :p

RE: Something wrong?
By Tsuwamono on 5/11/2007 2:18:22 PM , Rating: 1
You're a moron. That's all I have to say.

RE: Something wrong?
By James Holden on 5/11/2007 7:10:12 AM , Rating: 3
LOL no wonder there's no benchmarks!

RE: Something wrong?
By Griswold on 5/11/2007 7:36:23 AM , Rating: 2
Yes, you got it wrong. The second plug appears to be for the Overdrive (overclocking) functionality, which requires both plugs connected to power. For stock speed you won't need that. That's what has been floating about for quite some time now, and the demonstration at hand obviously proves it.

As to why the 6-pin one is connected to the 8-pin... no idea, but it apparently still works.

RE: Something wrong?
By david1983xtc on 5/11/2007 7:53:55 AM , Rating: 2
Sorry if this has been asked before... but is 2800xt designed to work in pcie2.0 slots?

If so, this would explain why it only has one 6-pin power connector; PCIe 2.0 can deliver more power than normal PCIe, right?

RE: Something wrong?
By david1983xtc on 5/11/2007 7:54:32 AM , Rating: 2
Sorry 2900xt :)

RE: Something wrong?
By Creig on 5/11/2007 8:05:37 AM , Rating: 3
The RD790 Wahoo board has a PCI-E 2.0 slot which can deliver 150 watts on its own. That plus a single 75 watt 6-pin PCI-E 1.0 equals a max supply of 225 watts.

To get that much power on a system available right now you need a slightly different configuration. Specifically, one PCI-E 1.0 slot at 75 watts and two 6-pin PCI-E 1.0 connectors at 75 watts each for a grand total of 225 watts.
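The two 225-watt budgets above work out to the same total; a quick sketch (the per-slot and per-connector wattages are the spec figures quoted in this comment, taken as assumptions here rather than measured values):

```python
# Power-budget sketch for the two configurations described above.
# Wattages are the PCI-E figures quoted in the comment (assumptions).
PCIE1_SLOT_W = 75    # PCI-E 1.0 x16 slot
PCIE2_SLOT_W = 150   # PCI-E 2.0 x16 slot, as claimed for the RD790 Wahoo
SIX_PIN_W = 75       # 6-pin PCI-E auxiliary connector

# RD790 Wahoo: one PCI-E 2.0 slot plus a single 6-pin connector
wahoo_budget = PCIE2_SLOT_W + SIX_PIN_W

# A current system: one PCI-E 1.0 slot plus two 6-pin connectors
current_budget = PCIE1_SLOT_W + 2 * SIX_PIN_W

print(wahoo_budget, current_budget)  # both reach 225 W
```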

RE: Something wrong?
By fake01 on 5/11/2007 8:15:20 AM , Rating: 1

First of all, the 2900 XT can work with only one 2x3 (6-pin) connector perfectly fine. The other 2x4 (8-pin) connector is for those who wish to overclock; it is NOT required to run the card at stock speeds. Though if you want the extra features like "optimal Overdrive overclocking," then it is required.

But I noticed that they have a 2x3 (6-pin) connector inserted into the 2x4 (8-pin) slot. Perhaps the 2x3 (6-pin) connector can work in both the 2x3 (6-pin) slot and the 2x4 (8-pin) slot.

By gramboh on 5/11/2007 6:50:17 AM , Rating: 2
Why is AMD not releasing any Phenom/R600 benchmarks? The only possible reason I can think of is trying to avoid stagnating sales of Athlon chips. But benchmarks would push up their stock price, they have to do it.

By Griswold on 5/11/2007 7:39:31 AM , Rating: 2
Benchmarks don't push stock prices... however, selling less of your current products will have a negative impact on it - there's still a quarter (revenue figures) between now and the launch.

By fake01 on 5/11/2007 8:28:36 AM , Rating: 2
They are releasing theoretical benchmarks. Intel didn't release proper benchmarks of their C2Ds before they came out, and yet they annihilated the competition. Perhaps AMD will do the same. If Intel truly knew what Barcelona (Phenom) was capable of, they would do everything in their power to overcome it; they have the money, power, and resources to do this. AMD can't afford another loss, so they are probably keeping most real benchmarks in their bulletproof safe, waiting until they release their products to us, and when we see the real benchmarks we will go "WOW," just like you all did with the C2Ds.

By coldpower27 on 5/11/07, Rating: 0
By raven3x7 on 5/11/2007 12:10:38 PM , Rating: 3
AMD never did release benchmarks before launch in the past; why would they now? Especially since their current lineup is already struggling because of poor marketing and low performance on the high-end parts. BTW, even if Intel knew right now, or had known for the past few months, that AMD would take back the performance crown, there is not much they could do despite their power and resources, other than try to accelerate their roadmap and hope that is good enough (sounds familiar?). You can't design a new processor in a few months. Finally, I'd like to point out that AMD held true to their commitment of making K10 AM2 compatible. So in retrospect, from a future-proofing point of view, AM2 was/is actually a better platform.

By coldpower27 on 5/11/07, Rating: 0
By raven3x7 on 5/12/2007 6:47:18 AM , Rating: 2
Yes, and Intel did pay the price for it. Namely, their Pentium D sales dropped quite a bit.
Upgrade to Penryn? Not without buying a new motherboard, I'm afraid.
As for AM2+ backwards compatibility, most people, I'm sure, haven't really noticed. And AMD's marketing failed to make the general public aware of this feature and its implications. Standard PowerNow will still work on AM2 mobos, BTW.
I do agree that AMD has the same performance at the same price currently. I disagree, however, that Intel gets sales due to brand recognition only. Most people actually believe that any Core 2 will outperform any X2, which (with the exception of media encoding) is clearly not true. Enthusiast websites have significantly contributed to this illusion. But in the end, it is again AMD's marketing division that should take most of the blame for this, really.
I will disagree with your final point as well. Most people would presume that Agena performance would be indicative of Kuma performance, and if it's really good it could potentially damage X2 sales significantly.

By coldpower27 on 5/13/2007 1:08:09 PM , Rating: 1
They knew they would take a hit to their own margins, but given their 65nm process advantage it didn't really matter quite as much. They were still quite profitable in the end. Intel's intention was to slow down AMD's momentum, and they did that; not to mention AMD itself helped this along with their AM2 platform cutting off their own 939 user base, as well as forcing people to switch to DDR2 just as Conroe was coming.

Not as many people upgrade as you think, so the backwards compatibility feature matters more for the DIY crowd than for anybody else. AMD's OEMs would want to place checkbox features on their products, and having HT 3.0 and the split-plane power management features would be a big plus here.

Standard Cool'n'Quiet won't be sufficient, as that only puts AMD and Intel at parity due to the hotter chipset on NVIDIA's side, and ATI chipsets haven't been of sufficient quality for me to consider on the performance or stability front.

I don't believe most people would be foolish enough to presume a quad's performance would be indicative of a dual's. There's nothing further to damage on AMD's side; AMD's next-generation architecture is coming out soon enough that people are only buying processors to tide them over till then rather than buying on the high end.

I said that given even performance, default priority goes to Intel; I didn't say that was the only factor in why Intel gets sales.

By bubbacub616 on 5/11/2007 5:27:17 PM , Rating: 2
The performance boost from C2D was pretty damn impressive. Just going on history alone, I think it's unlikely that we will get two such jumps in performance so close together. However, I will be VERY happy to be proved wrong.

By Mitch101 on 5/11/2007 9:34:58 AM , Rating: 1
It may have been that they didn't have the recent respin of silicon available at the press event. They also have some excess Athlon 64s still to get out of inventory.

Rumor is a recent silicon respin got the chips to 2.9GHz from the original 2.6GHz, and possibly even up to 3.1GHz on stock air cooling.

Another rumor is they are rebaking the R600 again, possibly because of the respin success of the K10 improvements, hoping to get a similar response from the R600.

Not sure what occurred in the lab, but supposedly AMD accidentally hit a home run and got a 10-20% boost in chip speed that I'm not sure they even anticipated.

10 years from now...
By DarkElfa on 5/11/2007 9:32:29 AM , Rating: 2
No, 10 years from now a PC will have one chip with 128 cores: 64 general-purpose and 64 specialized for things like graphics, sound and physics.

RE: 10 years from now...
By mcnabney on 5/11/2007 11:38:17 AM , Rating: 5
And people will buy them and just use them to get email, work on their MySpace page, and play solitaire.

RE: 10 years from now...
By thartist on 5/11/2007 4:40:34 PM , Rating: 2
People will buy that because Windows 2017 will require 96 cores to run, and because the computing market will have nothing "slower" than those processors. That's the way it is...

RE: 10 years from now...
By goku on 5/12/2007 5:22:08 AM , Rating: 2
And of course when you argue with people that windows is more bloated than ever, the pseudotechies will come out and say that "no, you're wrong" or "if it's not being used, it's wasted" etc.

cool, how bout some benchies??
By theteamaqua on 5/11/2007 6:07:06 AM , Rating: 2
cool, how bout some benchies?? divx, cinebench ...

RE: cool, how bout some benchies??
By Adsski on 5/11/2007 8:21:14 AM , Rating: 3
I've read somewhere that the K10 was demoed performing a video encode, apparently transcoding 1080p movie-trailer source material to H.264 in real time. This sounds impressive, but I don't know the codec of the source material, as the article didn't say; my best guess would be high-def MPEG-2 or WMV.

Anyone got any more details?

RE: cool, how bout some benchies??
By Mitch101 on 5/11/2007 9:47:00 AM , Rating: 2
I heard the same, and to give you an example: my dual-core Athlon 64 with 2 GB of RAM, overclocked to 4800+ levels, takes about 12-16 hours to do a movie in HD. While we don't know what exact HD format and settings they used, that makes it roughly 8 to 10.6 times faster than my config can do it. WOW!

That's recorded on my media center PC using an antenna for the HD feed. So no, I'm not copying HD movies.

RE: cool, how bout some benchies??
By moocow2 on 5/11/2007 7:42:53 PM , Rating: 2
I have seen inconsistent reports on different websites, but I think they were transcoding a 1080p recording into a 720p H.264 format on a dual quad-core Agena system. That would be less than half the encoding resolution while running four times as many cores.
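For rough scale, the frame sizes involved are pure arithmetic (assuming standard square-pixel 1920x1080 and 1280x720 frames; the demo's actual formats were never confirmed):

```python
# Pixel counts for the two resolutions discussed above.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels per frame
pixels_720p = 1280 * 720     #   921,600 pixels per frame

# A 720p frame carries roughly 44% of the pixels of a 1080p frame.
ratio = pixels_720p / pixels_1080p
print(round(ratio, 3))
```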

amd/ati merger
By ttnuagadam on 5/11/2007 11:34:25 AM , Rating: 2
I wonder if the ati/amd merger had anything to do with both companies getting a little behind...

RE: amd/ati merger
By JimFear on 5/11/2007 12:56:40 PM , Rating: 3
Sounds kinky :)

Motherboard molex connectors
By nbesheer on 5/11/2007 1:53:47 PM , Rating: 4
Is it safe to assume that the two molex connectors on the motherboard are for CrossFire, since they're both unplugged?

PS. My first post ever.

By zsouthboy on 5/11/2007 10:57:29 AM , Rating: 2
In fact, they should've gotten Yahoo to sponsor it somehow and tossed the Yahoo logo on it.

Instead, they just did a Slim Pickens reference.

That reminds me, that movie was awesome. Never before had I worried about my "essence."

By killerroach on 5/13/2007 9:55:34 PM , Rating: 2
Secondary problem: They don't want to repeat 3Dfx's demise.

It's not so much that they don't want to show Intel their projects; they don't want to show consumers until they can clear the channel of back inventory. Intel could survive having a glut of Pentium D chips that nobody wanted; AMD doesn't have that margin for error. If they create excess hype for a chip six months down the road and have millions of Athlon 64s that nobody wants to buy because everyone is busy waiting for Phenoms, well... AMD will go bankrupt before it can ship Phenom chips in volume.

"When an individual makes a copy of a song for himself, I suppose we can say he stole a song." -- Sony BMG attorney Jennifer Pariser
