
AMD drops the Phenom X2 TDP to 45-watts

AMD plans to cut the power consumption of its mainstream Phenom X2 processors with the new Phenom X2 GE-series, according to the company’s latest roadmap. The GE-series arrives in late Q1’2008, after the Phenom X2 GS-series makes its debut. AMD has three Phenom X2 GS-series models in the pipeline, with launches beginning in Q4’2007 and more models added in Q1’2008.

The new Phenom X2 GE-series matches the recently released Athlon X2 BE-series in thermal design power, or TDP, at 45 watts. AMD’s Phenom X2 GS-series processors carry 65-watt and 89-watt TDP ratings. The low-power Phenom X2 GE-series will have three models: the GE-6600, GE-6500 and GE-6400.
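To put the TDP cut in perspective, the savings versus each GS-series rating work out as a simple percentage reduction. A quick back-of-the-envelope sketch in Python, using only the wattage figures from the roadmap above:

```python
# TDP ratings from AMD's roadmap (watts)
ge_series_tdp = 45
gs_series_tdps = {"65W GS-series": 65, "89W GS-series": 89}

# Percentage reduction in thermal design power versus each GS rating
for name, tdp in gs_series_tdps.items():
    reduction = (tdp - ge_series_tdp) / tdp * 100
    print(f"GE-series vs {name}: {reduction:.1f}% lower TDP")
```

In other words, the GE-series shaves roughly 31 percent off the 65-watt GS parts and nearly half off the 89-watt parts.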

AMD’s Phenom X2 GE-6600 clocks in at 2.3 GHz; for comparison, the Intel Core 2 Duo E6600 clocks in at 2.4 GHz. The 2.1 GHz Phenom X2 GE-6500 sits in the middle of the lineup, while the 1.9 GHz Phenom X2 GE-6400 occupies the bottom.

AMD Phenom X2 GE-series

Model      Frequency   L2 Cache   L3 Cache   HT3 Bus
GE-6600    2.3 GHz     2x512KB    2MB        ~3200 MHz
GE-6500    2.1 GHz     2x512KB    2MB        ~3200 MHz
GE-6400    1.9 GHz     2x512KB    2MB        ~3200 MHz

All Phenom X2 GE-series processors share the same features. AMD equips the Phenom X2 GE-series with an HT3 bus running at or above 3.2 GHz, though AMD has yet to set the official HT3 bus speeds for these processors. In terms of cache configuration, Phenom X2 GE-series processors are identical to the GS-series: 512KB of L2 cache per core and 2MB of shared L3 cache.
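Adding up that cache configuration gives the total on-die cache per processor. A minimal sketch, using only the figures stated above:

```python
# Cache configuration shared by the GE- and GS-series (per the roadmap)
l2_per_core_kb = 512        # 512KB of L2 per core
cores = 2                   # dual-core part
l3_shared_kb = 2 * 1024     # 2MB of shared L3

total_cache_kb = l2_per_core_kb * cores + l3_shared_kb
print(f"Total on-die cache: {total_cache_kb} KB ({total_cache_kb // 1024} MB)")
```

That comes to 3MB of total cache per chip, with the 2MB L3 pool shared between both cores.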

Expect the AMD Phenom X2 GE-series to drop into Socket AM2+ platforms in Q1’2008.


RE: Amd is not bad
By soydios on 7/3/2007 1:22:11 PM , Rating: 3
Let me set some things straight.
For years, AMD's K8 architecture was beating Intel's Pentium 4 across the board: in raw performance, in power efficiency, in price, and in overclocking. Then, Intel released the Core 2 Duo, and left all of us with our jaws on the floor as Chipzilla beat AMD at their own game, in every possible fashion.

Intel is now dominant in the enthusiast market. The K8 architecture has run out of headroom for overclocking, yet overclocks of 50% above stock are not uncommon for Core 2 Duo. Intel's Conroe family is more efficient, clock-for-clock, in both power and performance than AMD's K8.

Only time will tell what Phenom will bring to the table. Current benchmarks and chip yields are somewhat disappointing, but keep in mind that this is exactly where the Athlon 64 was when it was introduced. I'm going to wait and see what they look like after production ramps up in the first quarter of next year.

True quad-core versus a pair of dual-core dies only matters in theoretical debates or in synthetic benchmarks designed to test core-to-core latency. In real-world benchmarks and usage, the difference is negligible.

AMD's fabs are running at 100% capacity. They aren't going away anytime soon. I really hope that my next system in two years will have an AMD processor in it, for their sake, but I'll buy whatever gets me the best performance/price ratio.

RE: Amd is not bad
By bolders on 7/5/2007 7:26:37 AM , Rating: 2
Good post with a balanced perspective. Don't know why you have been rated down :(

Related Articles
AMD Prepares 45-watt "Brisbane"
May 30, 2007, 1:00 AM
Here Comes "Conroe"
July 13, 2006, 12:47 PM
