
Intel announces energy-efficient quad-core processors that consume 50 watts of power

Intel today released two new energy-efficient quad-core Xeon processors for multi-processor servers. The new Intel Xeon L5320 and L5310 operate at 1.86 GHz and 1.60 GHz, respectively. The energy-efficient Xeon models consume 50 watts of power, which translates to 12.5 watts per core. Intel’s regular quad-core Xeon 5300-series processors consume 120 watts of power.
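As a rough back-of-the-envelope check (a sketch only; it assumes the package TDP divides evenly across cores, which is a simplification of how real chips budget power), the per-core figures work out like this:

```python
# Per-core power, assuming the package TDP splits evenly across four cores.
def watts_per_core(tdp_watts: float, cores: int = 4) -> float:
    return tdp_watts / cores

print(watts_per_core(50))   # energy-efficient L5320/L5310: 12.5 W per core
print(watts_per_core(120))  # regular quad-core Xeon 5300-series: 30.0 W per core
```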

The energy-efficient Intel Xeon L5320 and L5310 processors are nearly identical to their standard counterparts. The energy-efficient models have 8MB of total L2 cache (4MB of shared L2 per pair of cores), as with other Xeon 5300-series models. The front-side buses of the Xeon L5320 and L5310 are clocked at 1066 MHz, the same as the standard Xeon E5320 and E5310.

Pricing for the energy-efficient Intel Xeon L5320 and L5310 is $519 and $455 in quantities of 1,000, respectively. Intel Xeon L5320 and L5310 processors are drop-in compatible with Intel’s Bensley server platform.


Comments



Wow thats impressive !!
By ButterFlyEffect78 on 3/12/2007 4:59:16 PM , Rating: 2
Now imagine how low in wattage the upcoming 45nm parts would be at those speeds... hmmm, I'm thinking 5 watts per core. Anyone?




RE: Wow thats impressive !!
By Axbattler on 3/12/2007 5:08:31 PM , Rating: 4
That'd be great. Improvement in CPU power efficiency is always good, but above all I'd like ATI/nVidia to follow suit.


RE: Wow thats impressive !!
By smitty3268 on 3/12/2007 5:32:22 PM , Rating: 2
A 60% decrease in power seems unlikely from 45nm alone.


RE: Wow thats impressive !!
By HaZaRd2K6 on 3/12/2007 5:42:07 PM , Rating: 2
Slightly, yes. But we could conceivably see 9-10 watts per core (totaling 36-40 watts).


RE: Wow thats impressive !!
By theteamaqua on 3/12/2007 8:24:32 PM , Rating: 3
Well yeah, I do hope that ATI and nVidia follow what Intel has been doing with Core 2: lower power consumption yet better performance...

The R600 can reportedly draw as much as 270W... to be honest, I don't think nVidia or ATI will fix that; GPUs will keep eating more and more power.


RE: Wow thats impressive !!
By theteamaqua on 3/12/2007 8:31:18 PM , Rating: 2
Well, by eating more power I mean when a new architecture comes out. For example, G80 on 80nm will definitely have lower power consumption than the 90nm ones, but when G90 comes out on 80nm or R700 on 65nm, chances are the power envelope will rise again.


RE: Wow thats impressive !!
By Hare on 3/13/2007 3:07:47 AM , Rating: 4
quote:
R600 can draw as much as 270W ...
Until we see actual measurements I won't believe these figures.

Example:
- Intel Core 2 Extreme X6800 (2.93GHz/4MB)
- 8800GTX
- Bells & Whistles
--> Load consumption: 230W (the whole machine)

I'm having a hard time believing that the R600 would have >double the power requirements.


RE: Wow thats impressive !!
By Hare on 3/13/2007 3:24:11 AM , Rating: 2
This was supposed to be a reply to the person above. Something is wrong with the comment system. Someone just posted to another article just because he had another tab open? Javascript problems with ID's?


RE: Wow thats impressive !!
By theteamaqua on 3/13/07, Rating: 0
RE: Wow thats impressive !!
By Hare on 3/13/2007 6:53:32 AM , Rating: 3
That measured power consumption in the Anandtech article is from the wall socket. Multiply that by the efficiency of the PSU (around 80%):

280W * 0.8 = ~230W
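A minimal sketch of that conversion (the 80% figure is an estimate, not a measured value; PSUs of this era land roughly in the 70-85% range depending on load):

```python
# Convert a wall-socket (AC) reading into the DC power the components draw.
def dc_power(wall_watts: float, efficiency: float = 0.80) -> float:
    # efficiency is an assumed PSU efficiency, not a datasheet number
    return wall_watts * efficiency

print(dc_power(280))        # ~224 W of DC power for a 280 W wall reading
print(dc_power(280, 0.75))  # ~210 W if the PSU is only 75% efficient
```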

quote:
there is a reason why the minimum PSU for 8800 series is 450w

That "suggested power" is because people have ancient PSU's with minimal 12V lines. Old and cheap 600W PSU's can't even match a well designed 350W PSU (12V line amps are all that matters nowadays). It's just a safety margin for people with crappy PSUs. I've seen an FX60 + CF 2 x X1900XTX rig running just fine on a quality 380W power (power available from the 12V lines).

If a 150W Pentium D had trouble staying cool with big heatsinks, how do you think they'll cool a 270W R600? I don't think the R600 will eat much more than 150W. I could be wrong; we'll see...


RE: Wow thats impressive !!
By theteamaqua on 3/13/2007 2:36:04 PM , Rating: 1
Well, go ahead and try running an R600 with an X6800 on a 400W PSU and tell me how it goes.

Besides, you just multiplied by 0.8... without actually knowing the PSU's efficiency.

Also, try running a GeForce 8800GTX with an X6800 on a 300W PSU and tell me how well it overclocks.


RE: Wow thats impressive !!
By Hare on 3/14/2007 2:53:07 AM , Rating: 2
PSU maker PR FUD sure works great.

Check out silentpcreview.com. I know how PSUs work and I know what realistic power figures are. The 80% efficiency guess is optimistic; it could also be around 75% thanks to the 115V grid. High-end power supplies peak around 85% and usually can't exceed 80% efficiency at 50% load.

As I said, I've seen a quality 380W power supply running an FX60 + 2 x X1900XTX (OC'd) without breaking a sweat. Power consumption is a lot lower than people think. PSU makers lie just to sell those >600W PSUs.


Single Core, Dual Core, Quad Core?
By breethon on 3/12/2007 10:33:16 PM , Rating: 3
Has there been anything coded to actually take advantage of multi-core technology? I am thinking of games mostly. I haven't seen anything yet. I run a dual-core Opteron and have yet to see anything to make my purchase worth it, let alone 4 cores.




By theteamaqua on 3/12/2007 10:58:36 PM , Rating: 2
Good point. Even SupCom barely uses my 2nd core... my 2nd core is always at 20 to 30% while the 1st core is at 90 to 100%... they chose coarse-grained over fine-grained threading...
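For readers who haven't met the terms, here is a rough Python sketch of the difference (the workloads are stand-ins, not anything from SupCom). Coarse-grained threading parks whole subsystems on separate cores, so one hot subsystem still pegs a single core; fine-grained threading splits the dominant workload into chunks that every core can pull from:

```python
from concurrent.futures import ProcessPoolExecutor

def heavy_work(n: int) -> int:
    """Stand-in for CPU-bound game work (purely illustrative)."""
    return sum(i * i for i in range(n))

# Coarse-grained: one big task per subsystem. If the simulation dominates,
# the core running the audio task sits mostly idle (the uneven usage above).
def coarse_grained():
    with ProcessPoolExecutor(max_workers=2) as pool:
        sim = pool.submit(heavy_work, 5_000_000)  # the hot core
        audio = pool.submit(heavy_work, 200_000)  # the mostly idle core
        return sim.result() + audio.result()

# Fine-grained: the dominant workload is chopped into small chunks that all
# cores pull from a shared pool, so utilization evens out.
def fine_grained(chunks: int = 8):
    with ProcessPoolExecutor(max_workers=2) as pool:
        return sum(pool.map(heavy_work, [5_000_000 // chunks] * chunks))

if __name__ == "__main__":
    coarse_grained()
    fine_grained()
```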


By nurbsenvi on 3/13/2007 12:58:54 AM , Rating: 2
Try Doom 3/Quake 4. And oh, the upcoming Source engine update is set to utilize any number of CPUs; it's coming with Half-Life 2: Episode Two this American fall.


RE: Single Core, Dual Core, Quad Core?
By D4rr3n on 3/13/2007 8:41:59 AM , Rating: 3
Well, as far as multi-core support and overall performance as a whole, it depends which OS you are using and, more importantly, what programs you use. IMHO quad core is overkill for home desktop PCs, especially if running XP. The overall performance in XP with a quad core just isn't efficient enough, nor does it scale well enough, to justify its price premium over a dual-core processor. An E6600 and a QX6700 will OC to the same levels, and other than a handful of quad-core-optimized apps they will perform identically and you won't notice the difference... you will however notice the $500+ you saved.

The main performance increase for home users with multiple cores is obviously in multitasking. The difference between multitasking on a single and a dual setup is very noticeable. However, between a dual and a quad the difference is pretty negligible... even in test scenarios you would never encounter in real-world use, which just further illustrates XP's inability to use a quad efficiently. Now, I won't tell anyone not to get a quad or that it is somehow a bad processor. Quads are theoretically better for sure and potentially better for the future. If quad-core support gets big, you get a nice boost which would be unavailable with the dual. If they cost close to the same price, or if XP could get the most out of them, or if they had better app support, that would certainly be what I'd buy. Unfortunately that's not the case with XP.

If you plan on not using XP, supposedly Vista, 2003 Server, XP Pro x64, and Linux (and other alternative OSes) all have much better quad+ core support. From what I've seen that is most likely true (I would still question x64's support, though), but I cannot personally vouch for their performance either way. However, I will say this: even if these OSes do work better with quads, it still won't matter unless the programs you use fully support quads; otherwise it's the same old story as in XP.

Now, as far as gaming goes, that is probably the last place you should look for improvements with a dual or quad, and it is the last place you will find any. Quad cores and dual cores offer pretty much identical gaming performance clock for clock. Either is fine for games, but they really can't offer any improvement just by having multiple cores. To put it simply: if you max out all the possible quality settings both in-game and in the driver control panel, max out the resolution, and max out AA and AF, your video card (even an 8800GTX) will be the bottleneck in your system, PERIOD. To get to a point where the CPU is a limiting factor and any multi-core optimizations would have any effect, you have to lower those settings to a point that doesn't max out the potential of your video card. So dual/quad core improvements (at max settings) for gaming... don't hold your breath on that one, unfortunately. And anyone who wants to argue that, please note again that I am talking about running MAXIMUM POSSIBLE VIDEO SETTINGS.


RE: Single Core, Dual Core, Quad Core?
By theteamaqua on 3/13/2007 2:43:32 PM , Rating: 2
@D4rr3n

Yeah, nice read.

I agree that "an E6600 and a QX6700 will OC to the same levels, and other than a handful of quad-core-optimized apps they will perform identically and you won't notice the difference... you will however notice the $500+ you saved."

And yeah, for 3D gaming the GPU is always the most important component, period.

As for servers... well, yeah, servers are designed to take advantage of as many cores as possible.

If only Windows XP and Vista did the multithreading for the software at a lower level.


By D4rr3n on 3/13/2007 5:39:15 PM , Rating: 2
Well thanks. I actually wrote something a little bit longer (though I thought it was too long and trimmed it) about quad-core performance in servers being very good, even when serving off XP Pro. While it certainly couldn't match some specialized server software, Linux, or even 2003 Server, it was no slouch. I've even seen dual quads serving off an XP Pro box and getting nearly perfectly scaled performance with each added core. My point was that quads are more of a specialized-task CPU (by that I mean you get one for a specific purpose, most likely in a professional/business/design/server environment), and unless you are running a program designed for a quad setup in that specific type of environment, you won't notice a difference. And 99% of home users will never use those programs.

The major problem with XP is the thread handler not allocating properly or efficiently. Its performance when using a dual core improves immensely (not perfect, though, and it eventually reaches its limits). One would expect that doubling to a quad core would have a similar effect, but the improvements are very small. Now, this is multitasking performance with XP allocating the CPU cycles for each thread. When you run a single program properly designed for a quad, you will get the performance you should. So, like I said, XP on its own isn't an adequate platform for quads, especially for the standard home desktop user who thinks it may improve multitasking or... gaming.

And if someone goes out and buys an $800-900+ quad for gaming, they need to do some research or discover the meaning of priorities. For the same price or less you could get an E6600 and an 8800GTX, or two 8800GTXs for SLI, or one of Dell's 24" HD LCDs with either card or CPU, etc. There are several better ways to spend that money on an improved gaming experience that a quad won't give you.


By Hoser McMoose on 3/13/2007 9:48:55 AM , Rating: 4
Uhh, you DO realize that these are Xeon chips that are targeted at the server market, right? Many server applications will take about as many cores as you can throw at them!


Do low power CPUs over clock better?
By nurbsenvi on 3/12/2007 6:02:16 PM , Rating: 2
Do low-power CPUs overclock better than the normal ones?

How do they make low-power CPUs? Is it just a good batch of CPUs downclocked, or do they actually do something to reduce the leakage?

Sorry to ask so many questions...




RE: Do low power CPUs over clock better?
By Shintai on 3/12/2007 6:09:44 PM , Rating: 2
They don't OC better by default. They are just running at a lower voltage, sometimes using slower-switching transistors that take less power.

There was a person at the XS forum running a C2D at 0.88V at its stock speed of 2.4GHz. I doubt it used over 15-20W. So low voltage or ultra-low voltage is the key.


By yehuda on 3/12/2007 9:28:55 PM , Rating: 2
Here's the link:

http://www.xtremesystems.org/forums/showthread.php...

I wish we could hear more often about undervolting.


By D4rr3n on 3/13/2007 12:25:57 AM , Rating: 2
nurbsenvi: Obviously nothing is guaranteed in overclocking, but yes, a low-power/low-voltage version of a CPU should overclock better than the higher-power version the vast majority of the time. The low-power version creates less heat and uses a lower stock voltage (which are interrelated) to achieve the same speeds. Being stable at a lower voltage for the same speed generally means a better/more efficient CPU as well. And two of the major things that limit your maximum overclock are heat and the maximum safe voltage you can give your CPU. So if one chip can do stock speeds at a lower voltage while giving off less heat, you can generally assume that these advantages will remain as you continue to overclock. Basically you have more "wiggle room" at the top end with the low-power versions, which is why so many overclockers have chosen in the past to run low-power mobile versions of desktop CPUs in their main setups. But, like I said, nothing is guaranteed.

Yehuda: While I agree with and understand the benefits undervolting can have in various scenarios (especially passively cooled silent PCs, SFF setups, and HTPCs, for example), it can actually be quite dangerous (even more so than overvolting) depending on the type of setup you have. A64s have been known to die running low vcore and high vdimm. The combination of a low CPU voltage and a high memory voltage did not play nice with the on-die memory controller, and a lot of people lost their A64s very fast, unfortunately. This is no longer an issue with the much lower voltages used for DDR2, but if undervolting is to be discussed more, people need to be aware of the possible dangers related to certain configurations.


By Hoser McMoose on 3/13/2007 9:43:59 AM , Rating: 2
There's as much reason to suspect that it might be WORSE for overclocking, not better.

There are two tricks to lowering power consumption. The first and most important is to reduce the voltage. By running at a lower clock speed and lower voltage, the same chip will consume less power. Since power tends to go up with the square of voltage, this is the more important factor.
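A quick sketch of that relationship using the standard dynamic-power approximation P ≈ C·V²·f (switching power only, leakage ignored; the voltage and frequency numbers below are made up for illustration, not Intel specs):

```python
# Dynamic (switching) power scales roughly as C * V^2 * f.
def dynamic_power(cap: float, voltage: float, freq_ghz: float) -> float:
    return cap * voltage ** 2 * freq_ghz

baseline  = dynamic_power(cap=20.0, voltage=1.30, freq_ghz=2.33)  # hypothetical standard part
low_power = dynamic_power(cap=20.0, voltage=1.10, freq_ghz=1.86)  # hypothetical low-voltage part

print(f"{low_power / baseline:.2f}")  # ~0.57: dropping V and f together cuts power to ~57%
```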

However beyond that there are some tweaks made at the manufacturing level. Making processors is a tricky business and there are a number of adjustments that can be made in the system. Some of these adjustments can improve maximum clock speed, yields, or power consumption.

Intel (and AMD, IBM, etc.) will tweak these various parts of their manufacturing process to optimize production for various parts. One run of wafers might get tweaked for maximum clock speed and sold as high-end desktop chips. Another run might get tweaked for maximum yields to be sold as mainstream or value processors. A third run might get optimized for minimum power consumption to be sold as mobile chips and these low-power server chips.

Then, after all is said and done, it comes down to a measure of tolerances. Intel has followed AMD's lead and just specified a few "thermal design powers" for all their processors, with that number now simply being a maximum that will not be exceeded by any processor in that range. Of course, individual chips will vary. So, for example (numbers pulled purely out of my ass), out of 100 Core 2 Duo/Quad/Xeon dies at 2.6GHz, Intel might have 5 that consume between 60 and 65W of power, 20 consuming between 50 and 60W, 50 consuming between 40 and 50W, 20 that consume between 30 and 40W, and the last 5 consuming less than 30W. Now Intel could sell ALL of these chips as Core 2 Duo chips with a 65W maximum, or they could split off the best 25% or so and sell those at a premium, rated with a 40W TDP, while selling the remainder as "standard" chips. Market conditions play a big role in deciding whether this is practical/economical to do or not.
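A toy model of the binning he's describing (the die counts mirror his made-up distribution; the TDP buckets are just for illustration):

```python
from collections import Counter

# 100 hypothetical dies, matching the made-up spread in the comment above.
measured_watts = [62] * 5 + [55] * 20 + [45] * 50 + [35] * 20 + [28] * 5

def bin_by_tdp(watts: float, buckets=(40, 65)) -> int:
    """Label a die with the lowest rated TDP it fits under."""
    for tdp in buckets:
        if watts <= tdp:
            return tdp
    raise ValueError("die exceeds every rated TDP")

print(Counter(bin_by_tdp(w) for w in measured_watts))
# Counter({65: 75, 40: 25}) -> the best ~25% could be sold as premium 40W parts
```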


They should make 12.5 watt single core
By electriple9 on 3/12/2007 8:03:37 PM , Rating: 1
They should make 12.5-watt single-core CPUs and 25-watt dual cores. That would totally take VIA and other low-power CPUs away. I feel bad for AMD, even though I am an AMD fan.
Thanks




RE: They should make 12.5 watt single core
By jeffbui on 3/12/2007 9:53:48 PM , Rating: 2
You realize they make single/dual-core low-voltage CPUs, right? The Core Solo U1500 is at 5.5 watts, the L-series dual cores are at 15W, and the U-series is at 9 watts.


RE: They should make 12.5 watt single core
By electriple9 on 3/13/2007 1:37:45 AM , Rating: 2
But they do not make these CPUs for the average customer; they are usually soldered on. They should make a Socket 775 CPU at low wattage.
Thanks


RE: They should make 12.5 watt single core
By Shintai on 3/13/2007 5:50:22 AM , Rating: 2
Soldered on? Where? They are normal mobile Socket 478/479 parts, and you can even get desktop boards that support these mobile CPUs.


By Zandros on 3/13/2007 11:43:17 AM , Rating: 2
Please.

ULV/LV Core/Core 2 chips are packaged in flip-chip ball grid arrays, which means they are soldered to the motherboard.


Sweet.
By Mitch101 on 3/12/2007 6:21:23 PM , Rating: 1
Now if only we could get the video cards down in power consumption.




RE: Sweet.
By Shintai on 3/12/07, Rating: 0
RE: Sweet.
By KaiserCSS on 3/12/2007 7:00:09 PM , Rating: 1
Ignorance must be bliss.


RE: Sweet.
By AntiV6 on 3/13/2007 1:41:57 AM , Rating: 2
Why is everyone complaining about the "extreme" energy consumption of modern GPUs? People want more performance, and that is what Nvidia/DAAMIT give us. Won't the extra juice that the GPUs consume only cost a few dollars more per month?

And if so, the people who own them should be able to afford it...


RE: Sweet.
By TSS on 3/13/2007 4:20:51 AM , Rating: 2
It's not so much the juice as the heat.

My ATI X1900XT at stock speeds runs 90-100 degrees. That's CELSIUS, not Fahrenheit. I replaced the stock cooling block with a Zalman (the VF500, I believe, the one before that heatpipe model) and it mattered, uhm, almost nothing; it just cools down a lot quicker but still runs hot. Not to mention you could start building some heavy-duty muscle just by lifting the stock cooler a couple of times. And now the new ATI card will be bigger, with a larger cooler, and oh! If my card runs 90 stock, how hot will a card run when you up the wattage by, what was it, 40% or something? (Can't find how much the X1900XT uses.)


and in other news
By Samus on 3/13/2007 6:02:48 AM , Rating: 3
Hector Ruiz shit his pants as he walked into the office this morning upon hearing the breaking news.




Intel news???
By crystal clear on 3/13/2007 2:45:31 AM , Rating: 1
Translation of the Chinese version:

According to sources from Taiwanese manufacturers, Intel recently unveiled its updated price list for Q3 processor products. It shows that the Intel Core 2 Quad Q6600, priced at $851 per 1,000 units shipped, will see a cut of roughly 70%, making quad core commonplace.

In the existing Q2 price-update plan, the Intel Core 2 Quad Q6600 is already set for a large cut from $851 per 1,000 units shipped to $530 on April 22. To compete effectively against rival AMD's new processors, Intel has reluctantly given up the Q3 release of the Core 2 Quad Q6400 (2.13GHz/4MB L2 x 2/1066MHz FSB). Instead, the company decided to make another cut to the Core 2 Quad Q6600, bringing its price as low as $266 per 1,000 units shipped, meaning this processor would see a cumulative price cut of 68.7% since its release in March.

In addition, in order to fill the gap left by the Core 2 Quad Q6600, the Core 2 Quad Q6700 (2.66GHz/4MB x 2 L2 Cache/1066MHz FSB) will be released, priced at $530 per 1,000 units shipped.

Another point that deserves our attention is that Intel will no longer release any dual-core Intel Core 2 Extreme editions. The Intel Core 2 Extreme QX6800 (2.93GHz/4MB L2 x 2/1066MHz FSB) processor released in Q3 will be the last dual core of this series. The Core 2 Extreme QX6800 will be priced at $999 per 1,000 units shipped, replacing the original segment of the Core 2 Extreme QX6700. On the other hand, although Intel's next-generation dual cores would have an FSB enhanced to 1333MHz, the next-generation quad-core Yorkfield would have a 1333MHz FSB only in the Extreme edition, while the others would remain at 1066MHz.

Source: HKEPC

Note this:

1) "Intel would no longer release any Dual-Core Intel Core 2 Extreme edition. Intel Core 2 Extreme QX6800 (2.93GHz/4MB L2 x 2/1066MHz FSB) processor released in Q3 will be the last Dual Core of this series."

2) Meaning this processor (the Core 2 Quad Q6600) would see a cumulative price cut of 68.7% since its release in March.
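For reference, the cumulative-cut figure in the translation checks out:

```python
# Cumulative price cut for the Q6600 quoted above.
launch_price = 851  # USD per 1,000 units at the March launch
q3_price = 266      # USD per 1,000 units after the planned Q3 cut

print(f"{(launch_price - q3_price) / launch_price:.1%}")  # 68.7%
```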




"I want people to see my movies in the best formats possible. For [Paramount] to deny people who have Blu-ray sucks!" -- Movie Director Michael Bay

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki