

The new Westmere-EX CPU will bring 10 cores to a single server socket.  (Source: Anandtech)

Intel will keep the CORE-ix brand names for its upcoming "Sandy Bridge" architecture redesign.  (Source: Anandtech)

"Sandy Bridge" will use a ring bus to allow the on-chip cores and media units (including the on-die GPU) to access the cache.  (Source: Anandtech)
Chipmaker doesn't reveal launch date for the Westmere-EX

Intel likely today set those looking to deploy a high-performance single socket server solution salivating with its unveiling of the Westmere-EX.  Following the Gulftown lineup -- which trickled out starting in March 2010 -- the Westmere-EX is Intel's latest 32 nm Westmere chip.

Westmere is very similar to the Nehalem 45 nm architecture, meaning it's a "tick" design -- not a major redesign.  That's not to say there isn't plenty to be excited about here, though.  Intel is making good use of the die space saved by the shrink: the Westmere-EX packs an incredible 10 cores into a single socket package, which with Hyper-Threading adds up to a total of 20 threads.

For the supercomputing-minded, the new chip bumps the amount of usable memory from 1TB (64 DIMM slots) to 2TB.  There's no official word on the name of the processor -- past Gulftown server designs fell into the Xeon 3600 and 5600 series.  Also not revealed are clock speeds and a launch date.
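As a quick back-of-the-envelope check on those figures (a sketch only: the 64-DIMM count is implied by the article, and the per-DIMM sizes below are inferred, not Intel-confirmed):

```python
# Hypothetical arithmetic, not an announced spec: the per-DIMM capacity
# each memory ceiling implies if the platform keeps 64 DIMM slots.
dimms = 64
for total_tb in (1, 2):
    per_dimm_gb = total_tb * 1024 / dimms
    print(f"{total_tb} TB across {dimms} DIMMs -> {per_dimm_gb:.0f} GB per DIMM")
# -> 16 GB DIMMs for the 1 TB ceiling, 32 GB DIMMs for 2 TB
```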

Perhaps more exciting were the new details Intel revealed about its upcoming "tock" (architecture redesign), code-named Sandy Bridge.  The upcoming 32 nm architecture will feature a ring design for last-level cache access.  The cache will be accessible by the on-chip 3D graphics processing unit, the four (or potentially more) CPU cores, and the media processing unit.  The ring bus is designed to deliver high bandwidth to the various units connected to it, as the sketch below illustrates.
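Intel hasn't published the ring's mechanics, but the latency intuition can be shown with a toy model. Everything below -- the stop count and the bidirectional routing -- is an illustrative assumption, not Sandy Bridge's actual design:

```python
# Toy model of a bidirectional ring interconnect: a request travels hop by
# hop to the cache slice that owns the address, so access latency scales
# with the number of ring stops between requester and target.
def ring_hops(src: int, dst: int, stops: int) -> int:
    """Minimum hops between two ring stops when traffic may go either way."""
    d = abs(src - dst) % stops
    return min(d, stops - d)

STOPS = 7  # hypothetical: 4 cores + GPU + media unit + system agent
hops = [ring_hops(0, d, STOPS) for d in range(1, STOPS)]
avg = sum(hops) / len(hops)
print(f"worst case: {max(hops)} hops; average: {avg:.2f} hops")
```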

The processor will also feature the return of Turbo Boost mode, which automatically raises clock speeds when thermal headroom allows -- in effect, easy, built-in overclocking of Intel's processors.

Sandy Bridge PC processors will keep the CORE-i3, i5, and i7 designations, rebranded as the "new CORE-i3" and so on.  That approach is likely to create confusion among customers about exactly what they're buying, given that the average user likely couldn't tell a Nehalem i7 from a Westmere i7 or a Sandy Bridge i7.

On a more positive note, though, AnandTech is reporting that the media processing unit will include dedicated video transcode hardware.  In a demo, that hardware crunched a roughly one-minute 30 Mbps 1080p HD video clip down to an iPhone-compatible format in under 10 seconds.  The transcode hardware can be viewed as Intel's attempt to fend off NVIDIA's GPU computing push into the consumer market.

GPU computing is a hot new field -- it centers on the notion that dedicated, massively parallel video hardware can outperform CPUs at a number of tasks, including chemical simulations, video encoding, physics simulations, and more.
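The pattern is easiest to see in code. The sketch below is not GPU code; it uses CPU worker processes as a stand-in for the data-parallel model, where thousands of GPU threads would each take one element (the per-element task is a made-up placeholder):

```python
# The data-parallel pattern behind GPU computing: one independent
# operation applied across many elements. multiprocessing.Pool stands in
# for the GPU's thread grid; simulate_particle is a placeholder workload.
from multiprocessing import Pool

def simulate_particle(seed: int) -> float:
    x = seed
    for _ in range(10_000):                   # fake per-element physics step
        x = (x * 1103515245 + 12345) % 2**31  # simple LCG as busywork
    return x / 2**31

if __name__ == "__main__":
    with Pool() as pool:                      # one worker per CPU core
        results = pool.map(simulate_particle, range(1_000))
    print("mean:", sum(results) / len(results))
```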



Comments

not that Im complainging but
By shin0bi272 on 9/13/2010 6:54:12 PM , Rating: 3
I can't believe I'm saying it, but who needs this much power in their home PC? Servers I can see, sure, but home gaming won't see much of a benefit from 10 cores / 20 threads for years.




RE: not that Im complainging but
By thrust2night on 9/13/2010 7:17:25 PM , Rating: 2
You're right. But the sale of such processors to millions of home users will result in more incentive for Intel (and its competition) to continue innovating and moving technology forward. Console hardware shows that we don't need modern GPUs or CPUs to play games but that doesn't mean Intel should simply stop R&D. These CPUs can be used by companies as well. Imagine the effects it will have on the economy if Intel and AMD only refreshed their processors once or twice a year.


RE: not that Im complainging but
By thrust2night on 9/13/2010 7:18:50 PM , Rating: 2
I meant.... once or twice a decade :)


RE: not that Im complainging but
By Cypherdude1 on 9/13/2010 10:56:45 PM , Rating: 2
quote:
I can't believe I'm saying it, but who needs this much power in their home PC? Servers I can see, sure, but home gaming won't see much of a benefit from 10 cores / 20 threads for years.
I agree. Most software is not optimized to run on more than 4 cores. Even with the latest OSes, which have better multi-core management, most users don't run enough apps to keep a 10-core CPU busy.

That being said, a 6-core Fusion-type CPU would be practical if it had 4 CPU cores and 2 GPU cores. The problem is that manufacturers tend to cut costs when they sell integrated products: the CPU manufacturer would use cheaper CPU and GPU cores, resulting in less performance.

Moreover, the true limiting factor would be with the motherboard manufacturer. The CPU manufacturers would be completely dependent on Asus to manufacture a motherboard which has all the correct external hardware connections. The motherboard is where all the connections would be made. The motherboard would need at least 2 DVI connectors and at least 1 HDMI connector.

Manufacturers usually reduce features for integrated products because they want to decrease costs and decrease their selling price to attract more consumers.


RE: not that Im complainging but
By theapparition on 9/14/2010 9:55:44 AM , Rating: 3
quote:
Moreover, the true limiting factor would be with the motherboard manufacturer. The CPU manufacturers would be completely dependent on Asus to manufacture a motherboard which has all the correct external hardware connections.

Not a problem for Intel, since they sell plenty of motherboards. If other manufacturers want to sell competitive offerings, they will make their products feature-rich as well. Or if a consumer doesn't want those features, get the cheaper model. Pretty simple.

I think the point you bring up is pretty much irrelevant.


RE: not that Im complainging but
By Lerianis on 9/19/2010 1:16:44 PM , Rating: 2
Not really irrelevant, seeing as how a lot of people don't trust Intel-made motherboards. I personally don't; I only trust non-Intel-made motherboards.


RE: not that Im complainging but
By MrFord on 9/14/2010 11:59:43 AM , Rating: 2
If pretty much any AMD 785/890GX board features VGA/DVI/HDMI, I'm sure that wouldn't be a problem at this point.


By Cypherdude1 on 9/15/2010 5:33:54 AM , Rating: 2
quote:
If pretty much any AMD 785/890GX board features VGA/DVI/HDMI, I'm sure that wouldn't be a problem at this point.
A decent integrated motherboard would require at least 2 DVI and 1 HDMI. Moreover, Intel has never really had high-performance integrated graphics. For Intel to produce an integrated 6- or 8-core (4 CPU + 4 GPU core) CPU with high-performance graphics, Intel would need to buy out nVidia.

Because AMD's CPUs are nearly as good as Intel's, and AMD purchased ATI with its high-performance graphics, AMD actually has the advantage here. An integrated AMD CPU with 6 or 8 high-performance cores would actually be a better product than Intel's. As I mentioned before, Asus would need to produce a fully featured integrated motherboard with 2 DVI + 1 HDMI connectors and not skimp. Furthermore, AMD would need to include both high-performance CPU and GPU cores and also not skimp. All of these conditions would need to hold to get a high-performance system, and I highly doubt this will happen, at least not in the foreseeable future.


RE: not that Im complainging but
By FITCamaro on 9/13/2010 7:41:10 PM , Rating: 3
Most companies are doing just fine with dual-core CPUs. Unless you're working with CAD, audio encoding, video encoding, etc., a Core 2 is more than enough. This is what the PCs at my office use.

For my machine I just need a decent graphics card. Which I don't have.....


RE: not that Im complainging but
By hughlle on 9/14/2010 3:42:58 AM , Rating: 4
What do you mean, console hardware shows us we don't need modern GPUs and CPUs??? Consoles look and play like utter crap.


RE: not that Im complainging but
By Lerianis on 9/19/2010 1:20:43 PM , Rating: 2
Not really. Sure, compared to an EXTREME gaming PC playing the same game, the console version looks like crap.

However, for the average user, who has never seen that 'extreme gaming PC' video quality? A console looks pretty good.

Heck, I was playing an XBox 360 in Best Buy the other day while waiting for my parents to finish buying something..... I was shocked at the video quality.

Comparable to my gaming class PC (albeit that my PC is 2 years old).

It was OTHER things that turned me off the console: easily scratched discs and no way to back them up, having to have the easily scratched disc in the drive all the time, etc.


RE: not that Im complainging but
By invidious on 9/14/2010 2:06:12 PM , Rating: 5
quote:
Console hardware shows that we don't need modern GPUs or CPUs to play games but that doesn't mean Intel should simply stop R&D
WTH are you talking about? Console graphics are horrendous compared to PC graphics. If you can't tell the difference you are truly living in the dark.

The low-resolution textures, lack of AA and AS, and super-short mapping radius are glaringly obvious whenever I play a PS3 or 360. Sure, I get used to it after a while, but that doesn't make it OK.


RE: not that Im complainging but
By Phoque on 9/15/2010 8:01:58 PM , Rating: 2
quote:
Console graphics are horrendous compared to PC graphics. If you can't tell the difference you are truly living in the dark.


Are you illiterate? Or perhaps slightly stupid? The guy said:

quote:
Console hardware shows that we don't need modern GPUs or CPUs to play games but that doesn't mean Intel should simply stop R&D


And even though I agree to some extent with what you say, I disagree with your suggestion that thrust2night doesn't know what he's talking about, because it is true that current-gen consoles demonstrate it doesn't take current hardware to play awesome games. F*** man, just look at how well the Wii did with an outdated CPU & GPU at launch.


RE: not that Im complainging but
By Lerianis on 9/16/2010 9:03:29 AM , Rating: 2
I have to disagree with that statement, thrust2night. The fact is that console hardware has SERIOUS problems with multiple people on the screen at one time at high resolutions and with the latest 'eye-candy'.

Those PC quality 'super-GPU's' are still necessary for good gaming for someone even like me, who doesn't care about eye-candy but wants more than 30fps framerates.


RE: not that Im complainging but
By ekv on 9/13/2010 7:32:27 PM , Rating: 3
Depends on what you use your PC for. Granted, if it's just word processing, stick with your Core 2 or Q9650, etc. If you have a side business where you're Photoshopping or making video games, then you'll want some more horsepower.

Home gaming can actually put 10 cores and 20 threads to use. The Xbox 360 already has multiple hardware threads, and it is quite convenient. It does take some thought to use the parallelism, and that is why it is taking a couple of "years" to take full advantage of the hardware.

Besides, I love b*tchin' hardware. Can't get enough 8)


RE: not that Im complainging but
By dark matter on 9/14/2010 2:27:01 AM , Rating: 3
Photoshopping? Really? Video, yes, but for Photoshop you do not need a 10-core machine. I work with RAW files and, trust me, my Core 2 Duo still handles them rather well. A quad would be nice, but 10 cores? The difference would be negligible.

The only application for 10 cores would be a server (virtualisation). I would imagine GPUs would be better suited for video rendering and 3D rendering.

Don't buy into the Intel hype. 10 Cores on a home machine? Absolutely no need. Seriously.



By piroroadkill on 9/14/2010 3:52:04 AM , Rating: 3
Yeah, this is pretty much true. I can't possibly justify upgrading my CPU, because I can still play the latest games fine, and it does everything I need to, with good speed (Core 2 Duo @ 3.4GHz). The only upgrade I think that would make a decent difference would be an SSD.


By piroroadkill on 9/14/2010 3:53:47 AM , Rating: 1
Also, not to shit on your chips, but Photoshop can consume resources if you require it to. Creating a large, large canvas (20000x30000, for example) for drawing, will absorb resources as you fill it, and to be honest, the limiting factor then seems to be HDD speed as it spews data onto the disk


RE: not that Im complainging but
By ekv on 9/14/2010 4:47:02 AM , Rating: 3
quote:
Photoshopping? Really?
If you have a couple thousand large RAW files to process at a time, yeah, it helps to have some more CPU. Of course, that also would require Adobe re-writing their software to utilize that kind of parallelism ... I'm not holding my breath.

My i7 does well, but there are times when I max it out ... and there are a couple more things I'd like to get done. Not often, but it would be nice to have the capability. [Competition being a good thing as far as consumer pricing is concerned].

I'm not hyping Intel. In fact, I think the SB GPU was somewhat crippled. AMD may be able to capitalize, but we shall see.
quote:
10 Cores on a home machine?
If man were meant to fly... 640K ought to be enough memory for anybody ... 3 or 4 IBM mainframes ought to serve the world's needs ... and so on 8)


RE: not that Im complainging but
By saganhill on 9/14/2010 7:54:04 AM , Rating: 3
I disagree 100%. I rip videos and can tell you that an i7 CPU gets faster results. Photoshop (64-bit) gets faster results. All other computing gets faster results... And it's not "negligible".

And saying there is "absolutely no need" for faster CPUs and computers in general is like saying no one will ever need more than 640KB of memory in any computer. The arrogance of such a statement will ALWAYS fail and history will ALWAYS prove you wrong. Seriously...


RE: not that Im complainging but
By FITCamaro on 9/14/2010 8:12:40 AM , Rating: 2
The point is that 90% of people have absolutely no need for 10-core CPUs. Or really even quad-cores. The only thing driving this stuff is power users and games.


RE: not that Im complainging but
By Zarsky on 9/14/2010 1:42:47 PM , Rating: 3
That's why they still sell dual cores!

But as the tech goes further, and more and more people have 2 or more cores, that's when developers are starting to optimize their software for more cores.


RE: not that Im complainging but
By YashBudini on 9/21/2010 11:22:47 PM , Rating: 2
quote:
The point is that 90% of people have absolutely no need for 10-core CPUs. Or really even quad-cores. The only thing driving this stuff is power users and games.

The same could be said of your childish car fetish.


RE: not that Im complainging but
By DaveSylvia on 9/14/2010 9:50:45 AM , Rating: 2
There was a time when a CPU struggled just to play an MP3. Soon after, it was awesome that we could rip a song off a CD into an MP3 - only it took 30 minutes per song.

Now we can rip a whole CD in less than 10 min!

Until both ATI and nVidia agree on a standard for GPGPU, video encoding and ripping on the GPU will still be somewhat immature.

A 10-core beast would plow through a video encode process - with today's software. This will be useful for 90% of the people - just not right now. You have to start somewhere! This will trickle down and that won't happen unless it comes out at the high-end first.

There will absolutely be a need for this - don't be so short-sighted.


RE: not that Im complainging but
By Nutzo on 9/14/2010 11:10:26 AM , Rating: 2
I have yet to find video compression software that will make full use of my quad-core i7 860 @ 3.5GHz.
Most packages only end up pushing it to 25%-35%, using only 2-3 cores. It's not I/O bound either, as I can actually run 2 video compression tasks at the same time with only a 5% drop in speed on each task.
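That workaround is easy to script: when one encoder saturates only a few cores, launch independent jobs side by side. A minimal sketch, assuming ffmpeg is installed and using placeholder file names:

```python
# Run two independent encode jobs concurrently so an encoder that only
# loads 2-3 cores per job can still keep a many-core CPU busy.
# Assumes ffmpeg is on PATH; input/output names are placeholders.
import subprocess

jobs = [
    ["ffmpeg", "-y", "-i", "clip1.mpg", "-c:v", "libx264", "out1.mp4"],
    ["ffmpeg", "-y", "-i", "clip2.mpg", "-c:v", "libx264", "out2.mp4"],
]
procs = [subprocess.Popen(cmd) for cmd in jobs]  # start both at once
for p in procs:
    p.wait()                                     # wait for each to finish
```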


By theapparition on 9/14/2010 10:02:05 AM , Rating: 5
quote:
Don't buy into the Intel hype. 10 Cores on a home machine? Absolutely no need. Seriously.

Sigh.
Did IQs just drop sharply when I was away? </movie reference>

Where in your mind did you ever get the idea that Intel was marketing this for a home computer? It's a server/workstation chip, and will be marketed and priced accordingly.


RE: not that Im complainging but
By EricMartello on 9/14/10, Rating: -1
RE: not that Im complainging but
By ianweck on 9/14/2010 4:32:12 PM , Rating: 4
Wow! I've seen a lot of fanboys on these forums, but I've never seen a multi-core fanboy before.


RE: not that Im complainging but
By Ammohunt on 9/14/2010 2:55:25 PM , Rating: 3
quote:
The only application for 10 cores would be a server (virtualisation). I would imagine GPUs would be better suited for video rendering and 3D rendering.


You said it: virtualiZation. People automatically think large server farms, and cloud computing to a lesser extent. Think bigger: how about running a hypervisor on the bare metal of your ten-core system while running multiple virtualized applications and operating systems from different architectures? I could seamlessly switch to a GNOME application running on an Ubuntu slice from a Windows desktop. The options could be limitless.


RE: not that Im complainging but
By Nutzo on 9/14/2010 11:03:07 AM , Rating: 2
I agree to a certain extent.
I find the i3/i5 dual cores to be plenty fast: i3 for home/basic office, and a higher-end i5 in laptops and desktops that need to run the occasional VM.

As for servers, dual quad-cores are a must, as they allow me to have several virtualized application servers instead of trying to install everything on a single OS.
I find it more cost effective to buy dual lower-end quad-cores than a single high-end quad.


RE: not that Im complainging but
By invidious on 9/14/2010 2:16:36 PM , Rating: 2
Very few computer games currently support 4 cores, and those that do don't use them efficiently. The best Core 2 Duo runs games better than the best Core 2 Quad. Sure, the four-core i7s may be faster than Core 2s, but they are a completely different architecture and cost far more.


RE: not that Im complainging but
By Lifted on 9/13/2010 7:42:19 PM , Rating: 5
If it wasn't obvious, it's a server CPU. It's not needed for average home PC users, and they won't be buying it.


By drunkenmastermind on 9/13/2010 9:17:20 PM , Rating: 1
My mate has dual 5560s, giving him 16 threads, and he loves that shit! http://www.youtube.com/watch?v=quLfDLROtts


RE: not that Im complainging but
By XSpeedracerX on 9/13/10, Rating: -1
RE: not that Im complainging but
By dark matter on 9/14/10, Rating: -1
By XSpeedracerX on 9/16/2010 2:21:52 AM , Rating: 1
Well, that's how it is. What do you think the two-year smartphone upgrade cycle is about? Do you really think your smartphone will be dead in only two years flat, even when my Nokia dumbphone lasted ten?

Someone able to dump $3-4k into their winbox is someone with disposable income who likely won't settle for "good enough". The parts don't even have to feel sluggish -- once they aren't top of the line anymore, or get trounced in the latest benchmark, they're history, useful or not. Yes, you could say that this person is rather spoiled. No, there's nothing you can do to stop them.


RE: not that Im complainging but
By lamerz4391 on 9/14/10, Rating: 0
RE: not that Im complainging but
By XSpeedracerX on 9/16/2010 2:15:32 AM , Rating: 2
Yeah, the cluelessness is pretty astounding. On YOUR part.

No one was talking about the enterprise world, so you can take your strawman back to the bale of hay you pulled him out of. Also, unless your press release is coming out of IBM, Sun, or SGI, your enterprise CPU announcement is very much about the 'xtreme gaming' market, since a variant of it will be marketed and sold there.


RE: not that Im complainging but
By FaceMaster on 9/21/2010 6:59:58 PM , Rating: 2
I, for one, need more than 1 TB of hard drive space. You clearly haven't discovered the delights of 1080p porn. Just because YOU don't want the world to progress, it doesn't mean that others feel the same way. Why should we have to wait for videos to render? Why should we have to tap our fingers as we wait for the desktop to load? Why should we be restricted to SD, fuzzy porn when there are better alternatives out there? I'm an Xtreme gamer myself, and I DEMAND 1000 fps solid on Solitaire, or else I can't possibly expect to be the best in the world at it. 6 core processors just don't do it for me any more. And as for my phone, I love not being able to see the pixels. It makes the girls look like they're actually there, in the palm of my hand. Go back to your 8 bit wankfests, you Windows 3.1 addict. Wait, you're probably more of an Apple fan if you don't care (or simply don't know) about the faster, better alternatives out there.


RE: not that Im complainging but
By YashBudini on 9/21/2010 11:20:01 PM , Rating: 2
quote:
You clearly haven't discovered the delights of 1080p porn.

Oh I'm sure there's nothing quite so arousing as seeing herpes sores in excruciating detail.

Magazines don't airbrush their photos for the fun of it.


RE: not that Im complainging but
By hifiaudio2 on 9/13/2010 8:07:45 PM , Rating: 2
FYI, the "-EX" processors, Nehalem-EX and Westmere-EX, are not only server processors but 4-socket server processors... i.e., the largest x86 servers. They are currently at 8 cores max per processor from Intel, so moving to 10 cores isn't much of a surprise.


RE: not that Im complainging but
By Silver2k7 on 9/14/2010 2:56:20 AM , Rating: 4
"They are currently at 8 cores max per processor from Intel"

But AMD does have its 12-core, doesn't it?


RE: not that Im complainging but
By kjboughton on 9/13/2010 8:53:45 PM , Rating: 5
I do, for encoding video.

Smarty pants.


RE: not that Im complainging but
By kake on 9/14/2010 1:33:53 AM , Rating: 2
Exactly. This much power is why you can watch almost any TV show in very good HD less than two hours after it airs. Without a TV.


RE: not that Im complainging but
By TheRequiem on 9/13/2010 9:23:10 PM , Rating: 2
I'm not quite sure I agree with this assessment. We can always use faster processors. While 10 cores is obviously for servers, and the software in place for those systems is geared toward such configurations rather than gaming or home use, what matters here is the architectural design of these processors. They are innovative in that they introduce new instructions and abilities, yes, but they are important because they will greatly speed up the software and games we use today. Better hardware, even if it is a few years ahead of software, is still good because it leads the way to better and more efficient programming.


RE: not that Im complainging but
By Silver2k7 on 9/14/2010 3:00:32 AM , Rating: 3
Server hardware usually trickles down.

We didn't need 32-bit or 64-bit at first in the home either, but now we do.


RE: not that Im complainging but
By FITCamaro on 9/14/2010 8:15:31 AM , Rating: 2
64-bit came about because of memory address space limits, not really the need for faster CPUs.
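For reference, the address-space arithmetic behind that point:

```python
# A 32-bit pointer can name at most 2**32 bytes -- the 4 GiB ceiling
# desktops were hitting -- while 64-bit pointers raise it to 16 EiB.
print(2**32 // 2**30, "GiB addressable with 32-bit pointers")  # 4 GiB
print(2**64 // 2**60, "EiB addressable with 64-bit pointers")  # 16 EiB
```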


RE: not that Im complainging but
By lamerz4391 on 9/14/2010 11:56:35 AM , Rating: 2
Why the hell was that downranked? It's true.


RE: not that Im complainging but
By TSS on 9/13/2010 11:34:00 PM , Rating: 5
The poor bastards who are still running norton anti virus.


By retrospooty on 9/14/2010 8:13:18 AM , Rating: 3
LOL +1


RE: not that Im complainging but
By ncage on 9/14/2010 12:39:35 AM , Rating: 2
You're definitely right for most people; 99% of people out there don't need it. I'm one of the few who do. I'm a programmer by trade, constantly running my main machine plus 2-3 virtual machines; I have 24 GB of RAM and am usually taxing my i7 920 OC'd @ 4GHz. It almost sucks when I use my laptop :).


By marvdmartian on 9/14/2010 8:59:37 AM , Rating: 2
They need a picture of Dr Evil!

One MILLION cores!!! ;)


RE: not that Im complainging but
By Proxes on 9/14/2010 10:11:04 AM , Rating: 2
People have said this crap over and over again for years. Hardware will always be ahead of software. Don't worry, it'll catch up.

Soon a Core 2 Duo won't be able to handle the latest OS. Just like a Pentium 3 and most Pentium 4s can't handle Windows 7. I distinctly remember people saying back in those days that we would never need more than a GHz at home.

It's people like this who would have us all still running 8088s.


RE: not that Im complainging but
By ianweck on 9/14/2010 4:34:30 PM , Rating: 3
quote:
Just like a Pentium 3 and most Pentium 4s can't handle Windows 7


What is this nonsense? My Pentium 4 runs Windows 7 just fine.


RE: not that Im complainging but
By sleepeeg3 on 9/14/2010 2:28:56 PM , Rating: 2
Most gaming won't see a benefit from *2* cores, let alone 10!

Intel can continue to pump out higher and higher multi-core chips, but until software catches up, I am sitting on what I have now.


RE: not that Im complainging but
By e36Jeff on 9/14/2010 10:17:25 PM , Rating: 2
Last I checked, the EX line -- which is what they're talking about here -- is made up of server chips, not desktop ones.


By Reclaimer77 on 9/15/2010 6:03:51 PM , Rating: 2
Where did Intel say this was for "the home" though?


By LeBeourfCurtaine on 9/16/2010 4:25:36 PM , Rating: 2
*cough* Crysis */cough*


RE: not that Im complainging but
By RuptureX on 9/16/2010 7:03:24 PM , Rating: 2
I understand that you are not complaining about it... but I think it's important to develop new technologies even if 90% of the population has no need for them. At least it gives people a choice: if someone wants to buy it, they buy it... and when the day arrives that we really need this technology, it will be there for us to use.
I can see this being used in the first instance in military and high-tech industrial robots and machines, where there may be a need to process a big volume of data as fast as possible. Respectfully, RuptureX.


RE: not that Im complainging but
By talonvor on 9/17/2010 7:41:12 PM , Rating: 2
It's not a matter of who needs it. I want it, regardless of whether or not I can completely use it at the moment. Just like people said there was no need for four cores back when they came out, the software will eventually be developed to take advantage of those cores.

Personally, I am excited to see how the extra horsepower can be used in both hardware and software applications. Things like AI, robotics, and many other applications can take advantage of the cores.


RE: not that Im complainging but
By Lerianis on 9/19/2010 1:13:31 PM , Rating: 2
Only someone running a VERY, VERY CPU-intensive game, to be honest. Heck, my gaming laptop isn't limited by the CPU... it's limited by the GPU and the heat problems of the chip!

4 cores is more than enough for most consumer applications, excepting video encoding.


RE: not that Im complainging but
By plonk420 on 9/23/2010 1:37:24 AM , Rating: 2
you people showing up to say "NOBODY WILL NEED THIS KIND OF POWER!!!! EVAR!!!!11one" are like people under the poverty line showing up to whine about BMWs or Benzes having more power or gadgets than "the average person needs". :|

those who can use the power (or who just have too much money to waste) are reading this. please move along, now.


It is not about need but "wants"....
By fteoath64 on 9/14/2010 5:01:32 AM , Rating: 2
This is a classic tactic of Intel driving the market to where it wants, as AMD seems to be behind by a comfortable margin in terms of raw performance. Here, Intel can maintain premium pricing for as long as it wants before dropping prices or introducing lower-end models.

So my question is: why 10 cores, not 12? Is there an architectural limitation on cache bandwidth or memory bandwidth? It's as if they jumped from 6 cores to 10, bypassing 8.

Sure, most people do not need such performance, but a lot will upgrade nonetheless because it might make sense. Some might want the SATA 6G or USB3 features that will come with new boards; we can see that Socket 1156 is being replaced already, so those older CPUs will become scarce over time.

The high-tech hardware business is about pushing the market to where the manufacturer sees it going and convincing the customers to go there. Simple as that! It is way more difficult to do this in software.




By FaceMaster on 9/14/2010 7:12:30 AM , Rating: 2
Yeah, but it's DOUBLE DIGITS!

I'm still waiting on the 101 or 99 core processor, though I'd settle for 69.

And is it only my head that explodes when reading

'Intel likely today set those looking to deploy a high-performance single socket server solution salivating with its unveiling of the Westmere-EX.' ?


By lamerz4391 on 9/14/2010 11:54:59 AM , Rating: 2
Bahahaha! This is about "wants"? Um, you have no idea what you are talking about.

The EX series CPUs, of which the announced 10-core Westmere-EX is a part, are about high-end enterprise computing. It's about Intel stealing customers from high-end UNIX competitors running systems costing in the hundreds of thousands to millions of dollars. This is about enterprise workloads that require all the CPU power you can throw at them and still run for hours or days. It's also about virtualization and consolidation of servers. Your comment is very ill-informed, focusing on the consumer space, which this announcement of the Westmere-EX has no part of.


Price?
By MikhailT on 9/13/2010 6:47:17 PM , Rating: 2
Any idea what the price would be?

I'm curious about getting this for a home virtual lab.




RE: Price?
By lamerz4391 on 9/14/2010 11:59:16 AM , Rating: 2
North of $1000, based on Nehalem-EX pricing. These are for quad-socket systems. If you want more cores for a home lab, you're probably better off looking at 6- or 12-core Opterons. Much cheaper, though less powerful.


SAllliiivaaaatteee
By DoeBoy on 9/13/2010 6:47:37 PM , Rating: 2
:O````````````````````` I can't wait to see what the benchmarks are for this sucker.




"New" name?
By chagrinnin on 9/13/2010 7:14:10 PM , Rating: 2
Really? I'm thinking marketing ripped you guys off. :P




gpu compute
By skiboysteve on 9/13/2010 9:35:32 PM , Rating: 2
So why put the blurb at the bottom about GPU computing when Sandy Bridge doesn't support it?




Not for me...
By Landiepete on 9/14/10, Rating: 0
RE: Not for me...
By theapparition on 9/14/2010 10:05:07 AM , Rating: 2
Someone got their browser tabs confused.


By khideyoshi on 9/14/2010 11:39:07 AM , Rating: 2
To those saying home users would not need this power plus the new, powerful Sandy Bridge: I would say you are all wrong; there are many applications that can make use of all that extra (mostly idling) processor power.

Go and look at the broader world out there:

www.worldcommunitygrid.org OR Folding@home

Make your new hardware purchase a useful one for humanity while you enjoy watching HD videos, transcoding, browsing, working & gaming!!




By TxJeepers on 9/16/2010 10:23:39 AM , Rating: 2
I know, I know... number of cores, more processing per cycle, etc. BUT are AMD and Intel doing anything about the clock speed boundaries they've hit? I still want a factory 4-5GHz processor. I've got Crysis 2 coming out soon, and damn it, if that is not reason enough I don't know what is!




By Pessimism on 9/15/2010 9:24:29 AM , Rating: 2
Agreed


By kyleb2112 on 9/16/2010 4:23:52 AM , Rating: 2
Conspiracies!
The shortcut stupid people take to feel smart.


By mlmiller1 on 9/17/2010 6:21:01 AM , Rating: 2
I am borrowing that quote!


By YashBudini on 9/21/2010 11:15:50 PM , Rating: 2
quote:
Conspiracies!
The shortcut stupid people take to feel smart.

And so you believe 7 World Trade fell down for the reasons the government told you.

Now who's taking shortcuts?


What's the TDP?
By SunAngel on 9/13/10, Rating: -1
RE: What's the TDP?
By tastyratz on 9/13/2010 7:57:58 PM , Rating: 2
Somewhat... but it's going to be far more energy efficient than 2 quads plus a dual-core server, and clock for clock it will crank out efficiency. If the unused cores can truly enter a deep sleep state, power can be managed more appropriately; idle will likely end up not much worse than current offerings.

Don't forget, Core 2 Duo vs. Core 2 Quad was a jump from a 65 watt to a 95 watt TDP. You only drink the juice if you use the juice.

And are you really saying overclockers won't welcome a 325 watt TDP if that's what it actually was? Most extreme builders buy 750+ watt power supplies when your average SLI quad build can't even max out a true 400 watt supply.

If you want to cry about energy usage, look at video cards.

This server chip will do very well in its intended market.


RE: What's the TDP?
By kake on 9/14/2010 1:38:08 AM , Rating: 2
Anandtech had a good article last week on the various available server processors and their power usage vs. performance.

http://www.anandtech.com/show/3894/server-clash-de...


RE: What's the TDP?
By zpdixon on 9/14/10, Rating: -1
RE: What's the TDP?
By zpdixon on 9/15/2010 2:50:07 AM , Rating: 1
Care to explain the downvote?

I am right to call someone who predicts 325W "crazy," because it would mean the entire system-integrator industry would have to redesign chassis and heatsinks to account for a TDP 2.5x higher than the highest current TDP (Intel's 130W Nehalem processors).

As AMD demonstrated with its Magny-Cours 12-core processors, it is possible to pack a high number of cores into the same TDP envelope as today's parts. Therefore Intel is likely to do the same (10 cores in approximately the current Intel 95-130W TDP envelope).

I am going to come back to this thread when Intel announces the TDP to prove I was right.

Adios.


"My sex life is pretty good" -- Steve Jobs' random musings during the 2010 D8 conference














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki