

The new Westmere-EX CPU will bring 10 cores to a single server socket.  (Source: Anandtech)

Intel will keep the CORE-ix brand names for its upcoming "Sandy Bridge" architecture redesign.  (Source: Anandtech)

"Sandy Bridge" will use a ring bus to allow the on-chip cores and media units (including the on-die GPU) to access the cache.  (Source: Anandtech)
Chipmaker doesn't reveal launch date for the Westmere-EX

Intel today gave those looking to deploy a high-performance single socket server solution something to salivate over with its unveiling of the Westmere-EX.  Following the Gulftown lineup -- which trickled out starting in March 2010 -- the Westmere-EX is Intel's latest 32 nm Westmere chip.

Westmere is very similar to the Nehalem 45 nm architecture, meaning it's a "tick" design -- a process shrink, not a major redesign.  That's not to say there isn't plenty to be excited about here, though.  Intel is making good use of the extra die space saved by the shrink, and the Westmere-EX packs an incredible 10 cores into a single socket package.  With Hyper-Threading, that adds up to a total of 20 threads.
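The thread count follows directly from Hyper-Threading, which exposes two hardware threads per physical core. A quick sanity check (illustrative only, not from the article):

```python
# Hyper-Threading exposes 2 hardware threads per physical core.
THREADS_PER_CORE = 2

def logical_threads(physical_cores: int) -> int:
    """Logical thread count for a Hyper-Threaded Intel CPU."""
    return physical_cores * THREADS_PER_CORE

# Westmere-EX: 10 cores -> 20 threads, as the article states.
print(logical_threads(10))  # 20
```

The same formula gives 12 threads for the 6-core Gulftown parts that preceded it.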

For the supercomputing-minded, the new chip bumps the amount of usable memory from 1TB (64 DIMM slots) to 2TB.  There's no official word on the name of the processor -- past Gulftown server designs were in the Xeon 3600- and 5600-series.  Clock speeds and a launch date also have not been revealed.
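Assuming the slot count stays at 64 DIMMs (the article gives 64 slots for the 1TB figure; the 2TB configuration is not broken down, so this is a back-of-the-envelope guess), the per-DIMM capacities work out as follows:

```python
DIMM_SLOTS = 64  # per the article's 1TB figure; assumed unchanged for 2TB

def gb_per_dimm(total_tb: int, slots: int = DIMM_SLOTS) -> int:
    """Average DIMM size (GB) needed to reach a given total capacity."""
    return total_tb * 1024 // slots

print(gb_per_dimm(1))  # 16 GB DIMMs for the old 1TB ceiling
print(gb_per_dimm(2))  # 32 GB DIMMs for the new 2TB ceiling
```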

Perhaps more exciting were the new details Intel revealed about its upcoming "tock" (architecture redesign), code-named Sandy Bridge.  The upcoming 32 nm architecture will feature a ring design for its last-level cache access.  The cache will be accessible by an on-chip 3D graphics processing unit, the four (or potentially more) CPU cores, and the media processing unit.  The ring bus is designed to deliver high bandwidth to the various units connected on the chip.
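A rough intuition for why a ring topology works here: each stop forwards traffic to its neighbor, a request travels the shorter way around, and worst-case distance grows only linearly with the number of stops. A toy hop-count model (an illustration, not Intel's actual implementation):

```python
def ring_hops(src: int, dst: int, stops: int) -> int:
    """Minimum hops between two stops on a bidirectional ring bus."""
    clockwise = (dst - src) % stops
    return min(clockwise, stops - clockwise)

# A hypothetical 6-stop ring (e.g. 4 cores + GPU + media unit sharing cache):
print(ring_hops(0, 3, 6))  # 3 -- stop on the opposite side of the ring
print(ring_hops(0, 5, 6))  # 1 -- counter-clockwise neighbor
```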

The processor will also feature the return of Turbo Boost mode, which automatically raises clock speeds on active cores when thermal headroom allows -- in effect, easy, sanctioned overclocking of Intel's processors.

Sandy Bridge PC processors will keep the CORE-i3, i5, and i7 designations, rebranded as the "new CORE-i3" and so on.  That approach is likely to create confusion among customers about exactly what they're buying, given that the average user likely couldn't tell a Nehalem i7 from a Westmere i7 or a Sandy Bridge i7.

On a more positive note, though, AnandTech is reporting that the media processing unit will include dedicated video transcode hardware.  In a demo, that hardware crunched a roughly one-minute 30Mbps 1080p HD video clip down to an iPhone-compatible format in under 10 seconds.  The transcode hardware can be viewed as Intel's attempt to fend off NVIDIA's GPU computing push into the consumer market.
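The demo numbers imply impressive throughput: a one-minute clip at 30Mbps is about 225MB of input, and finishing in under 10 seconds means chewing through the clip at better than 6x real time. The arithmetic, using the article's figures:

```python
BITRATE_MBPS = 30       # source clip bitrate, per the article
CLIP_SECONDS = 60       # ~1 minute clip
TRANSCODE_SECONDS = 10  # "under 10 seconds"

clip_megabytes = BITRATE_MBPS * CLIP_SECONDS / 8
speedup_vs_realtime = CLIP_SECONDS / TRANSCODE_SECONDS

print(clip_megabytes)       # 225.0 MB of input video
print(speedup_vs_realtime)  # 6.0x real time, at minimum
```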

GPU computing is a hot new field of computing -- it centers on the notion that dedicated video hardware can outperform CPUs at a number of tasks, including chemical simulations, video encoding, physics simulations, and more.
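The common thread in those workloads is data parallelism: the same operation applied independently to many elements, which maps naturally onto a GPU's many execution lanes. A minimal sketch of the pattern in Python, where a small thread pool stands in for GPU lanes purely for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_particle(p: float) -> float:
    # Stand-in for per-element work (e.g. one physics step per particle).
    return p * 0.5 + 1.0

particles = [float(i) for i in range(8)]

# Serial version: one element after another on the CPU.
serial = [simulate_particle(p) for p in particles]

# Data-parallel version: the same independent operation fanned out
# across workers -- the pattern a GPU runs across thousands of lanes.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(simulate_particle, particles))

print(parallel == serial)  # True -- same results, parallel schedule
```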




RE: not that Im complainging but
By thrust2night on 9/13/2010 7:17:25 PM , Rating: 2
You're right. But the sale of such processors to millions of home users will result in more incentive for Intel (and its competition) to continue innovating and moving technology forward. Console hardware shows that we don't need modern GPUs or CPUs to play games but that doesn't mean Intel should simply stop R&D. These CPUs can be used by companies as well. Imagine the effects it will have on the economy if Intel and AMD only refreshed their processors once or twice a year.


RE: not that Im complainging but
By thrust2night on 9/13/2010 7:18:50 PM , Rating: 2
I meant.... once or twice a decade :)


RE: not that Im complainging but
By Cypherdude1 on 9/13/2010 10:56:45 PM , Rating: 2
quote:
I cant believe Im saying it but who needs this much power in their home pc? Servers I can see sure but home gaming wont see much of a benefit from 10cores 20threads for years.
I agree. Most software is not optimized to run on more than 4 cores. Even with the latest O/S's which have better CPU multi-core management, most users don't have that many apps running to keep a 10-core CPU busy.

That being said, a 6 core fusion-type CPU would be practical if it had 4 CPU cores and 2 GPU cores. The problem is manufacturers tend to cut costs when they sell integrated products. The CPU manufacturer would use cheaper CPU and GPU cores resulting in less performance.

Moreover, the true limiting factor would be with the motherboard manufacturer. The CPU manufacturers would be completely dependent on Asus to manufacture a motherboard which has all the correct external hardware connections. The motherboard is where all the connections would be made. The motherboard would need at least 2 DVI connectors and at least 1 HDMI connector.

Manufacturers usually reduce features for integrated products because they want to decrease costs and decrease their selling price to attract more consumers.


RE: not that Im complainging but
By theapparition on 9/14/2010 9:55:44 AM , Rating: 3
quote:
Moreover, the true limiting factor would be with the motherboard manufacturer. The CPU manufacturers would be completely dependent on Asus to manufacture a motherboard which has all the correct external hardware connections.

Not a problem for Intel since they sell plenty of motherboards. If other manufacturers want to sell competitive offerings, they will make their products feature-rich as well. Or if a consumer doesn't want those features, get the cheaper model. Pretty simple.

I think the point you bring up is pretty much irrelevant.


RE: not that Im complainging but
By Lerianis on 9/19/2010 1:16:44 PM , Rating: 2
Not really irrelevant seeing as how a lot of people don't trust Intel made motherboards. I personally don't, I only trust non-Intel made motherboards.


RE: not that Im complainging but
By MrFord on 9/14/2010 11:59:43 AM , Rating: 2
If pretty much any AMD785/890GX boards features VGA/DVI/HDMI, I'm sure that wouldn't be a problem at this point


By Cypherdude1 on 9/15/2010 5:33:54 AM , Rating: 2
quote:
If pretty much any AMD785/890GX boards features VGA/DVI/HDMI, I'm sure that wouldn't be a problem at this point
A decent integrated motherboard would require at least 2 DVI and 1 HDMI. Moreover, Intel has never really had high performance integrated graphics. In order for Intel to produce an integrated 6 or 8 core (4 CPU + 4 GPU core) CPU with high performance graphics, Intel would need to buy out nVidia.

Because AMD's CPU's are nearly as good as Intel's and AMD purchased ATI with their high performance graphics, AMD actually has the advantage here. An integrated AMD CPU, with 6 or 8 high performance cores, would actually be a better product than Intel's. As I mentioned before, Asus would need to produce a fully featured integrated motherboard with 2 DVI + 1 HDMI connectors and not skimp. Furthermore, AMD would need to include both CPU and GPU high performance cores and also not skimp. All of these conditions would need to be true to have a high performance system and I highly doubt this will happen, at least not in the foreseeable future.


RE: not that Im complainging but
By FITCamaro on 9/13/2010 7:41:10 PM , Rating: 3
Most companies are doing just fine with dual core CPUs. Unless you're working with CAD, audio encoding, video encoding, etc., a Core2 is more than enough. This is what the PCs at my office use.

For my machine I just need a decent graphics card. Which I don't have.....


RE: not that Im complainging but
By hughlle on 9/14/2010 3:42:58 AM , Rating: 4
What do you mean console hardware shows us we don't need modern GPU's and CPU's???? Consoles look and play like utter crap.


RE: not that Im complainging but
By Lerianis on 9/19/2010 1:20:43 PM , Rating: 2
Not really. Sure, compared to an EXTREME gaming PC playing the same game, the console version looks like crap.

However, for the average user, who has never seen that 'extreme gaming PC' video quality? A console looks pretty good.

Heck, I was playing an XBox 360 in Best Buy the other day while waiting for my parents to finish buying something..... I was shocked at the video quality.

Comparable to my gaming class PC (albeit that my PC is 2 years old).

It was OTHER things that turned me off the console: easily scratched discs and no way to back them up, having to have the easily scratched disc in the drive all the time, etc.


RE: not that Im complainging but
By invidious on 9/14/2010 2:06:12 PM , Rating: 5
quote:
Console hardware shows that we don't need modern GPUs or CPUs to play games but that doesn't mean Intel should simply stop R&D
WTH are you talking about? Console graphics are horrendous compared to PC graphics. If you can't tell the difference you are truly living in the dark.

The low resolution textures, lack of AA and AS, and super short mapping radius are glaringly obvious whenever I play ps3 or 360. Sure I get used to it after a while but that doesn't make it OK.


RE: not that Im complainging but
By Phoque on 9/15/2010 8:01:58 PM , Rating: 2
quote:
Console graphics are horrendous compared to PC graphics. If you can't tell the difference you are truly living in the dark.


Are you illiterate? Or perhaps slightly stupid? The guy said:

quote:
Console hardware shows that we don't need modern GPUs or CPUs to play games but that doesn't mean Intel should simply stop R&D


And even though I agree to some extent with what you say, I disagree with your suggestion that thrust2night doesn't know what he's talking about, because it is true that current-gen consoles demonstrate it doesn't take current hardware to play awesome games, f*** man, just look at how the Wii did with an outdated cpu & gpu at launch.


RE: not that Im complainging but
By Lerianis on 9/16/2010 9:03:29 AM , Rating: 2
I have to disagree with that statement, thrust2night. The fact is that console hardware has SERIOUS problems with multiple people on the screen at one time at high resolutions and with the latest 'eye-candy'.

Those PC quality 'super-GPU's' are still necessary for good gaming for someone even like me, who doesn't care about eye-candy but wants more than 30fps framerates.


"It looks like the iPhone 4 might be their Vista, and I'm okay with that." -- Microsoft COO Kevin Turner














Copyright 2014 DailyTech LLC.