



[Image: Westmere has a small package]

[Image: Note the 45nm integrated graphics]
Intel prepares for its 32nm transition

Intel's P1268 32nm process is at an incredibly advanced stage, and Intel wants the world to know it.

The CPU behemoth has cancelled several 45nm products because it will have much more advanced 32nm products available this year. AMD, meanwhile, has only been selling 45nm chips since November.

Clarkdale is the desktop version of Westmere, built using two 32nm logic cores and a 45nm graphics core using Intel's "Multi-Chip Packaging". Targeted at the mainstream value market, it is capable of running four threads at once with Intel's newest generation of Hyper-Threading. A server variant of Clarkdale is also to be introduced later in Q1 of 2010.
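As a minimal, hypothetical sketch of what that looks like to software (not from Intel's materials, and assuming a Linux host with a GNU toolchain): a two-core Clarkdale with Hyper-Threading enabled presents itself to the operating system as four logical processors, which a program can query directly.

    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* Number of logical processors the OS currently exposes. On a
           2-core part with Hyper-Threading enabled this reports 4, even
           though only 2 physical cores exist on the die. */
        long logical = sysconf(_SC_NPROCESSORS_ONLN);
        printf("logical processors: %ld\n", logical);
        return 0;
    }

The same 2-physical/4-logical distinction comes up again in the reader comments below.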

Arrandale is the mobile version of Clarkdale and will also be available with integrated-on-package graphics. It will allow switchable graphics within Windows 7 and Windows Vista, enabling the use of a higher-performance GPU through PCIe when plugged in. Both Clarkdale and Arrandale will use 5 series chipsets exclusively with DDR3.

This is the first 32nm silicon out of Intel's Fab D1D Research and Development center in Hillsboro, Oregon. We were told that it is fully functional and running Windows 7. Intel also claims that its cycle times are greatly improved over its P1266 45nm process, and expects a faster ramp.

Power consumption numbers are visible for both Clarkdale and Arrandale, but these are just preliminary. Final production silicon will probably draw much less power, but the figures give a good indication of Intel's prowess.

We’d like to give a special thanks to Stephen Smith, Vice President and Director of Business Operations of Intel's Digital Enterprise Group, for making these pictures possible.



Comments



Only One Way
By mindless1 on 2/16/2009 8:29:41 PM , Rating: 1
There's only one way I can think of at the moment that this separate-die integration of the GPU onto the CPU carrier is good: it will allow a single-interface heatsink design to cool both the GPU and CPU. That is, so long as the system chipset is then reduced enough in heat density that it no longer needs anything more than a basic passive heatsink.

If these won't allow a basic passive heatsink for the system chipset, and I mean a very small, cheap, lightweight one equivalent to what we saw back in the Pentium 3 era, it's pointless. Even then, you still need a 2nd chip for traditional chipset functions, that 2nd chip commonly being the IGP in lower-tiered products.

What this looks like is a reactionary design to combat nVidia, who in all fairness should have had a larger share of the chipset market if it weren't for Intel's bundling of CPU + chipset.

What we need is not this, it's a single-chip solution integrating all the functions you'd typically find in the CPU, GPU, northbridge, and southbridge. It MUST contain ALL of these to make a significant difference in design size and cost, otherwise you're just playing a shell game with where the silicon is.




RE: Only One Way
By Kary on 2/18/2009 12:32:26 PM , Rating: 2
If a GPU gets integrated with EVERY Intel chip and they are programmable to handle massively parallel tasks, then... yes, this could be huge in and of itself (though those are some big IFs).
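A minimal sketch of the "programmable for massively parallel tasks" part, assuming an OpenCL SDK and runtime are installed (OpenCL is just one GPGPU option, used here purely for illustration): the same device-enumeration code finds a GPU whether it is a discrete card or an integrated part, so general-purpose GPU code would not care where Intel puts the silicon.

    #include <stdio.h>
    #include <CL/cl.h>

    int main(void)
    {
        cl_platform_id platforms[8];
        cl_uint nplat = 0;

        if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS)
            return 1;
        if (nplat > 8) nplat = 8;  /* only loop over the slots we actually filled */

        for (cl_uint p = 0; p < nplat; ++p) {
            cl_device_id devs[8];
            cl_uint ndev = 0;

            /* CL_DEVICE_TYPE_GPU matches integrated and discrete GPUs alike;
               the rest of a GPGPU program is written the same way either way. */
            if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devs, &ndev) != CL_SUCCESS)
                continue;
            if (ndev > 8) ndev = 8;

            for (cl_uint d = 0; d < ndev; ++d) {
                char name[256] = "";
                clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
                printf("GPU device: %s\n", name);
            }
        }
        return 0;
    }

Build with something like cc list_gpus.c -lOpenCL; whichever device the loop reports, the same kernels can be queued to it.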


RE: Only One Way
By Kary on 2/18/2009 12:34:45 PM , Rating: 2
Hmmm, also forgot: as mentioned above, if the main GPU can be turned off when not running anything graphically intensive (I'm assuming Aero is a breeze for these IGPs), that could save a third or more power when the computer isn't running a video game (which is probably most of the time).


RE: Only One Way
By mindless1 on 2/18/2009 8:14:09 PM , Rating: 2
You can't turn off the whole GPU, it handles 2D as well. Portions of it perhaps, but the same can be said about having it on another chip, as it already is with existing IGPs.


RE: Only One Way
By mindless1 on 2/18/2009 8:12:41 PM , Rating: 2
That won't matter; as already mentioned, there's still the 2nd chip, and that 2nd chip usually has the IGP that can still do the parallel tasks. EXCEPT we have heard nothing from Intel about theirs doing parallel tasks, while we do already have examples of IGPs from nVidia doing non-video processing.

Two dies on one carrier is great when it gets rid of a 2nd chip/carrier. As mentioned previously, it would be great news to me if they didn't do it halfway and integrated the entire chipset, but since they don't, it has limited usefulness.


RE: Only One Way
By atlmann10 on 2/18/2009 11:30:04 PM , Rating: 2
One thing you're not recognizing here: this is not the socket replacement for the I7. This is the 4-core socket for the new chipset (value I7, 1266 or something like that) that Intel will be releasing between now and school starting next fall. The replacement on the current I7 chipset will be 8 cores. And I believe, as I read a day or two ago, the current X58 boards will not apply to this CPU. There will also be a direct replacement on the current I7 1366 chipset, and that will be 32nm as well. From what I understood, the 4-core 32nm part with the graphics core was the value part, and I think it is called the I5.


RE: Only One Way
By Pakman333 on 2/20/2009 5:55:09 AM , Rating: 2
You are wrong.

Gulftown is the new 6-core CPU, not 8-core, coming next year, and can use the X58 chipset.

http://www.dailytech.com/Gulftown+is+the+Flagship+...

The Core i5 is 45nm and doesn't have on-package graphics, it is just mainstream Nehalem.

http://www.dailytech.com/Intel+Targets+BacktoSchoo...


Switchable graphics?
By Gorghor on 2/17/2009 9:50:14 AM , Rating: 2
Beyond the whole "Intel is taking over the low-end GPU market" issue, does anyone know what is meant by switchable graphics?

Could this be similar to nVidia's now obsolete HybridPower technology?
This tech, which lets you power off your discrete GPU completely when not needed, sure sounded interesting for us casual gamers who don't want to waste 100 watts or more on an idle GPU (which in my case is over 90% of the time).
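Rough, back-of-the-envelope arithmetic using those two figures (and assuming, purely for illustration, a machine left running around the clock): 100 W x 0.9 x 8,760 hours/year ≈ 790 kWh per year of idle GPU draw that switchable graphics could hand off to the IGP instead.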




RE: Switchable graphics?
By Adonlude on 2/17/2009 3:54:09 PM , Rating: 2
Yep, that's what it is.


RE: Switchable graphics?
By Gorghor on 2/17/2009 5:22:34 PM , Rating: 2
Assuming this is true, I can't really see how this would work without specific on-chip features or drivers from ATI/nVidia. I guess we'll have to wait and see.


RE: Switchable graphics?
By Cypherdude1 on 2/18/2009 12:55:40 AM , Rating: 2
So does anyone have any estimates on the speed of the newer 32nm CPUs? How much faster will they be than the i7s? Any estimates on the price increase? I think the i7 920 at $258 is a good buy myself.


Correction
By freeagle on 2/16/2009 3:30:04 PM , Rating: 3
quote:
built using two 32nm logical cores


It's 2 physical cores, and 4 logical cores with HT enabled.




RE: Correction
By freeagle on 2/17/2009 8:38:21 AM , Rating: 2
quote:
built using two 32nm logic cores and a 45nm graphics core


This makes even more sense

(not sarcasm)


Push the envelope!
By MrPoletski on 2/17/2009 7:30:12 AM , Rating: 2
Good to see technology moving forward swiftly;)




Power consumption numbers
By Saosin on 2/17/2009 6:28:18 PM , Rating: 2
quote:
Power consumption numbers are visible for both Clarkdale and Arrandale, but these are just preliminary.
Where?




Graphics on CPU package : What's the point?
By aegisofrime on 2/16/09, Rating: -1
By AntiV6 on 2/16/2009 9:04:16 AM , Rating: 3
It's optional I think.

But if you ever use the battery, it will get much better battery life because it can use the integrated GPU.

Technology is a good/bad thing for me. I hate it when I buy something and 6 months later something twice as fast comes out. lol


By CSMR on 2/16/2009 9:07:55 AM , Rating: 2
-GPGPU is independent of this architectural question. You can do it with both discrete and integrated graphics, on or off die.
-Integrated graphics are not for gamers. This chip is good for average consumers, business users, and power users depending on features, but not gamers. The main benefits will be power consumption and cost, and speed compared to the current generation of integrated graphics.


By Master Kenobi (blog) on 2/16/2009 9:28:11 AM , Rating: 3
quote:
So are we paying for something that we won't use?

No. Read the article and research the chip. These are for low end consumer notebooks. Notebooks that are already sold with IGP's. These are not for our kickass gaming desktops and high end notebooks. Instead of having 2 chips on the board (IGP and CPU) on the budget boards/laptops you now get a single chip/socket with both.


RE: Graphics on CPU package : What's the point?
By TSS on 2/16/2009 9:39:52 AM , Rating: 3
Besides that, it's a good step forward to more powerful systems on a chip, or rather, systems on a die.

Just look at the picture. Now drop a 4Gbit RAM chip and an 8Gbit flash chip on there and you have an oversized watch with more computing power than desktop computers had at the turn of the millennium.

reminds me of one of the top 100 quotes on bash.org:

<erno> hm. I've lost a machine.. literally _lost_. it responds to ping, it works completely, I just can't figure out where in my apartment it is.


RE: Graphics on CPU package : What's the point?
By mattclary on 2/16/2009 10:17:07 AM , Rating: 2
That makes me think of a story I read years ago about a Novell server that accidentally got walled up during a renovation. They rediscovered the machine when remodeling yet again and it had never dropped a packet the whole time.


RE: Graphics on CPU package : What's the point?
By Alphafox78 on 2/16/2009 12:49:31 PM , Rating: 2
The University of North Carolina has finally found a network server that,
although missing for four years, hasn't missed a packet in all that
time. Try as they might, university administrators couldn't find the
server. Working with Novell Inc., IT workers tracked it down by
meticulously following cable until they literally ran into a wall. The
server had been mistakenly sealed behind drywall by maintenance workers.
Source: TechWeb News, 04/09/01:
http://www.techweb.com/wire/story/TWB20010409S0012

link doesn't work tho..


RE: Graphics on CPU package : What's the point?
By freeagle on 2/16/2009 3:35:30 PM , Rating: 4
Maybe they thought a firewall was not good enough for securing the servers, so they "implemented" a drywall.


By cheetah2k on 2/16/2009 10:06:11 PM , Rating: 2
AMD must be spewing - Intel beat them to CPU + GPU on a single package... In the Dave vs Goliath battle, Dave is looking about ant-sized right now.


By Cypherdude1 on 2/18/2009 12:49:25 AM , Rating: 2
"The requested resource was not found."

http://www.techweb.com/wire/story/TWB20010409S0012


RE: Graphics on CPU package : What's the point?
By amanojaku on 2/16/2009 10:25:51 AM , Rating: 3
quote:
These are for low end consumer notebooks. Notebooks that are already sold with IGP's.
That's not true; the desktop variant has integrated graphics, as well.
quote:
Clarkdale is the desktop version of Westmere, built using two 32nm logical cores and a 45nm graphics core using Intel's "Multi-Chip Packaging".
Intel is smart to do this. If the IGP is capable of playing common low-end games like Warcraft, a lot of "casual" gamers would benefit from not having to scope out a GPU. I suspect the IGP adds no more than $30 to the cost, anyway, which won't be noticed in the usual debut price of $200-$400. Intel gets a couple of extra bucks, the consumer gets a "free" video card, and a true GPU maker loses market share. Brilliant.


By Master Kenobi (blog) on 2/16/2009 11:08:54 AM , Rating: 2
You conveniently overlook what comes 2 sentences down.
quote:
Instead of having 2 chips on the board (IGP and CPU) on the budget boards/laptops you now get a single chip/socket with both.


RE: Graphics on CPU package : What's the point?
By amanojaku on 2/16/2009 11:26:48 AM , Rating: 2
If this works out on desktops and mobile devices I can see this making its way to servers, particularly blades, and home theater devices as well. Any device that has limited space will benefit from the reduction that comes with this kind of seamless integration. And imagine Intel bidding for an Intel-inside Xbox with some proprietary graphics. This all depends on Intel's level of interest and capability, both of which seem to be increasing these days.


By InternetGeek on 2/16/2009 5:40:34 PM , Rating: 2
I'd rather not, based on the historical performance.


RE: Graphics on CPU package : What's the point?
By paydirt on 2/16/2009 11:42:40 AM , Rating: 1
I said this about 5 months ago. The battle is no longer between nVidia and AMD, it is between nVidia and Intel. AMD is done.


By grcunning on 2/16/2009 12:15:25 PM , Rating: 2
I wish I had a dollar for each time someone has told me that AMD was done. I remember many "computer experts" telling me that my purchase of an AMD 386-40MHz was a waste of time because AMD wouldn't be around to honor the warranty.


RE: Graphics on CPU package : What's the point?
By Oralen on 2/16/2009 9:39:46 AM , Rating: 1
I don't think the important thing here is the GPU: Intel has never cared for gamers.

(And when Larrabee is released, it will not worry Nvidia or ATI(MD:-) because even if it is a speed demon, programming games to take advantage of it will take a lot of time. And Intel will have to maintain graphic drivers for it, an exercise at which they suck)

What's interesting here is how fast Intel is extending its lead over AMD on the manufacturing front:

Smaller die=More profits

The ability to package the CPU and GPU on the same chip=More profit

The fact that those two dies are not both 32nm, but only the CPU, means the ability to continue to use older fabs for longer to manufacture the GPUs=More profits

The conclusion: Even in these uncertain times, I'm not worried about Intel's health or survival. They are going to be OK.

And it's probably what they want to say to their shareholders with this kind of news: OK, the market is in poor shape, but we'll be fine, keep your stocks...


By Oralen on 2/16/2009 9:41:44 AM , Rating: 2
Sorry about the typos... End of the day...


RE: Graphics on CPU package : What's the point?
By Patito on 2/16/2009 10:43:00 AM , Rating: 3
quote:
(And when Larrabee is released, it will not worry Nvidia or ATI(MD:-) because even if it is a speed demon, programming games to take advantage of it will take a lot of time. And Intel will have to maintain graphic drivers for it, an exercise at which they suck)


A small comment on Larrabee. If I'm not mistaken, Larrabee will be an x86 GPU, which should make programming games easier as it is a well-established architecture. So in my opinion Nvidia and AMD (ATI) should be worried when Larrabee comes out. Sony has already reported that the Playstation 4 will use Larrabee. Who knows what other companies will ditch Nvidia or AMD graphics and go with Larrabee for their future products?


By Master Kenobi (blog) on 2/16/2009 1:27:40 PM , Rating: 2
Ditto. Larrabee is supposed to be able to execute C++ or DirectX code no problem. Considering some of the best C/C++ compilers are from Intel, I don't see them having a problem with this.


By monomer on 2/17/2009 10:54:15 AM , Rating: 2
Sony has denied the rumors that they will be using Intel as a supplier for the GPU of the PS4. The news was originally published by The Inquirer, so I would take it with a grain of salt.


RE: Graphics on CPU package : What's the point?
By sweetsauce on 2/16/2009 1:28:12 PM , Rating: 1
One quick search for larrabee and you'll see that the points you made are irrelevant and completely wrong.
quote:
As a GPU, Larrabee will support traditional rasterized 3D graphics (DirectX/OpenGL) for games. However, Larrabee's hybrid of CPU and GPU features should be suitable for general purpose GPU (GPGPU) or stream processing tasks
You really think Intel would be stupid and not include DirectX/OpenGL extensions on the chip? Its programmable functions will be for the more advanced programmers like a Carmack or a Tim Sweeney, who won't be tied down to a specific API. Worst case scenario for nVidia and ATI is Larrabee being a speed demon. Intel getting serious about graphics is the best thing ever for us gamers. We may actually see real-time ray tracing in games. Imagine the possibilities...
http://upload.wikimedia.org/wikipedia/commons/e/ec...
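For anyone wondering what "stream processing tasks" means concretely, here is a minimal, generic sketch (illustrative only, not Larrabee-specific): a SAXPY-style kernel, the canonical data-parallel loop that GPGPU and stream processors are designed to chew through, written as plain C.

    #include <stdio.h>

    /* y[i] = a * x[i] + y[i] -- the classic SAXPY stream kernel. Every
       element is independent, which is what lets the work be spread across
       many small cores, whether on a conventional GPU or a Larrabee-style
       array of x86 cores. */
    static void saxpy(int n, float a, const float *x, float *y)
    {
        for (int i = 0; i < n; ++i)
            y[i] = a * x[i] + y[i];
    }

    int main(void)
    {
        float x[4] = {1, 2, 3, 4};
        float y[4] = {10, 20, 30, 40};

        saxpy(4, 2.0f, x, y);
        printf("%g %g %g %g\n", y[0], y[1], y[2], y[3]); /* 12 24 36 48 */
        return 0;
    }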


By Oralen on 2/16/2009 5:25:41 PM , Rating: 2
Well...

Intel is a great company, but if there is one area in which they have never delivered, it is graphics.

People can throw around rumors about the future Playstation, or about how great Larrabee is going to be, about what may happen...

Personally, I'll wait before making assumptions, and I don't think the decades of experience ATI and Nvidia have built will just come to Intel overnight the day Larrabee is released.


By mindless1 on 2/16/2009 8:37:15 PM , Rating: 2
Intel is not getting serious about graphics from a gaming perspective. This is a low-end product to make use of the same carrier/heatsink. Otherwise, it could have been designed to be blazing fast and it would still disappoint because Intel doesn't commit the necessary resources to driver development for gaming. They could surprise me and change gears, but this has not happened in the past, IGP after IGP and now yet another IGP.


By haukionkannel on 2/17/2009 3:07:05 AM , Rating: 2
Yep!

This means that where, in the past, the manufacturer decided whether to put Intel, AMD (ATI) or nVidia integrated graphics in their low-end products (read: the biggest-selling and most profitable products in the world...), they now decide between Intel and Intel... If the GPU is already in the CPU, why put in another? No reason at all. So this actually completely kills AMD and nVidia in low-end Intel-CPU-based computers...
If someone uses an AMD CPU they decide between Intel, AMD and nVidia like before, so Intel really takes the low end this time! They will get a total monopoly sooner than anyone expected. In the middle and high end the situation does not change, but this may eat into the finances of nVidia and AMD so badly that they may have to give up the fight, or high-end GPUs will get much more expensive than today.
Maybe in the future we will all play with Intel graphics because there is no competition left? Who knows... I can't wait for that day to happen... I may even have to dig out my old Doom if I want to play games... oh, and nethack ;-)

Seriously. This is a strong move from Intel to completely take over low-end and low-to-mid range GPU manufacturing.


RE: Graphics on CPU package : What's the point?
By phatboye on 2/16/2009 12:16:23 PM , Rating: 3
The CPU world does not revolve around gamers.


By Anonymous Freak on 2/16/2009 12:56:40 PM , Rating: 2
Indeed. And many gamers forget that the GPU world doesn't actually revolve around them, either.

Intel is still the largest supplier of GPUs, and sub-$100 GPUs are still nVidia and AMD's biggest money makers.

Yeah, the ultra-high-end GPUs are the ones that get the press (and the ultra-high-end CPUs, for that matter), but they're not where the money is.


RE: Graphics on CPU package : What's the point?
By iwod on 2/16/2009 8:59:36 PM , Rating: 1
Actually, the point is that Intel is shoving CRAPPY graphics on us whether we want it or not.
Westmere, as far as the roadmap goes, is not available without the Intel graphics MCM.


By Meph3961 on 2/17/2009 1:42:11 AM , Rating: 2
quote:
Actually, the point is that Intel is shoving CRAPPY graphics on us whether we want it or not. Westmere, as far as the roadmap goes, is not available without the Intel graphics MCM.


Not true. You forgot about Gulftown. It does not have a GPU on the chip.


How about some freakin' chipsets now, Intel?
By aos007 on 2/16/09, Rating: -1
RE: How about some freakin' chipsets now, Intel?
By sweetsauce on 2/16/2009 2:07:42 PM , Rating: 3
X58/I7 isn't meant for you. It's for the enthusiast market, the one that doesn't worry about something being affordable. If you want luxury performance, you must be willing to pay for it. Now you can argue that the price/performance of I7 isn't worth it, but that's irrelevant. Wait patiently for I5 or pay up for I7, the choice is yours.

On Atom, I'm pretty sure they know that if they make a good chipset for it to run on, they will completely cannibalize their notebook Core 2s. They are really stuck because they have 2 chips competing for a similar market, and one has significantly lower profit margins.


By mindless1 on 2/16/2009 8:43:14 PM , Rating: 2
There has been an enthusiast class of equipment for many years without some of the extremes in pricing we see today. The observation was correct that this isn't just the simple notion of "must be willing to pay for it", as that kind of generic mentality could be claimed even if they cost $1 million each. In the real world, everything has relative worth and value, even to supposed enthusiasts.


By haukionkannel on 2/17/2009 3:13:18 AM , Rating: 2
They most probably have a better chipset already in the labs. They just need a good reason to start producing it. But so far there has not been any reason to do it. The old chipsets sell well enough. That is what is going to happen to CPUs if AMD dies out...


By Oregonian2 on 2/16/2009 4:43:55 PM , Rating: 2
Curiously, I think the fab being closed down this year in Hillsboro (near here) is/was used to make chipset ICs. Ironic that the new CPU was made in a nearby facility. :-)
