
Product cancellations are usually bad, but not if they are replaced with something better.

Intel's P1268 32nm development process is progressing better than expected. It is doing so well, in fact, that the company is cancelling Havendale and Auburndale, the 45nm mainstream value versions of Nehalem. These were supposed to be the first mass-produced chips with on-die integrated graphics and an integrated memory controller.

Havendale was a dual-core version of Lynnfield, itself the Core i5 mainstream version of the Nehalem Core i7. It would've used the same LGA-1156 socket as Lynnfield.

Auburndale was the mobile version of Havendale, but it had more in common with Clarksfield, the mobile variant of Lynnfield.

Having graphics on die saves motherboard manufacturers money because there is no longer a northbridge to buy and integrate. By lowering platform costs, Intel wants to bring Nehalem technology to a new market, at a price point it previously could not reach. This would also drive DDR3 adoption, something that DRAM manufacturers have been anticipating.

The original launch date for Havendale was towards the end of Q4 2009, missing most of the crucial Christmas buying season. Auburndale would've been introduced in Q1 of 2010. This was to allow time for the production ramping of Lynnfield and Clarksfield into the mainstream market.

With 32nm development so advanced, Intel made the decision to pull in Clarkdale and Arrandale from the middle of 2010 to Q4 of 2009. They seem confident that they will be able to ramp in time to meet demand from the critical Christmas season.

Clarkdale is the 32nm successor to Havendale, combining two logic cores and a graphics core with Intel's "Multi-Chip Packaging". The logic cores are built on a 32nm process, but the integrated memory controller and graphics core are built on a 45nm process. It is capable of running four threads at once with a new generation of Hyper-Threading, promising increased efficiency. A server variant of Clarkdale is also to be introduced later, in Q1 of 2010.
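Those four hardware threads are what the operating system sees as logical CPUs. A minimal Python check illustrates the idea; the count it prints is of course that of whatever machine runs it, not necessarily a dual-core Hyper-Threading part:

```python
import os

# On a Hyper-Threading chip with 2 cores and 4 threads, the OS exposes
# more logical CPUs than physical cores; os.cpu_count() reports the
# logical count.
logical = os.cpu_count()
print(f"logical CPUs visible to the OS: {logical}")
```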

Arrandale is the mobile version of Clarkdale, with the same on-die graphics. It will also allow switchable graphics within Windows 7 and Windows Vista, enabling the use of a higher performance GPU when plugged in. Both Clarkdale and Arrandale will use 5 series chipsets, exclusively with DDR3.

Comments

I feel bad for AMD now...
By jiteo on 2/10/2009 4:42:46 PM , Rating: 5
AMD: We have this awesome idea called Fusion!
Intel: *yoink*

RE: I feel bad for AMD now...
By soydios on 2/10/2009 5:56:28 PM , Rating: 4
how is Fusion coming along anyway?

RE: I feel bad for AMD now...
By Clauzii on 2/10/2009 6:38:44 PM , Rating: 4
I'd rather wait for an AMD/ATI solution.

UNLESS Intel will make a historic footprint and release a GPU that can actually be used for serious work and not just some 2D+little'bit'o'this'o'that'but'veryyyyy'slow.

RE: I feel bad for AMD now...
By Roy2001 on 2/10/2009 8:22:53 PM , Rating: 2
Do you play games with integrated graphics?

RE: I feel bad for AMD now...
By Clauzii on 2/10/2009 11:16:07 PM , Rating: 2
I would.
I can live with moderate resolutions, so when AMD reaches 45/32nm and is able to include a 4850 on die (maybe with its own 128bit GDDR5 bus to the outside world), count me in!

RE: I feel bad for AMD now...
By Clauzii on 2/10/2009 11:29:48 PM , Rating: 1
Oh, let me correct and say 'on chip'. 'On die' would be strange since graphics and CPUs are on different nano sizes.

Would AMD be able to push their CPUs to 40nm, to combine it with the GPU parts, or am I asking the technically impossible?

RE: I feel bad for AMD now...
By bridgeman on 2/11/2009 12:43:23 AM , Rating: 1
Actually, you should say 'in package'. 'On chip' normally means the same thing as 'on die'.

I expect the primary reason for using separate dies is yield. As AMD and nVidia have learned over the past few years, defect density is a huge problem on large dies. Other good reasons for separate dies include having already developed the 45nm parts and keeping the 45nm fab lines busy.
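The yield argument can be put in rough numbers with the classic Poisson model, where yield falls as e^(-area × defect density). The defect-density figure below is purely illustrative, not a real fab number:

```python
import math

# Poisson yield model: Y = exp(-A * D), where A is die area (cm^2) and
# D is defect density (defects/cm^2). D = 0.5 is an illustrative value.
D = 0.5
yields = {area: math.exp(-area * D) for area in (1.0, 2.0)}
for area, y in yields.items():
    print(f"{area:.1f} cm^2 die: {y:.0%} yield")  # 61% and 37%
```

At these numbers one big 2 cm² die yields about 37%, while each of two 1 cm² dies yields about 61% — which is the economic case for splitting a product across two smaller dies in one package.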

RE: I feel bad for AMD now...
By jonmcc33 on 2/10/2009 11:33:26 PM , Rating: 1
Only 5% of the people that use computers will play games on them. There's a reason that Intel has a clear lead on GPU market share.

By StevoLincolnite on 2/11/2009 1:22:35 AM , Rating: 2
Not always. I know several people who bought a system with integrated Intel graphics and were gamers; just because they are gamers doesn't mean they know everything there is to know about graphics hardware.

Casual games are also getting big, like Bejeweled, Spore, The Sims etc.

RE: I feel bad for AMD now...
By Murloc on 2/11/2009 11:10:55 AM , Rating: 2
Flash games?
Facebook Flash apps lag on my Intel integrated GPU.
But I can play GTA3 and AoE2, that's nice.

RE: I feel bad for AMD now...
By rudolphna on 2/11/2009 11:14:25 AM , Rating: 2
You can definitely play a lot of games at low res on AMD/nVidia integrated graphics. I have an AMD 690G graphics chip in a laptop with a Turion X2 processor and it can play World of Warcraft just fine for my son (and myself sometimes, I admit it) at 1280x800 with no AA.

RE: I feel bad for AMD now...
By monomer on 2/10/2009 7:20:40 PM , Rating: 2
Actually, back in 1999 or so, Intel was in the middle of designing something like this, codenamed Timna, which included a CPU, memory controller, and graphics chip on a single die. It was canceled pretty late in development due to issues with the RDRAM/SDRAM memory controller, I believe.

By itzmec on 2/10/2009 4:16:51 PM , Rating: 2
Having integrated on-die graphics, will this affect the overclockability of the CPU? Heat issues?

RE: .
By pattycake0147 on 2/10/2009 4:25:15 PM , Rating: 2
I'd say that would depend on how independent the clocks and voltages are. It would be nice to be able to lower the GPU clock so it doesn't interfere (create as much heat). If that were possible you could use the on-die graphics when on the internet or word processing, then turn on a dedicated card when doing something more demanding (games). I don't know what the plans from Intel are, but it's my $.02 worth.

RE: .
By Pryde on 2/10/2009 9:42:09 PM , Rating: 2
Isn't Intel greatly limiting i5 overclocking and leaving that to the i7 crowd? They don't want a remake of cheap Q6600 OCs competing with the Q9000 series.

RE: .
By Sagath on 2/10/2009 4:27:20 PM , Rating: 2
I would guess this depends a lot on BIOS functionality via the motherboard. If Intel allows independent clocking (say, for example, like nVidia with shader/RAM/core clocks) then you should be able to.

Will they? Well, that's the real question. My opinion is it will be an independent clock. They are competing too much with AMD/Nvidia to cripple their own chips by limiting i5 overclocking... or at least I hope so.

Also, it's nice to see the emergence of the GPU/CPU + uberGPU development we have been hearing about for so long. I just hope Intel (and Windows?) allows the shutting down of add-in PCIe 3D cards to work with ALL manufacturers' cards.

RE: .
By kattanna on 2/10/2009 5:15:00 PM , Rating: 2
I'd be interested to see how much the on-die GPU in full-on mode will be able to starve the CPU cores by consuming all the memory bandwidth.

Though, honestly, I don't see too many people buying an integrated GPU and doing high-end 3D stuff.
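A crude way to see the shared-bandwidth concern is to measure how fast the host can stream through memory; a busy on-package GPU would eat into the same figure. This is only a sketch — interpreter overhead understates the hardware's real bandwidth:

```python
import time

# Time one large copy: the CPU reads 2*N bytes in total (N read + N
# written), all of it going through the same memory controller the
# integrated GPU would share.
N = 64 * 1024 * 1024          # 64 MB working set
src = bytearray(N)
t0 = time.perf_counter()
dst = bytes(src)              # one read pass + one write pass
dt = time.perf_counter() - t0
print(f"~{2 * N / dt / 1e9:.1f} GB/s effective copy bandwidth")
```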

RE: .
By rudolphna on 2/11/2009 11:17:34 AM , Rating: 2
With high-speed DDR2 and DDR3 I don't think that will be as much of an issue as it was in the past, especially with Intel's history of poor-performing graphics parts. My concern is that the EU is going to complain that Intel is violating antitrust by FORCING anyone using an Intel CPU to use Intel graphics.

RE: .
By danrien on 2/10/2009 5:43:00 PM , Rating: 2
Absolutely, yes. I am guessing integrated graphics will not be included with enthusiast chips... ever. Unless the process technology reaches a point where it doesn't make sense to not have them in the same package/die. But that would probably be around, oh, 1nm.

By Shig on 2/10/2009 3:47:04 PM , Rating: 2
Can't wait for i5 now.

RE: Wow
By Master Kenobi on 2/10/09, Rating: 0
RE: Wow
By SerafinaEva on 2/10/09, Rating: 0
RE: Wow
By Alpha4 on 2/10/2009 5:02:01 PM , Rating: 4
I would rate you down for being rude, but I'm pleased to see "you're" used in proper context.

I just wanted to speak that part of my mind :D

RE: Wow
By Clauzii on 2/10/2009 6:34:04 PM , Rating: 2
But if it is crippled, how is it supposed to compete with the Phenom IIs, which do pretty well against the current Core line of CPUs?

Are we again going to see Intel do well in servers and AMD on the desktop? History repeating maybe.

RE: Wow
By Pryde on 2/10/2009 9:44:28 PM , Rating: 1
No doubt i5 will offer better performance than PhII at a very reasonable price, which will drive PhII prices even lower.

not a true IMC?
By Falloutboy on 2/10/2009 5:56:05 PM , Rating: 2
So i5 does not have the IMC on the processor die, it's on the graphics die? Why move it off the processor? Wouldn't this cause additional latency?

RE: not a true IMC?
By ceefka on 2/11/2009 5:55:00 AM , Rating: 2
Yes, a little more latency than having it on die with the CPU (e.g. Core i7), but not nearly as much as having it in a separate northbridge (e.g. Core2).
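That latency difference is exactly what a pointer-chase microbenchmark exposes: each load depends on the previous one, so prefetchers can't hide the trip through the memory controller. The sketch below shows the technique; interpreter overhead dominates the absolute number in Python, so treat it as illustrative only:

```python
import random
import time

# Build one full cycle through all N slots (Sattolo's algorithm), then
# chase it: every access depends on the previous result, defeating
# hardware prefetch, so the time per step tracks memory latency.
N = 1 << 20
chain = list(range(N))
for k in range(N - 1, 0, -1):
    j = random.randrange(k)
    chain[k], chain[j] = chain[j], chain[k]

i = 0
t0 = time.perf_counter()
for _ in range(N):
    i = chain[i]
per_access = (time.perf_counter() - t0) / N
print(f"~{per_access * 1e9:.0f} ns per dependent access")
```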

Clarkfield - Clarksfield - Clarkdale
By demiller9 on 2/10/2009 6:33:18 PM , Rating: 1
Is Intel trying to confuse me, or have they just confused Jansen Ng? Are these three chips all one and the same?

By Pakman333 on 2/11/2009 4:18:44 AM , Rating: 3
I think it's just you.

No problem!
By haukionkannel on 2/11/2009 2:42:45 PM , Rating: 2
I am sure that you can play Minesweeper and poker with this chip, so it is good enough for 95% of the customers.
These are value versions of i7, so that is what you can expect from them.
All in all, they can actually be very profitable for Intel, and not such a bad choice for anyone who doesn't play games. All in one chip can make it possible to build really small and cheap motherboards!

It's also interesting to see that Intel actually moves their low-end products first to a smaller production node. The same thing that ATI used to do with their GPUs.
It also tells that they are not worried about AMD in the high-end CPU ground. They want to kill AMD in the low-cost segment too. I hope they don't manage to do it, but they are aiming toward it quite clearly. They already have the higher-end segment, now they can battle in the low end with cheaper production costs!

