


Intel partners reveal plans for new processor socket designs

In a memo copied to DailyTech, Intel partners recently discussed details regarding Intel's next-generation socket designs.

Intel's next-generation processor family, codenamed Nehalem, integrates the memory controller directly onto the processor die -- a feature already standard on AMD's K8 and K10 core architectures. 

Where companies traditionally increase pin count for new processor designs, Intel's LGA715 (also dubbed Socket H) will actually decrease the number of pins from 775 to 715.  Since the memory controller will reside on the processor, Nehalem processors no longer need the additional signaling between the processor and the Northbridge. 

Guidance released to the Japanese site PC Watch claims this new desktop socket will actually use a 1160-pin LGA1160 design instead. Intel officials would not reveal exact pin count details.

LGA1366, on the other hand, will greatly increase the pin count for cross-CPU communication via Intel's QuickPath Interconnect.  AMD likewise increased its pin count for server processors when it migrated from the PGA940 to the LGA1207 design last year.

Server Nehalem processors will use registered DDR3 memory; desktop processors will use the unregistered variant.   While not electrically compatible with DDR2, DDR3 still uses 240 pins for signaling, eliminating the need to increase pins on account of system memory.

As one Intel engineer who agreed to speak on condition of anonymity put it, "We try to reduce the pin counts as much as possible to eliminate cross talk and other interference." He added, "But we do try to leave some pins for overhead and future use."

Nehalem-based CPUs will use Intel's second-generation land grid array (LGA) design.  The use of "pins" in the context of the land grid array is a bit of a misnomer, as the processor interfaces with the socket via pads rather than pins.  This LGA design is recognized by both AMD and Intel for its ability to increase "pin" density and durability.  

Corporate roadmaps from Tyan and Supermicro both detail LGA1366 designs for sampling by the end of Q2 2008.  Desktop LGA715 variants, on the other hand, won't see mass production until the second half of 2008, with a target launch of Q4 2008.

Intel guidance slates LGA1366 Tylersburg chipsets for a Q3 2008 launch.  Desktop Havendale and Lynnfield chipsets using the LGA715 design are on the record for Q4 2008.


Comments

obsolete?
By xsilver on 11/28/2007 7:19:26 PM , Rating: 3
What's sad is that the shiny new X38 (or X48) motherboard will be truly obsolete in just 1 year.

How long can we expect S775 processors to last before being discontinued? Q2-Q3 2009?




RE: obsolete?
By KristopherKubicki (blog) on 11/28/2007 7:24:50 PM , Rating: 2
How long did LGA755 last? I would suspect about that long.


RE: obsolete?
By Anonymous Freak on 11/28/2007 11:22:55 PM , Rating: 2
There was no LGA755. LGA775 is Intel's first LGA socket, replacing the PGA Socket 478, which lasted from 2002 to 2005. LGA775 was released in 2004, and will last until late 2008. For reference, the original Pentium 4 socket, Socket 423, only lasted about two years. So Intel is getting better.

With luck, DDR3 will last a long time, allowing this new socket to last a while.


RE: obsolete?
By KristopherKubicki (blog) on 11/29/2007 10:13:15 AM , Rating: 5
Yeah sorry that was a typo


RE: obsolete?
By deeznuts on 11/29/2007 1:18:14 PM , Rating: 4
Man even YOU can't edit? Sucks.


RE: obsolete?
By KristopherKubicki (blog) on 11/30/2007 11:17:14 AM , Rating: 2
People tend to think through their post a little bit more when they realize they can't change it later


RE: obsolete?
By deeznuts on 11/30/2007 1:27:46 PM , Rating: 4
I sure don't. I'm at work, I need to switch screens as soon as possible! :D


RE: obsolete?
By TimberJon on 11/30/2007 4:13:11 PM , Rating: 2
haha. Running two screens and wearing 20 hats, there is never a lack of windows to help cover up something on my screen. I specifically cleared it with the higher-ups that I be allowed to browse DailyTech on a continuous basis with free rein because it is "industry related".

Granted! DT is the best.


RE: obsolete?
By Jedi2155 on 12/2/2007 1:18:03 AM , Rating: 2
I don't need to, I can even sleep at work!!!

Then again...it is a government job....


RE: obsolete?
By keitaro on 11/28/2007 7:39:40 PM , Rating: 2
It seems I always end up upgrading to the latest platform when the current CPU interface design is on its last leg. I upgraded when S939 was about to switch over to M2. It looks like I'll be going LGA775 just as LGA715 starts out the following year.

Here's one part that I'm curious about... with the DDR2 market as it is now, will the first run/generation of LGA715 processors be capable of handling both DDR2 and DDR3? I ask this because eventually DDR3 will come down in price and DDR2, much like DDR right now, will phase out. The part that bugs me is that if we wanted to use DDR3, we might need a new CPU (and maybe a new board too) rather than just ditching the old RAM and installing the new sticks. Not only would supporting both be less of a hassle, it would save folks like me a whole lot of trouble ditching old hardware just to use the current stuff.


RE: obsolete?
By xsilver on 11/28/2007 7:56:04 PM , Rating: 5
Here are 2 things that have been observed in the last 10 years of computing:

1) Companies prefer that you change out all your components because that means more sales for them. They don't engineer products to last long (one exception being socket A)

2) In hindsight, you must have been an incredible fortune teller to master the art of buying hardware at just the right time, in order to get some extended use out of it before it becomes obsolete with no upgrade path.


RE: obsolete?
By glomag on 11/28/2007 8:34:16 PM , Rating: 2
Ha, socket A! That's what I'm using in my main desktop PC right now. I do have a 2.2ghz turion based laptop though. By the time I can afford new hardware (i.e. AM2 right now) I'm too tempted to wait for the new stuff to come out. Then when it is released, surprise, I can't afford it. Maybe I should just stop whining and get a job. Anyone need me to build you a computer?


RE: obsolete?
By KingstonU on 11/28/2007 10:32:40 PM , Rating: 5
Sometimes it feels worse when you're like me and you work and have the money to buy a shiny new system that could play Crysis, but there's no point because you don't have the time to play it anyway :(


RE: obsolete?
By Bluestealth on 11/28/2007 11:47:15 PM , Rating: 2
I am thinking of selling my computers guts and getting a mATX computer...
1. I don't really play that many games, very often, and everything I really want to play is in development for a while
2. My enthusiast motherboard CANNOT get/stay in S3/S1 standby, when will this problem end :(
3. It bleeds power due to the idling video card, as well as having a nice loud hum
4. I hate my X-Fi and want to smash it into pieces, fix your damn software already


RE: obsolete?
By goku on 11/29/2007 12:44:41 PM , Rating: 1
I bet you're using vista, aren'tcha?


RE: obsolete?
By Bluestealth on 11/30/2007 1:42:28 PM , Rating: 2
No :( XP on my desktop, but Vista doesn't work either (Have tried RTM)... Gentoo Works with S1 sometimes. S3 on all OSes I have tested causes the computer to wake up after a few seconds and/or have video never come back up.

Still haven't gotten the X-Fi driver to install cleanly and work on Gentoo.

BTW I think all the standby/resume problems revolve around the 965P-DQ6 being a pile of garbage. I will NEVER buy another Gigabyte product. It is fairly stable but it cannot handle restarts/startups well, or use standby even when not being overclocked.


RE: obsolete?
By Treckin on 11/28/2007 11:52:22 PM , Rating: 2
Hear Hear!
I used to blow SOOO much time on computer games in high school. Starting with Doom 2, it went to Tribes, Sin, Half-Life, StarCraft, Diablo 2, etc. I can't believe that people still have the attention span and extra time to pump into that!


RE: obsolete?
By Crassus on 11/29/2007 10:36:29 AM , Rating: 2
I hear you. 10 years ago I was on top of all the hardware news and product releases, and I built a couple of systems - for other people ('cause I had no money, being a poor student and all).
Now I could afford a new system, but I'm not even close to making good use of my current gaming rig (S939 4200+, 7900GT). Heck, I haven't even attempted to o/c the 7900 so far, and I've had it for more than a year now. I'm currently trying to catch up on HL2 ... playing about 2 hours a week (on a good week). No point in getting a better system; I'd rather get a better monitor at work, where I spend 10+ hours a day. Man, that's so sad ... (/whining)


RE: obsolete?
By Spuke on 11/29/2007 11:34:35 AM , Rating: 2
I'm not up on all the details myself but I try to keep current. I just upgraded my system to a X2 6000. Seems upgrades are coming every 3 years for me. I'll buy Crysis when I get my 8800GT. Right now I'll stick to the demo as I'm not gaming much anyways (maybe every 3 to 4 months I'll play for 30 minutes to an hour). When I'm finally done with school, then I'll play more. Right now it's just not that important.


RE: obsolete?
By Etern205 on 11/28/2007 10:59:44 PM , Rating: 2
From the looks of it, if you want your motherboard socket to last for some time, you will have to get it when it first comes out. But of course Intel will still screw you with their change of chipsets in order to support newer CPUs.


RE: obsolete?
By Screwballl on 11/29/2007 11:11:57 AM , Rating: 2
I find that buying during the first big price drop allows for a good upgrade time. 775 started during the Pentium 4 days so it is a bit outdated even now. The DDR3 and new chipsets just lengthened its life another year or two. I myself built one in March with an E6600, X1950GT (PCIe) and P965 chipset, but I tend to build a new one every 2-2 1/2 years. This allows the current computer to still be usable and able to run much of the current software for around 4 years. I am a gamer on a budget, but I know my previous build (AthlonXP 3000+, 1GB DDR, 9600XT AGP8X) is still very capable for many games, just not the latest ones. Even this E6600 can be strained with many brand new games like Crysis and COD4 (but this is not a problem as I refuse to step into WindowsMe ver2 aka Vista).
As technology progresses, I find it is best to build a new system with close to top of the line (depending on your budget) every 18-24 months. In my case the E6600 system is used for games (under XP) and every day usage (under Fedora Core 8 x64), and I use my XP3000+ system for my work from home job (XP). The wife is happy with her P4 1.8GHz with 512MB PC800 RDRAM (a previous build).
It all depends on your personal usage as to when you should build or buy. If your AthlonXP system is still running great and you have no need for DirectX 10 or high-end games/software, then there should be no need to upgrade. Even if you choose to, you can still get an X1950GT AGP video card to extend the life of the system another year.


That's a lot of pins for QuickPath
By Micronite on 11/28/2007 7:13:56 PM , Rating: 2
1366 vs 715... Just a QuickPath difference?
I think there's more here than meets the eye.
Didn't I read elsewhere that there may be versions with 3 DRAM channels, 2 DRAM channels, or 1 DRAM channel? Perhaps that's the reason for the huge difference in pins.




RE: That's a lot of pins for QuickPath
By KristopherKubicki (blog) on 11/28/2007 7:16:55 PM , Rating: 2
Both LGA1366 and LGA715 will use QuickPath. The additional memory is going to add to that pincount, but a lot of it is point-to-point communication with the other CPUs.

Remember, as it is right now, socket-to-socket communication is done via the Northbridge.


RE: That's a lot of pins for QuickPath
By GeorgeOrwell on 11/28/2007 7:26:11 PM , Rating: 2
Is the CPU going to talk to the GPU via QuickPath as well?


RE: That's a lot of pins for QuickPath
By KristopherKubicki (blog) on 11/28/2007 7:32:23 PM , Rating: 2
No. The GPU will connect to the Northbridge via PCIe, which then talks to the CPU. I guess that will change a bit with Larrabee but by then the GPU will be on the CPU die.


RE: That's a lot of pins for QuickPath
By DerwenArtos12 on 11/29/2007 5:53:15 AM , Rating: 2
Is Intel still going to call it a northbridge? Rather, my question is: will they still use a two-chip chipset once they move the memory controller onto the CPU? Semantics, I know, but the changeover to a single chip (in most applications) changed board layout a lot, and with Intel still trying to wedge in BTX, couldn't this present a GREAT opportunity to really mess with us?


By cheburashka on 11/29/2007 2:36:19 PM , Rating: 2
They haven't called it a northbridge for a while now. Currently it is a CPU, (G)MCH, and ICH. In the future it will be (G)CPU and PCH.


By PlasmaBomb on 11/29/2007 2:13:13 PM , Rating: 2
According to this they are -
http://pc.watch.impress.co.jp/docs/2007/1127/kaiga...

I guess we will have to wait and see :)


Haverdale?
By jhtrico1850 on 11/28/2007 7:28:19 PM , Rating: 2
Isn't it Havendale?




RE: Haverdale?
By jhtrico1850 on 11/28/2007 7:31:18 PM , Rating: 2
http://pc.watch.impress.co.jp/docs/2007/1128/kaiga...
Also, Havendale and Lynnfield look like CPUs not chipsets.


RE: Haverdale?
By KristopherKubicki (blog) on 11/28/2007 7:41:04 PM , Rating: 4
Correct, it is Havendale instead of Haverdale.

The distinction between GPU and chipset is going to get very blurry in 2008 / 2009. To the best of my understanding Havendale is the collective name for the entire MCM package, which includes a GPU and the Nehalem CPU. PC Watch includes the memory controller on the "GPU side" of the processor, but they also list the CPU as an LGA1160 while my stuff still says LGA715.

He's generally pretty good with those details though. Mine don't line up with his, but it could be he's looking at older docs or vice versa.


RE: Haverdale?
By imperator3733 on 11/29/2007 4:56:21 PM , Rating: 2
It seems to me that LGA1160 makes more sense, since with LGA775 there would just be a 64-bit (right?) FSB to the northbridge. With these CPUs, there would be either a 64/128/192-bit memory controller, plus the QuickPath link(s), plus some extra pins for future use. I don't know how many extra pins LGA775 actually has, but it just doesn't seem to me that they could fit all that new stuff in a socket with fewer pins. LGA1160 would also give Intel the choice to later add quad-channel (256-bit) memory controllers to future CPUs.
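A quick back-of-the-envelope calculation illustrates that reasoning. The Python sketch below is purely illustrative: the per-interface figures (data bits and assumed command/address overhead per DDR3 channel, an assumed signal width for a QuickPath link, a flat allowance for miscellaneous pins, and a rough power/ground ratio) are guesses made for the exercise, not Intel specifications. The only point is that each extra memory channel or point-to-point link adds pins quickly.

# Rough, illustrative pin-budget estimate. Every per-interface figure below is
# an assumption made for this sketch, not an Intel specification.

DDR3_DATA_BITS = 64        # data width of one unbuffered DDR3 channel
DDR3_OVERHEAD = 60         # assumed strobes, command/address, control, clocks
QPI_LINK_PINS = 90         # assumed signal pins for one full-width QuickPath link
MISC_PINS = 60             # assumed debug, clocking, reserved/"future use" pins
POWER_GROUND_RATIO = 1.0   # assumed: roughly one power/ground pin per signal pin

def estimate_socket_pins(memory_channels, qpi_links):
    """Return a rough total pin count for a hypothetical CPU socket."""
    signal = (memory_channels * (DDR3_DATA_BITS + DDR3_OVERHEAD)
              + qpi_links * QPI_LINK_PINS
              + MISC_PINS)
    power_ground = int(signal * POWER_GROUND_RATIO)
    return signal + power_ground

for channels, links in [(2, 1), (3, 1), (3, 2)]:
    print("%d DDR3 channel(s), %d QPI link(s): ~%d pins"
          % (channels, links, estimate_socket_pins(channels, links)))

With those made-up numbers, two channels plus one link lands near 800 pins and three channels plus two links near 1,200, the same order-of-magnitude spread as the socket figures being debated in this thread.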


HDD
By VitalyTheUnknown on 11/28/2007 9:24:57 PM , Rating: 1
Question for the tech savvy.
With the introduction of new sockets from Intel and AMD (new motherboards, CPU, RAM), does it automatically mean that new HDDs will show up and replace existing SATA? Because I had so much trouble when SATA took over IDE/PATA, that would be another headache. No more converters for me.

Feel Free To Correct My Grammar




RE: HDD
By shabby on 11/28/2007 10:19:20 PM , Rating: 2
No, sata is here to stay.


RE: HDD
By Spivonious on 11/29/2007 12:45:02 PM , Rating: 2
There were lots of chipsets that supported both PATA and SATA. It was just with the 965 series of chipsets that Intel dropped the PATA support from the southbridge.

There is nothing on the horizon that will replace SATA. And it took about 5-6 years before PATA started to disappear, so don't worry. :)


RE: HDD
By VitalyTheUnknown on 11/30/2007 4:23:17 PM , Rating: 2
Thank god, I am glad to hear that.


Hooray for Intel!
By amanojaku on 11/28/2007 8:15:50 PM , Rating: 4
Unlike most fanboys, this AMD fanboy hopes Intel continues to innovate. AMD was doing well for a while and got lazy. The new Intel CPUs will force AMD to wake up or pack it up, and I doubt it'll pack it up.

I think AMD's R&D puts out better CPUs when you compare its R&D budget to Intel's, but it's the overall performance that people really care about. Barcelona looks hot, especially the 2360SE. If AMD can do the same for the desktop, then make another leap like it did when the Athlon was first introduced, we'll have hot competition again. In the meantime, I'm looking for my quad-core Phenom, because someone's gotta pay to get K11 to market!




RE: Hooray for Intel!
By Spivonious on 11/29/2007 12:46:19 PM , Rating: 2
How do you measure the quality of a CPU if not by how it performs?


please change your socket design!
By Etern205 on 11/28/2007 10:47:59 PM , Rating: 2
Back in the days when the pins were located on the processor, users were advised to handle them with great care, because if a pin was bent (it's really hard to straighten it back out, even with the lead pencil trick) or broken off, then you're toast.

When the LGA CPUs came out I was extremely happy that the CPU became pin-less, but I had never thought about the pins now being built into the socket. And here lies the same exact problem, only this time it's the other way around. I find this annoying because, let's say you got a board online and when it arrived you noticed a bent or damaged pin on the socket. Now when you try to return it, the company may deny it because they think all of their products shipped out are free of defects and you physically damaged it yourself.

Since Intel is coming out with a new socket, it would be great if they could redesign the socket instead of using this pressure-point type system.

I was thinking about something like this: a socket where the pins are solid cylinders, or solid pyramids with the tops cut off so they're flat.

Another thought: with the speed of PCIe 2.0, would it help if they created a slot-based socket, like the Slot 1 design Intel used for their Pentium II and some of their Pentium IIIs?




By DerwenArtos12 on 11/29/2007 6:07:28 AM , Rating: 4
As for creating solid pins on the socket: it would create a lot of problems with seating the CPU properly, not when first installed, but while it is being held in place by whatever retention system. The current designs squeeze the package against the socket while the pins stay suspended; a hard surface would be much more stressful on the package, and I would almost guarantee the package would warp.

Slot designs are simply inefficient, both thermally and cost-wise. Thermally, because they tend to create a dead spot for air and because you simply cannot secure the card well enough to attach a large enough heat sink. Cost-wise, because a socket design is much smaller, requiring less silicon, and because with the number of leads continually going up the cards would have to be MASSIVE to allow for enough leads. For comparison, PCI-E x16 has just over 80 total leads; imagine trying to make that 715 or 775 or 940, and that's not even server processors. How about 1300 leads for server CPUs? Just completely infeasible.


Correct me if I'm wrong please.
By spidey81 on 11/29/2007 7:49:26 AM , Rating: 2
As someone who is always trying to help out the little guys, I've been trying to find reasons to upgrade to AMD when it comes time to build a new machine in the spring. This appears to be one reason to go with AMD, as I don't believe they're scheduled to move to a new socket until 2010 with "Bulldozer". So the current Socket AM2/AM2+ design will give your motherboard/processor choices about a year more of life, if you will. But like I said, please correct me if I'm wrong.




RE: Correct me if I'm wrong please.
By imperator3733 on 11/29/2007 4:49:18 PM , Rating: 2
Socket AM3 is coming out in mid 2008 with the 45nm processors and supports DDR3. AM3 processors are supposed to be backwards compatible with AM2+ (they have both DDR2 and DDR3 memory controllers), however, so you could always get an AM2+ motherboard and CPU now and later upgrade the CPU to a 45nm/DDR3 one.


By Black69ta on 12/2/2007 4:18:25 PM , Rating: 2
How is AM3 backward compatible with AM2+? The memory controller is on-die, so maybe that is not an issue. Both DDR2 and DDR3 have 240 pins but are keyed differently, so there would have to be different memory sockets on the board, right? Also, with the controller on the CPU, doesn't the different signaling and timing between DDR2 and DDR3 mean you couldn't have the same CPU socket using the same traces for the different standards?


summary
By ilkhan on 12/3/2007 12:08:39 PM , Rating: 2
So from what I'm reading here, Intel will have 3 Nehalem desktop sockets. Or do we just have conflicting reports?
LGA1160 desktop with onboard graphics
LGA715 desktop w/o onboard graphics
LGA1366 multi-socket (w or w/o onboard graphics?)




RE: summary
By ilkhan on 12/3/2007 12:31:40 PM , Rating: 2
Additional research before posting appears to be a good thing :/

wiki says that
LGA1567 is the MP server socket (4 QPI, 4xDDR3)
LGA1366 is the server DP and bloomfield socket (1-2 QPI, 3xDDR3)
LGA1160 is the low end (1x QPI w and w/o int graphics, 2xDDR3)

The same socket being used for DP and Bloomfield surprises me. Also surprising is no desktop-grade octal-core chip. They appear to only be using memory channels as the differentiator of sockets.

Which article is right? I don't know. But those are the 2 summaries I can find.


LGA 775 isn't going away
By BSMonitor on 11/29/2007 10:00:01 AM , Rating: 2
At least not in November. As with the Core 2 release (Pentium Ds lasted well into 2007), Intel will continue with new Core 2 parts well into 2009. Who knows, by then they may be selling 3.4 - 3.6 GHz quad-core Penryns.

Essentially, you will have to choose between the two platforms. Penryn won't go away as quickly. Pentium D's were garbage, so there was no place for them in the market. But Penryn Core 2's are not. Expect there to be a place for them even after Nehalem is released.




Old memories
By Strunf on 11/29/2007 10:51:08 AM , Rating: 2
"This LGA design is recognized by both AMD and Intel for its ability to increase "pin" density and durability."

Don't you guys somewhat smile when you read this kind of sentence and think about all those bashing Intel when they first introduced the LGA socket?... Ya, sure, Intel is not always right, just like AMD, but it's such fun when time proves people so wrong.

My next hope is that Ageia goes down the pipe so I can laugh at a few guys I know in real life...




Socket confusion
By System48 on 11/29/2007 11:24:38 AM , Rating: 2
From what I have read in the past, LGA1366 will be Bloomfield with the integrated memory controller and LGA715 will be Bloomfield without the memory controller. Looking at the Japanese website it's not entirely clear exactly what LGA1160 is for. The diagrams show the PCIe link coming from the CPU directly whereas on the LGA1366 they go through the IOH. It's almost as if it has the memory controller plus the IOH, or at least part of the IOH. In the LGA1160 config it looks like there would only be a single PCH chip that will probably provide the same functions as the ICH does. It might be that LGA1160 is geared towards a mobile platform, with the CPU/GPU/IOH packaged together.




"I f***ing cannot play Halo 2 multiplayer. I cannot do it." -- Bungie Technical Lead Chris Butcher
