
DDR3 production by the end of the year

Previously, DailyTech reported that DDR3 DRAM produced on 50nm production lines would be much cheaper and faster than chips made on current lines.

Elpida, Samsung, and even beleaguered Qimonda are either in the process of or planning to transition to these new geometries in order to lower costs and increase profits. DDR2 DRAM is currently selling at below the breakeven point for most DRAM manufacturers.

Samsung, the world's top DRAM producer, is announcing that it has developed a 1GB DDR2 SODIMM using 1Gb DDR2 DRAM built on a 40nm process. The process size refers to the average half-pitch of a memory cell. A smaller die means that more dies can fit on each silicon wafer, reducing production costs. Samsung also expects power savings of 30 percent from the lower voltages the smaller geometry makes possible.
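The cost and power arithmetic behind these claims can be sketched in a few lines of Python. The die areas below are hypothetical, not figures Samsung has published; the dies-per-wafer formula is the standard first-order estimate, and the voltage figure follows from dynamic power scaling with the square of supply voltage:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """First-order dies-per-wafer estimate (ignores defects and scribe lines)."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Hypothetical 1Gb die on a 300mm wafer. Shrinking from 50nm to 40nm
# scales linear dimensions by 40/50, so die area scales by (40/50)**2 = 0.64.
old_area = 50.0                         # mm^2, illustrative only
new_area = old_area * (40 / 50) ** 2    # 32 mm^2

print(dies_per_wafer(300, old_area))    # dies per wafer at 50nm
print(dies_per_wafer(300, new_area))    # roughly 1.6x as many dies at 40nm

# Dynamic power scales roughly with V^2, so a 30 percent power saving
# corresponds to a supply voltage about 16 percent lower:
print(round(math.sqrt(0.7), 3))         # voltage scale factor, ~0.837
```

Under these assumptions the shrink yields roughly 1.6 times as many dies per wafer, and a 30 percent power saving is consistent with dropping the supply from DDR2's 1.8V toward DDR3's 1.5V (1.8 × 0.837 ≈ 1.5).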

A smaller size also means that higher memory densities can be introduced, such as Samsung's 4Gb DDR3 chips. These can be used to produce 8GB DIMMs and SODIMMs. Samsung said it plans to apply its 40nm technology to develop a 2Gb DDR3 device for mass production by the end of 2009, lowering DDR3 prices even further.

Intel will be launching its mainstream Nehalem products in the third quarter of this year. Intel will also be using DDR3 exclusively on its 32nm Westmere CPUs, while AMD's Socket AM3 platform, which uses DDR3, is coming out soon.

With DDR2 prices depressed, it makes economic sense for Samsung to concentrate on ramping DDR3 production on 40nm. With economies of scale, the price premium of DDR3 could drop from 100% to 10% by the time Lynnfield and Windows 7 launch together in Q3.

The 40nm process can also be adapted for a new generation of GDDR5 chips, which will be used with AMD and NVIDIA's DirectX 11 GPU parts to produce cheaper and faster video cards.

Samsung is also currently researching 32nm process technology for future DRAM production.



We're slowly reaching the point...
By quiksilvr on 2/5/2009 7:55:22 PM , Rating: 2
When computers will no longer have moving parts. Movies are getting digitized, slowly but surely making the optical drive obsolete within a few years. HDDs are giving way to SSDs. Smaller CPUs and GPUs are making the cooling fan less of a necessity, and passive cooling is becoming more effective. It's crazy to think that just 10 years ago we were psyched about the Pentium 3 processor.

RE: We're slowly reaching the point...
By Karma007 on 2/5/2009 8:40:11 PM , Rating: 1
Well, for desktops, fans will always be there...... I think.....

RE: We're slowly reaching the point...
By gerf on 2/5/2009 9:08:35 PM , Rating: 2
There are fan-less PSUs, and there have been desktops built with no moving parts for quite some time.

RE: We're slowly reaching the point...
By Murloc on 2/6/2009 7:09:46 AM , Rating: 2
High-end and cheap PCs will always use fans.

RE: We're slowly reaching the point...
By CommodoreVic20 on 2/6/2009 9:24:03 AM , Rating: 3
Will always have fans?

So 100 years from now you think desktops will still be clunking boxes with whining mechanical fans spinning in them?

By omnicronx on 2/6/2009 10:23:59 AM , Rating: 2
Could be, you never know. Active cooling, whichever method is employed, will probably always be more effective than passive cooling. Will we get to the point where the vast majority of users have fanless computers? Maybe, but active cooling will always have a place in high-performance computing.

RE: We're slowly reaching the point...
By foolsgambit11 on 2/6/2009 5:28:08 PM , Rating: 2
So 100 years from now you think there will still be desktops, whining mechanical fans or no?

RE: We're slowly reaching the point...
By TheFace on 2/7/2009 2:04:33 AM , Rating: 2
You think 100 years from now there will still be DESKTOPS?

By lagomorpha on 2/8/2009 7:38:48 PM , Rating: 3
You think 100 years from now there will still be PEOPLE?

RE: We're slowly reaching the point...
By SilentSin on 2/5/2009 9:23:27 PM , Rating: 3
Fans might become more of a secondary cooling source rather than the primary one in the not-too-distant future. They might not be used in the traditional sense they are now on an HSF, anyway.

The problem with ever-decreasing transistor size and increasing density is that there is less surface area for a traditional heatsink to actually do its job. Barring some major engineering feat, yet unseen, that allows much more efficient transistor operation with less heat created, there will need to be a change.

There will be a point where the heat density on a chip simply cannot be cooled with a flat piece of copper glued on with some thermal paste. There won't be enough energy transferred fast enough to the base plate of the heatsink, and the chip will roast itself. That is where new designs come in, such as microchannel cooling and other exotic physics magic like ionic wind (yes, the very same tech behind those $500 air "purifiers", just on a nano scale). Those will help cool these chips much more directly and more efficiently. The problem is they are expensive to implement and must be placed on the actual packaging or silicon, which isn't so easy.

I think the necessity of such solutions is a pretty long way off, and we may yet see some clever tricks that allow the more traditional HSF to persist, but those technologies will probably see the light of day in some shape or form.

RE: We're slowly reaching the point...
By rudolphna on 2/5/2009 10:11:35 PM , Rating: 2
I think for higher performance and even midrange systems fans will have to be used for a long time. There is no other effective way to move air through the heatsink. You can have a good heatsink, but if there is no airflow to carry heat away from it... it just heats up and heats up till it crashes.

RE: We're slowly reaching the point...
By masher2 on 2/5/2009 11:14:36 PM , Rating: 3
Computers ran with nothing but passive heatsinks for many years... some early models didn't even have heatsinks at all. Even today, the Via mobos that run my home automation system are fanless... and even a high-end system can be passively cooled, if one uses large enough heatsinks.

Within 2 more process nodes, your average midrange cpu should have power consumption well below 10watts, enough to easily be cooled passively.

RE: We're slowly reaching the point...
By ChoadNamath on 2/6/2009 12:20:44 AM , Rating: 1
Within 2 more process nodes, your average midrange cpu should have power consumption well below 10watts, enough to easily be cooled passively.
Sure they should, but going by CPU makers' track records, 22nm processors will just have 16 cores and 32 MB of cache that wipe out any power savings.

By masher2 on 2/6/2009 11:21:24 AM , Rating: 3
There will certainly be 16-core processors within 2-3 process nodes... but I don't see them being used in even midrange systems. I think we'll begin seeing larger and larger differentiations between the server and desktop/notebook CPU spaces.

RE: We're slowly reaching the point...
By JonnyDough on 2/6/2009 4:24:58 PM , Rating: 2
You're dead wrong. Engineers can turn off cores that aren't in use. Chips are becoming more powerful AND more energy efficient. Just because something has more cache and more cores does not mean that it will use more energy. At full load, yes. But the amount of work per watt consumed is steadily increasing. Chips are consistently becoming MORE efficient, and are requiring less power despite the gains in cores and cache. Why? Because once a core on an energy-efficient, smaller manufacturing process is done doing its job, it powers down.

By foolsgambit11 on 2/6/2009 6:26:28 PM , Rating: 2
All true statements, but irrelevant to the discussion at hand. The performance per watt doesn't mean much when you're dealing with a cooling solution. You need to be able to dissipate the heat generated by the processor under full load, otherwise you won't be able to run the processor at full load at all. Then, what's the point in having all that power if you can't use it (except maybe instantaneously)? So the cooling solution must be matched to peak heat output.
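The point about sizing for peak rather than average output can be made concrete with the basic junction-to-ambient thermal model, T_junction = T_ambient + P × θ. The wattages and temperature limits below are illustrative assumptions, not figures from the thread:

```python
def max_thermal_resistance(tj_max_c, t_ambient_c, peak_power_w):
    """Largest junction-to-ambient thermal resistance (C/W) a cooler may
    have so the chip stays under tj_max_c at sustained peak power."""
    return (tj_max_c - t_ambient_c) / peak_power_w

# A 95W CPU held under 75C in a 35C case needs a cooler around 0.42 C/W,
# which is large-heatsink-plus-fan territory; a 10W chip only needs 4 C/W,
# achievable with a small passive finned heatsink.
print(max_thermal_resistance(75, 35, 95))
print(max_thermal_resistance(75, 35, 10))
```

Note that average power never enters the calculation: the cooler must be chosen so the worst sustained case stays within limits.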

RE: We're slowly reaching the point...
By Alexstarfire on 2/6/2009 12:21:00 AM , Rating: 1
Except that would still be very stupid. Passive coolers run at quite high temperatures, which not only reduces the life of the chip but also means you can't OC as far. In my mind a high-end system is going to be OCed, and you'd be foolish to OC on a passively cooled anything.

RE: We're slowly reaching the point...
By omnicronx on 2/6/2009 10:29:52 AM , Rating: 2
I think it is time for you to leave your bubble and realize only a small percentage of people (perhaps a fraction of a percent) actually OC their computers.

If you haven't noticed the whole netbook revolution going on, we have kind of reached the peak in how much power the average user needs. Unless some new use for computers is on the horizon in which large amounts of power are needed, I see nowhere to go but smaller, more efficient computers.

Don't get me wrong there will always be a place for the power user, but you are certainly not the majority.

By Reclaimer77 on 2/6/2009 5:46:11 PM , Rating: 2
Don't get me wrong there will always be a place for the power user, but you are certainly not the majority.

So because we're not the majority we don't count?

He's right you know. There isn't going to be some breakthrough in passive cooling that will become the standard, BECAUSE it would leave overclockers in the cold.

Even if I DIDN'T OC I wouldn't want passive cooling. Passive cooling ROASTS the inside of your case, and all the components, especially the CPU, are consistently subjected to much higher temps. All for what? Having a "silent" PC? Fans aren't loud anymore, they just aren't. If you do some research and spend a few bucks, you can get fans that pull more CFM while running silent AND slower.

I love how everyone has a crystal ball that somehow they know PC's will start to use such little power they won't need to be actively cooled any longer. Totally ignoring the fact that some of the biggest breakthroughs in modern PC's have been IN cooling technology. Just a few short years ago nobody had heatsinks with high fin density and heatpipes. Now they are standard.

Unless some amazing breakthrough takes place in CPU technology, I can't see the day anytime soon when CPU efficiency will offset its under-load temps to the point where passive cooling can be standard for all users.

By SilentSin on 2/6/2009 12:23:14 AM , Rating: 2
Within 2 more process nodes, your average midrange cpu should have power consumption well below 10watts, enough to easily be cooled passively.

I'm not so convinced about that. As long as software keeps pushing hardware to the point where people need to upgrade (that's the whole reason there is a computer hardware business) there will always be a need for more powerful processors. That means more functionality/transistors/power. For example, compare the 65nm Conroe cores to the 45nm i7. Far more performance and functionality offered with the i7, but at the expense of similar or higher TDPs, regardless of the process reduction. The Prescott cores are another, older example.

Even through multiple process nodes from 130 down to 45nm there have been a few mainstay power envelopes that CPUs have stuck with. Within 2 process nodes (22nm or thereabouts) there may be CPUs that can run most anything we have today with only 10W, but that's not that much different than it is right now with Atom and VIA's offerings either. I think there will be a place for the 65W+ CPUs for some time to come. There may be more options below that threshold that come to market, but the space heaters won't go the way of the dinos yet.

By ekv on 2/6/2009 2:15:59 AM , Rating: 2
Effectively I disagree that a high-end system can be passively cooled, however, an average midrange computer can be. Depending on your budget Zalman has some interesting options. To wit,
Zalman TNN300AF Noise Free Multimedia Micro Tower Case w/350W Fanless PSU, $699.
The added bonus is that fan noise doesn't interfere with the movie on the HTPC. [Keep in mind, for this kind of money you'd be better off getting an XBox/360 and a NetFlix account. Unless you're driving a 56" plasma display...]

I've been reading about Intel's roadmap and various other semiconductor industry sources ... and projected power consumption is about what you suggest -- CPU, memory, SSD, DVD (Blu-ray), LCD display (LED backlight) etc. Of course, the fly in the ointment is the GPU. It would have to be seriously bastardized to fit under a 10W regime, ie. not quite "midrange".

I agree about Via mobos. They're great for what they do and are damned well a lot cheaper than Zalman 8)

By AnnihilatorX on 2/6/2009 3:35:12 AM , Rating: 2
When transistor sizes shrink toward quantum dots, the holy grail of electronics, power consumption will be at least a thousand times less than with current transistors.

By Targon on 2/6/2009 8:20:30 AM , Rating: 2
A possible solution to the lack of surface area might be for the CPU to have pins that extend out of the top of the chip, which a cooling system could connect into. Most if not all of us are used to the CPU having pins which go into a motherboard, so why not pins on the top where a heatsink or other cooling system would connect? The density of these pins (which could even be heat pipes or something similar) might allow for the necessary level of cooling.

I am not saying that would be the best solution (or even a good solution), but it is an idea. The problem as I see it is that the increasing amount of cache on processors will probably more than compensate for the process shrink. So, the actual part of the CPU which handles instructions and calculations may get to the point where the heat generated may not be the issue, but the cache would take up too much room to allow for easy cooling.

RE: We're slowly reaching the point...
By Totally on 2/6/2009 3:34:59 AM , Rating: 2
I'd prefer to keep optical drives because their media is non-volatile. I think they'll be here until something else that's just as reliable comes along.

By mmcdonalataocdotgov on 2/6/2009 1:01:26 PM , Rating: 2
Well, optical storage is volatile, but it's a longer-lasting solution than SSD.

By Sunday Ironfoot on 2/6/2009 8:47:42 AM , Rating: 2
Interesting point. But with CPUs and GPUs, manufacturers have a tendency to keep adding more features, cores, cache, etc. as transistors get smaller, wiping out any potential power efficiency gains. We'll still have fans for some time to come.

But hard drives and optical drives, yes I can see those going, with software companies using USB drives for distribution, or the internet. Installing Windows 7 from a USB pen drive, that would be cool!

By mmcdonalataocdotgov on 2/6/2009 1:00:05 PM , Rating: 2
Monitor and PC on/off switches, mice and keyboards have moving parts, too. But I get what you mean. Other than that, all solid state with no electro-mechanical parts.

It's funnier to think that 20 years ago I was psyched to get a 386SX.

Windows 7
By xenos123 on 2/6/2009 9:57:19 AM , Rating: 2
Windows 7 launch together in Q3.

Wait, what? Last I heard was Q1 2010.

price drop
By Kinshinlink on 2/6/2009 4:58:56 PM , Rating: 2
100% down to 10%? I guess I'll hold off on a RAM upgrade.
