Modules using Samsung's latest 3Xnm process are expected later this year

Lower power due to smaller process

DDR3 DRAM has just overtaken DDR2 as
the predominant memory technology used in today's new computers.
Newer RAM has traditionally been more expensive than previous
generations, but DDR3 pricing has come down over the last few years
thanks to mass production and die shrinks to smaller process nodes.
Shrinks bring not only price cuts but also lower power consumption
and higher attainable speeds.

Samsung only began producing DDR3 on its 40nm
process last year, but is already working on its next
node. The company describes it as 30nm-class; it is generally
understood to be around 32nm. The process size
refers to the average half-pitch of a memory cell. A smaller die
means more dies fit on each silicon wafer, reducing production
costs. The company estimates the new chips will improve its
cost-efficiency per wafer by 60 percent.

The new 2Gb
chip can be used to create power-efficient 4GB modules operating at
1.35 volts. Samsung expects power savings of 30 percent compared to a
similar chip produced on a 50nm process, with a 4GB module consuming
only three watts per hour when used in a newer-generation notebook.

“Our accelerated development of next generation 30nm-class DRAM
should keep us in the most competitive position in the memory
market,” said Soo-In Cho, President of Samsung Electronics' Memory
Division.

Mass production of the new chips is expected to start in the
second half of the year, with volume ramping up for the busy holiday
shopping season.

Electronic devices DO consume Watts per hour. THAT is common knowledge. They don't consume Watts, that is only a measurement of how quickly they consume Joules. How many Watts (or more specifically kW) they consume in an hour (i.e. per hour) is a quantity of energy COMMONLY used by energy companies.

3 Watts in one hour = .003 x Price you pay per kW-hr
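That back-of-the-envelope cost calculation can be sketched in Python. The $0.12/kWh rate below is just an assumed example rate; plug in whatever your utility charges:

```python
# Cost of running a 3 W device for one hour at an assumed
# electricity rate (rates vary; $0.12 per kWh is an example).
power_watts = 3.0
hours = 1.0
rate_per_kwh = 0.12  # assumed example rate, USD per kWh

energy_kwh = power_watts * hours / 1000  # 3 W for 1 h = 0.003 kWh
cost = energy_kwh * rate_per_kwh

print(f"{energy_kwh} kWh")  # 0.003 kWh
print(f"${cost:.5f}")       # $0.00036
```

At typical rates, the module's hourly running cost rounds to a small fraction of a cent.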

One of those moments where something isn't necessarily incorrect, but doesn't need all that wording.

Watts is Joules/second, which is already an energy/time scale.

We say our computers use 200/300/400 Watts, not Watts per hour, because in the end we're multiplying Watts by hours to get total Joule (energy) usage.

Utilities charge by the kW-hr because that unit works out to raw Joule usage, but when we talk consumption we just talk rates, in which case Joules/second, or Watts (by itself), is more than sufficient.

Applying some dimensional analysis here (fancy term for making sure that the units being compared are relevant):

The unit called "Watt" is equivalent to Joule per second. To find out how much energy (in Joules) is being consumed for a given period of time, we convert that period into a quantity of seconds, and multiply it by the power rating (which is in Watts).

The unit called "kilowatt-hour" is equivalent to 1000 W * 3600 s, since 1 kW = 1000 W and 1 hour = 60 min = 3600 seconds. Therefore, 1 kWh is equivalent to 3,600,000 Joules, or simply 3.6 MJ. The point is that kWh and J measure the same quantity (energy), just expressed differently.

The reason utilities like to use kWh instead of J is that a Joule is a pretty small quantity. Since humans in this post-modern age guzzle electricity like Europeans guzzle beer and Americans guzzle debt, using kWh keeps the numbers manageable for everyone.

I hope I've made things clear for the layman. Good day!
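The dimensional analysis above can be checked with a small Python sketch (using the article's 3 W module as the worked example):

```python
# A kilowatt-hour is just 1000 W * 3600 s, so it converts
# directly to Joules.
WATTS_PER_KW = 1000
SECONDS_PER_HOUR = 3600

joules_per_kwh = WATTS_PER_KW * SECONDS_PER_HOUR  # 3,600,000 J = 3.6 MJ

def joules_consumed(power_watts, seconds):
    """Energy in Joules = power (J/s) * time (s)."""
    return power_watts * seconds

# A 3 W module running for one hour:
print(joules_consumed(3, 3600))                   # 10800 (Joules)
print(joules_consumed(3, 3600) / joules_per_kwh)  # 0.003 (kWh)
```

Both numbers describe the same hour of operation; one is in Joules, the other in the billing unit.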

I think this is what you were saying, but I found this post very confusing at first.

It's not a kW/hr, it's a kW*hr. It's not a kW per hour; it's a way for the power company to tell you you have used 1000 W for 3600 s, or 3.6 MJ of energy. So a 100 W light bulb, run for 10 hours, will consume a kW-hr of energy. Power is Watts, energy is Joules. Power = Energy / Time.
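That light-bulb example as a minimal Python sketch:

```python
# Power is a rate (W = J/s); energy is that rate times time
# (J, or kWh for billing purposes).
def energy_kwh(power_watts, hours):
    return power_watts * hours / 1000

kwh = energy_kwh(100, 10)  # a 100 W bulb run for 10 hours
print(kwh)                 # 1.0 kWh
print(kwh * 3.6e6)         # 3600000.0 J, i.e. 3.6 MJ
```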