



Graphene is an unusual single-atom-thick carbon semiconductor.  (Source: io9)

Researchers measured the temperature of a working graphene transistor for the first time using atomic force microscopy. The results were surprising -- the material significantly self-cools.  (Source: Alex Jerez, Beckman Institute for Advanced Science and Technology)
Future computers may not need a heat-sink -- their thermoelectric properties result in a net cooling effect

Heat is a sad fact of life for current-generation electronics.  Any Android, iPhone, or BlackBerry user can tell you that smartphones tend to get pretty hot at times.  And by today's standards, a balmy 85 degrees Celsius, while hot enough to cook an egg, is a pretty "good" operating temperature for a high-powered PC graphics processing unit.

But that could all soon change, according to the results of a new study by researchers at the University of Illinois.  Examining graphene transistors, a team led by mechanical science and engineering professor William King [profile] and electrical and computer engineering professor Eric Pop [profile] made a remarkable discovery -- graphene appears to self-cool.

I. What is Graphene?

Graphene is somewhat like a miniature "fence" of carbon.  The material consists of a single-atom-thick layer composed of hexagonal units.  At each point of the hexagon sits a carbon atom bonded to its three nearest neighbors.

The material behaves like a semiconductor despite being composed entirely of carbon.  It offers remarkable performance at an incredibly small scale, so the electronics industry views it as a potential material to power the electronic devices of the future.

A variety of methods exist for producing graphene.  The earliest was an exfoliation technique that involved stripping individual graphene layers off a piece of graphite (the material found in pencil lead) -- as of 2008, this technique cost as much as $100M USD to produce a single cubic centimeter of material.  However, rapid advances have allowed manufacturers to scale up to the point where exfoliated graphene can now be produced by the ton.

Other techniques promise to drop the price even further.  One method, epitaxial growth on silicon, cost $100 per cubic centimeter in 2009.  Its obvious limitation is that it requires a silicon substrate, which eliminates some desirable properties like flexibility.  South Korean researchers have tested another promising method, nickel metal transfer.

Graphene is fascinating from a physics perspective.  In 2005, physicists at the University of Manchester and Philip Kim's group at Columbia University demonstrated that the quasiparticles inside graphene behave as massless Dirac fermions.  These unusual particles help give rise to the material's unique characteristics.

II. Graphene as a Self-Cooling Device

Despite the extreme interest in the material, a great deal of mystery still surrounds graphene.  Because it is so extremely thin, it is difficult to accurately test and measure certain of its properties.

Overcoming technical challenges, the University of Illinois team used an atomic force microscope tip as a temperature probe to make the first nanometer-scale temperature measurements of a working graphene transistor.

They found that the resistive heating ("waste heat") effect in graphene is at times weaker than its thermoelectric cooling effect.  This is certainly not the case in silicon and other common semiconductors, where resistive heating far surpasses any cooling effects.
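To see why that matters, here is a rough back-of-the-envelope sketch using the generic textbook thermoelectric relations (these are illustrative only, not the model from the University of Illinois paper).  Joule heating grows with the square of the drive current, while Peltier cooling at a junction grows only linearly with it:

    Q_Joule   = I^2 * R       (resistive "waste heat" in a region of resistance R)
    Q_Peltier = Pi * I        (Peltier cooling at a junction, with Peltier coefficient Pi)
    Q_net     = I^2 * R - Pi * I

Wherever the linear Peltier term outweighs the quadratic Joule term -- at low enough currents, or with a large enough Peltier coefficient -- the net result at that spot is cooling rather than heating.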

What this means is that graphene circuits may not get hot like traditional silicon-based ones.  This could open the door to dense 3D chips and more.  

Further, because the device converts heat back into electricity, graphene transistors may offer a two-fold power efficiency gain: ditching energetically expensive cooling fans and recycling heat losses into usable electricity.

Professor King explains, "In silicon and most materials, the electronic heating is much larger than the self-cooling. However, we found that in these graphene transistors, there are regions where the thermoelectric cooling can be larger than the resistive heating, which allows these devices to cool themselves. This self-cooling has not previously been seen for graphene devices."

Professor Pop adds, "Graphene electronics are still in their infancy; however, our measurements and simulations project that thermoelectric effects will become enhanced as graphene transistor technology and contacts improve."

A paper has been published [full text] in the prestigious peer-reviewed journal Nature Nanotechnology.  University of Illinois graduate student Kyle Grosse [profile], undergraduate Feifei Lian, and postdoctoral researcher Myung-Ho Bae [profile] are listed as co-authors on the paper.

III. What's Next?

The study should provide even more motivation for semiconductor manufacturing companies like Intel, GlobalFoundries, and TSMC to lay down the process work necessary to mass-produce circuits based on graphene transistors, capacitors, etc.

As for the University of Illinois team, they next plan to use their new measurement technique to analyze carbon nanotubes and other novel structures of interest for future electronics applications.

Their work is funded via a grant from the Air Force Office of Scientific Research and the Office of Naval Research.



Comments



Moore
By MegaHustler on 4/4/2011 4:22:57 PM , Rating: 5
Looks like Moore's law might hold up for some time after all...




RE: Moore
By Shig on 4/4/2011 5:16:47 PM , Rating: 2
The digital renaissance is just getting started.


RE: Moore
By Azethoth on 4/4/2011 5:32:57 PM , Rating: 2
Yeah, Moore's does not fade out and die until we are hand crafting custom quantum circuits I guess. I just hope we hit the singularity before Moore's dies, but given current projections into the 2030's to 2060's it's looking bleak.


RE: Moore
By Alexvrb on 4/4/2011 8:05:16 PM , Rating: 2
Haven't you seen The Matrix? Terminator? I, Robot (movie version)? War Games? Self-aware AI is not a good idea, I tell you.


RE: Moore
By shiznit on 4/4/2011 10:12:23 PM , Rating: 4
but immortality is.


RE: Moore
By derricker on 4/5/2011 1:47:24 AM , Rating: 2
Ever heard of boredom?


RE: Moore
By SPOOFE on 4/5/2011 4:27:13 AM , Rating: 2
Yeah, but if you finally get that bored, you can just shut yourself down with a timer set to reactivate you in a billion years or two.


RE: Moore
By therealnickdanger on 4/5/2011 9:44:42 AM , Rating: 4
But didn't you seen Tron Legacy? You still age in the digital realm. LOL


RE: Moore
By SPOOFE on 4/5/2011 1:34:53 PM , Rating: 3
I'll see your Tron Legacy and raise you one Ray Kurzweil.


RE: Moore
By rburnham on 4/6/2011 3:06:36 PM , Rating: 2
As long as I have my computer, no.


RE: Moore
By geddarkstorm on 4/5/2011 6:01:37 PM , Rating: 5
It's a cool dream, but it's never gunna happen. Physics disagrees.

Also, did you know that each neuron in your brain is a fully functional CPU capable of complex pattern decoding? You have 100 billion of these, with another 1 trillion ganglio- and astrocytes, each of which can also participate in communication. And did you also know, that your axons can reverse themselves, allowing your circuits to run backwards at will? And that the main neural connections can be re-wired at will, just by you thinking about things differently and wanting to? What you've got already is really dang cool.


RE: Moore
By sfi2k7 on 4/5/11, Rating: 0
RE: Moore
By SPOOFE on 4/5/2011 1:37:22 PM , Rating: 2
Intel is already one stepping ahead of their competition; if they could push that to two or more, I think they definitely would, simply because it would justify even MORE profitability per chip sold. I would imagine that if investors found out that the company was deliberately not being as profitable as it could, they'd be pretty upset.


RE: Moore
By Master Kenobi (blog) on 4/5/2011 5:34:23 PM , Rating: 2
Well you need to look at it from a business strategy perspective. If they maintain 1 year/step ahead of their competition they always have the edge, meanwhile they are prepping the next stepping for mass production and starting the small scale production of the stepping beyond that, and finishing the research and starting to construct the method for the next stepping. It really is a well oiled machine they have working there, not really a surprise when you look at how much money Intel sinks into R&D every year.

Could Intel accelerate their process and shrink faster? Yes.
Would it be more profitable? Possibly, but the number of errors per wafer might increase more than Intel is willing to tolerate.
Does this guarantee a steady release schedule that keeps Intel chips selling in high quantity year after year? You bet.
Intel could be more profitable in the short run by shrinking faster, but in the long term the slower shrink pace (still faster than anyone else in the industry) will sell them more chips overall and lead to higher profits each year.


RE: Moore
By kattanna on 4/5/2011 12:04:50 PM , Rating: 2
http://www.sciencedaily.com/releases/2011/03/11033...

quote:
Physicists have managed to control the rotation of light by means of a ultra thin semiconductor. The advance could potentially be used to create a transistor that works with light instead of electrical current.


there is also this.

hopefully by 2020 we will see some serious advances again with CPU's. i miss the days of old when i didnt need benchmarks to tell my new CPU was faster then my old one.. LOL


RE: Moore
By geddarkstorm on 4/5/2011 6:07:40 PM , Rating: 2
Light probably won't be used in on-chip calculations, it really can't actually beat out electronics for that. Interchip communications on the other hand, it wins by orders of magnitude, absolutely.

On the other hand, Spintronics, that's what really excites me. Massive amounts of potential there for transistors and incredibly fast, incredibly low powered non-volatile memory.


Past computers didn't need any heat-sink either
By Freddo on 4/4/2011 5:32:24 PM , Rating: 2
Pretty much all home computers of the 80s, like the C64 and Amiga500 were without any kind of cooling. Didn't get a computer with cooling until I bought a Pentium 133MHz back in 1996 or whenever it was.




By Amiga500 on 4/4/2011 5:48:27 PM , Rating: 5
Thats 'cos Amiga500s are cool as f**k....

;-)


RE: Past computers didn't need any heat-sink either
By Gungel on 4/4/2011 5:49:53 PM , Rating: 2
My first System was an i8086, it came with a passive cooler and a fan in the case. Never saw a computer without any kind of heat disperser.


By spamreader1 on 4/5/2011 9:20:40 AM , Rating: 2
My first 8088 didn't have a heat sink on anything but the mosfets. The only fan in the system was the power supply.


By Fritzr on 4/5/2011 1:48:31 AM , Rating: 2
C-64 & VIC-20 used heat spreaders to cool the CPU. They doubled as RF shielding and the cooled chips had thermal paste & metal tabs that pressed onto the chip when properly installed.

All of the 8bit systems were designed with heat dispersal in mind.

The 486 overdrive chips made by Intel & AMD were sold with heatsink and fan as part of an integrated package that plugged into the CPU socket.

Cooling has been a problem from the beginning. Coolers have gotten more powerful along with the improved ability of ICs to serve as space heaters.


Thermodynamics
By sleepeeg3 on 4/4/2011 8:11:35 PM , Rating: 4
This article reads like graphene would violate a few laws of thermodynamics... More heat is being removed from the system via a TEC effect than is being generated - 110% efficiency? Theoretically driving enough voltage through them would allow them to self-cool to reach absolute zero with unlimited overclocking potential. What am I missing here?




RE: Thermodynamics
By SPOOFE on 4/5/2011 1:27:24 AM , Rating: 2
quote:
More heat is being removed from the system via a TEC effect than is being generated - 110% efficiency?

The processor itself is not a closed system; electricity, for instance, isn't "generated" within the processor itself. Power supply, "motherboard" (or whatever analogous device they use), and other hardware are also part of the overall computing system, and taking all that into account, thermodynamics is perfectly satisfied.


RE: Thermodynamics
By wordsworm on 4/5/2011 1:55:42 AM , Rating: 2
That's not what I got out of it. It's saying that the material can recycle lost heat energy back into electrical current. He did not imply that there would be a surplus.


RE: Thermodynamics
By HoosierEngineer5 on 4/5/2011 8:28:58 AM , Rating: 2
Probably because this article may have been written on April 1st?


Maybe I'm dumb...
By smackababy on 4/4/2011 5:54:30 PM , Rating: 3
But isn't everything self cooling in some capacity? Nothing holds heat forever.




RE: Maybe I'm dumb...
By SPOOFE on 4/5/2011 1:24:51 AM , Rating: 2
Under the application of a near-constant electrical current that can be used for the processing of information? "Everything" shrinks down to a much, much smaller category under those conditions. :)


RE: Maybe I'm dumb...
By AnnihilatorX on 4/5/2011 7:23:39 AM , Rating: 2
Not when you have a constant heat source, i.e. the electrical current. Basically, instead of the electrical current producing heat, which you don't want, the heat recycles back into electricity.


Price
By Jeremy87 on 4/4/2011 5:14:20 PM , Rating: 2
"cost as much as $100M USD to produce a single cubic centimeter of material."

According to what wikipedia says about the distance of stacked graphene sheets, this equals about $3 per cm^2 of graphene layers. How many layers do you need?




RE: Price
By cjohnson2136 on 4/4/2011 5:18:07 PM , Rating: 2
Not sure, but is that $3 per cm^2 an R&D price or a mass production price? Just curious because mass production is always cheaper than the R&D cost.


RE: Price
By PlasmaBomb on 4/4/2011 9:03:38 PM , Rating: 2
The thickness depends on the methods used... typically graphene is referred to as a monolayer.

The lowest thickness I have seen reported is 5.3 Angstroms... so it would take ~18.9 million of them to form a 1 cm^3 block...


WTH is an "organic atom"?
By Irene Ringworm on 4/4/2011 7:47:30 PM , Rating: 5
[Ed.]-- No such thing as an organic atom. The organic/inorganic designations can be applied to molecules but never an atom. You may mean "graphene is composed of carbon, the atomic building block of organic chemistry" but that's both awkward and irrelevant. Neither graphene nor any allotrope of carbon would be properly considered "organic".

And while we're at it, "thermal electric" /= "thermoelectric" except for imported personal refrigerators.




Scorpion says to graphene Subzero
By FITCamaro on 4/4/11, Rating: 0
RE: Scorpion says to graphene Subzero
By KrayLoN on 4/4/11, Rating: -1
RE: Scorpion says to graphene Subzero
By cjohnson2136 on 4/4/2011 5:03:15 PM , Rating: 3
If you read the title of his comment you would see he knows which is which lol


By 2bdetermine on 4/4/2011 6:48:26 PM , Rating: 2
If he wasn't slacking at his job, he would read the subject carefully:-)


yawn...
By zodiacfml on 4/5/2011 3:45:00 AM , Rating: 2
just seems like a very efficient chip to me. probably useful to all sorts of devices but not the kind for desktop or server use.




RE: yawn...
By Silver2k7 on 4/5/2011 7:51:50 AM , Rating: 2
I see so your already running a desktop or server who does not need cooling.. wow must be nice there in the future :P


Is it possible to...
By Qapa on 4/5/2011 1:27:13 PM , Rating: 2
apply heat and get electricity from it then??

That would be interesting from energy POV.

But it could be problematic for electronics -> mobile phone on the sun would "create" bits and bytes and process them as commands...?! argh!




RE: Is it possible to...
By rudolphna on 4/5/2011 1:37:34 PM , Rating: 2
Those kinds of things already exist. Heard of Voyager? Heard of something called a radioisotope thermoelectric generator? Heat from nuclear decay is turned into electricity by thermocouples. Granted, they are more crude and can do less than these, but it's not like something of this sort doesn't already exist.


Cooking an egg
By rikulus on 4/4/11, Rating: -1
RE: Cooking an egg
By ppardee on 4/4/2011 7:05:36 PM , Rating: 4
quote:
And by today's standards a balmy 85 degrees Celsius, while hot enough to cook an egg, is a pretty "good" operating temperature


Yes, 85C is hot enough to cook an egg, as the article stated. You are being deceived, my friend, but you're eyes are the deceivers!


RE: Cooking an egg
By ppardee on 4/4/2011 7:06:47 PM , Rating: 2
And my fingers betray me! *"YOUR eyes"


RE: Cooking an egg
By Senju on 4/4/2011 11:07:48 PM , Rating: 2
You know, I think my 3DO system I bought in the 1990s did not have any cooler inside it. I remember touching the top of it after a good 2 hours of play and it was soooo hot. I always wonder how such boards could operate under such heat.


"We’re Apple. We don’t wear suits. We don’t even own suits." -- Apple CEO Steve Jobs














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki