



Professor Andre Geim of The School of Physics and Astronomy at The University of Manchester

Dr. Kostya Novoselov of The School of Physics and Astronomy at The University of Manchester

Graphene-based transistor created by the University of Manchester team
One of the largest hurdles in semiconductor miniaturization has just been cleared

Using graphene, the world’s thinnest material, researchers at the University of Manchester have created the world’s smallest transistor. According to Professor Andre Geim and Dr. Kostya Novoselov of The School of Physics and Astronomy at The University of Manchester, the new transistors are only one atom thick and fewer than 50 atoms wide. The development opens the door to superfast computer chips at sizes not possible with standard silicon transistors.

According to the semiconductor industry roadmap, the miniaturization of electronics will face its greatest challenge in the next twenty years, as silicon-based technology begins to reach its minimum size limit.

Graphene, a form of carbon that is only one atom thick, may provide a solid alternative for even further miniaturization of electronics as silicon-based technology reaches its limit.

Graphene transistors were originally created two years ago, but at that time they were very “leaky,” meaning the current could never be turned off completely. This effectively limited their uses and rendered them unusable in computer chips and electronic circuits. Over the past two years, however, the research team at the University of Manchester overcame this problem and created fully functional, stable graphene transistors.

Graphene transistors remain stable and conductive even when they are only a few nanometers wide. This is in contrast to all other known materials, including the dominant silicon transistors, which “oxidize, decompose and become unstable at sizes ten times larger.” This is the barrier that current silicon-based technology is approaching and is likely to also be its downfall.

"We have made ribbons only a few nanometers wide and cannot rule out the possibility of confining graphene even further - down to maybe a single ring of carbon atoms," says Professor Geim of the University of Manchester.

Graphene provides a solid alternative to silicon and, according to Geim, can lead to even further reductions in size. Geim expects that future electronic circuits may be carved out of a single graphene sheet.

Dr Leonid Ponomarenko, who is leading this research at The University of Manchester, is optimistic about the technology's future.

"The next logical step is true nanometer-sized circuits and this is where graphene can come into play because it remains stable - unlike silicon or other materials - even at these dimensions."

Geim believes that graphene is the only viable successor to silicon once the currently dominant technology reaches its limit. Graphene-based circuits, however, are not likely to be ready until 2025.


Comments



What I'm wondering is
By thebrown13 on 3/2/2007 4:11:40 PM , Rating: 1
Wtf do you do when your transistors are one atom wide and one atom long? Probably not possible, but then what?




RE: What I'm wondering is
By Frank M on 3/2/2007 4:32:30 PM , Rating: 5
I think that you rest.


RE: What I'm wondering is
By Xenoterranos on 3/2/2007 4:36:45 PM , Rating: 1
dimensional folding?


RE: What I'm wondering is
By masher2 (blog) on 3/2/2007 4:39:49 PM , Rating: 4
There's always quantum computing, which allows your computing units to evaluate many states simultaneously... though a quantum computer only works on stochastic algorithms, so it's not really suitable for general use.

Beyond that, it's theoretically possible to build transistors out of quarks and gluons (the individual constituents of protons and neutrons), though we aren't close enough to an understanding of quantum chromodynamics to even guess how we might go about constructing one.


RE: What I'm wondering is
By Tyler 86 on 3/2/2007 5:20:51 PM , Rating: 1
Quantum computers can be used to perform standard operations as well, given a suitable implementation.

Quantum mechanics is the ideal route for computing to take.


RE: What I'm wondering is
By masher2 (blog) on 3/2/2007 5:46:40 PM , Rating: 4
As I said, a quantum computer can only run stochastic (randomized) algorithms. To perform a 'standard' operation, you need a randomized version of that operation. So far, we've found very few of those (e.g. Shor's algorithm for factoring, etc). It's not even clear whether stochastic algorithms exist for all problems and, even if they do, whether or not they'll be efficient.

It's already been proven that, for an abstract NP-complete problem in which an efficient stochastic algorithm doesn't exist, a quantum computer provides only a quadratic speedup, which means it's really not any more useful than a standard computing device.
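For a sense of what "only a quadratic speedup" means here: Grover-style search needs on the order of √N oracle queries where classical unstructured search needs about N. A tiny illustrative sketch (the query-count functions below are assumed models for comparison, not a simulation of any quantum algorithm):

```python
import math

def classical_queries(n):
    # Unstructured search over n items: expected queries grow linearly.
    return n

def grover_queries(n):
    # Grover's algorithm needs on the order of sqrt(n) oracle queries.
    return math.ceil(math.sqrt(n))

# The gap widens with n, but only quadratically, not exponentially:
for n in (1_000, 1_000_000):
    print(n, classical_queries(n), grover_queries(n))
```

At a million items the quantum search still needs about a thousand queries, which is why a quadratic speedup alone does not make hard problems easy.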


RE: What I'm wondering is
By Samus on 3/2/2007 7:01:31 PM , Rating: 2
That's why 'current' quantum computers are complemented with a basic CPU to randomize data streams. The performance can be impressive when you consider it takes the same amount of time to add 1 + 1 as it does to solve a quadratic such as x = (-b ± √(b² - 4ac)) / (2a)
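For reference, the quadratic formula mentioned above, evaluated in a short Python sketch (the `quadratic_roots` helper is hypothetical, written here just to spell the formula out):

```python
import math

def quadratic_roots(a, b, c):
    """Real roots of a*x**2 + b*x + c = 0 via the quadratic formula."""
    disc = b * b - 4 * a * c  # the discriminant b^2 - 4ac
    if disc < 0:
        return ()  # no real roots
    r = math.sqrt(disc)
    return ((-b + r) / (2 * a), (-b - r) / (2 * a))

print(quadratic_roots(1, -3, 2))  # roots of x^2 - 3x + 2: (2.0, 1.0)
```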


RE: What I'm wondering is
By Naviblue on 3/2/07, Rating: -1
RE: What I'm wondering is
By thilanliyan on 3/2/2007 5:17:05 PM , Rating: 2
lol. Too bad these won't come out till 2025 or later. I wonder about the health effects of nano-sized carbon components. IIRC carbon nanotubes had health risks.


RE: What I'm wondering is
By KernD on 3/2/2007 8:16:05 PM , Rating: 3
Carbon nanotubes are a risk to your lungs only; they act like asbestos, which means lung cancer and you die...

But that's only a problem if they get into the air, and I don't think any of my processor's silicon or other components gets into the air.


RE: What I'm wondering is
By mino on 3/3/2007 8:34:54 AM , Rating: 1
Also, those "tubes" will be part of something bigger and still pretty small.

To cause cancer a tube has to be at the size scale of a cell. These things will be several orders of magnitude smaller...


RE: What I'm wondering is
By dogchainx on 3/3/2007 9:03:35 PM , Rating: 3
Incorrect. If the "tubes" had to be close to the size of a cell to cause cancer, then most everything that causes cancer would have to be the size of a cell. Cancer can be caused by damage to DNA, which is a very long double helix an order of magnitude smaller than a cell. It's a molecule, and it looks like it can be damaged by "those tubes," which are on about the same scale. Damage to the DNA that controls a cell's division and death cycle is the main cause of cancer (if not the only one...).

It's like saying running over a tree could cause a flat tire. Not really... but a sharp stick from a branch could.


RE: What I'm wondering is
By guwd1 on 3/7/2007 7:38:56 AM , Rating: 3
quote:
Carbon nanotube are a risk to your lungs health only, it acts like asbestos, which means lungs cancer and you die...


Incorrect. Proper research has yet to be done regarding nanotubes' effect on other cells. They are likely dangerous for all cells: because they are so small they can't be blocked out by the cell, they slip through the cell's barrier. Once inside, the cell has to remove 'the foreign substance' by manufacturing and attaching other stuff onto the nanotube, making the particle larger and thus handleable. In a typical exposure scenario, most likely a non-negligible amount of nanotubes enters the cell, and the cell constantly has to dedicate valuable resources to this task, making it less capable of handling other important tasks. Think of the cell as a warrior: if it has to battle two enemies simultaneously it's obviously more likely to fail, resulting in death/cancer/virus-infection/'who-knows'... As I said, proper research has yet to be done, but it's mostly a matter of assessing risks and exploring the 'who-knows' part afaik.


RE: What I'm wondering is
By guwd1 on 3/7/2007 7:50:20 AM , Rating: 2
quote:
But that is only a problem if it gets in the air, but I don't think there is any of my processor's silicon or other components that gets in the air.


I have no idea about that, but I think the major concerns are all the waste nanotubes that accumulate in production facilities, and the fact that nanotube waste isn't well regulated by governments or others yet since the risks aren't properly assessed. Is it OK to just throw the waste in the dump? Seeping out of a pile? Will the companies care if it could save them money? That's why some health officials are stressing that such research be done so that good regulations and laws can be written. Let's hope they hurry up cuz I sure wouldn't like to be the one working at such a company's manufacturing plant.


RE: What I'm wondering is
By euczechguy on 3/5/2007 12:51:01 AM , Rating: 2
u cannot clock a quantum computer cuz it has 'no frequency'...
it is completely impossible to even influence quantum bits...
we think we can but there is no proof we could...

what we're dealing with here is a completely and dangerously uncharted area of subatomic matter...

although that is where the future of technology lies...


RE: What I'm wondering is
By AntDX316 on 3/8/2007 7:03:02 AM , Rating: 2
u cant clock it but u can maybe measure it and rate it with the name Quantum Folds per second :)


RE: What I'm wondering is
By sdsdv10 on 3/2/2007 4:42:49 PM , Rating: 2
Sub-atomic particles, of course! Fermions and Bosons for everyone...


RE: What I'm wondering is
By vdig on 3/2/07, Rating: -1
RE: What I'm wondering is
By fk49 on 3/3/2007 1:44:55 AM , Rating: 2
Wait..what?

Silicon makes the tiny transistors INSIDE the chips vdig. Microscopic circuits. That turn on and off. 1 and 0. Chips on processors and video cards already have hundreds of millions of transistors and, generally, increasing the number of transistors increases performance. If we can make smaller transistors, we can squeeze more onto the same chip.

This has nothing to do with form factors or printed circuit board sizes. They're talking about replacing the technology that runs inside of that CPU core, not changing the way people handle chips.
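The point above about squeezing more transistors onto the same chip can be sketched numerically. This is a purely illustrative Python model; the `transistors_per_die` helper, its cell-area factor, and the die size are assumptions, not real process data:

```python
def transistors_per_die(feature_nm, die_mm2=100, cell_area_factor=150):
    # Very rough model: one transistor cell occupies roughly
    # cell_area_factor * F^2, where F is the feature size in nm.
    cell_area_nm2 = cell_area_factor * feature_nm ** 2
    die_area_nm2 = die_mm2 * 1e12  # 1 mm^2 = 1e12 nm^2
    return die_area_nm2 / cell_area_nm2

# Halving the feature size (90nm -> 45nm) quadruples the count:
ratio = transistors_per_die(45) / transistors_per_die(90)
print(ratio)  # ~4
```

The area model is made up, but the quadratic relationship it demonstrates is just geometry: density scales with 1/F².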


RE: What I'm wondering is
By Spivonious on 3/5/2007 10:25:59 AM , Rating: 2
But I think he was saying that as we can cram more performance onto smaller chips, then why couldn't we put an entire computer onto say a 2 square inch chip? Then you have an interface board of some sort that contains all your I/O interfaces, and you could plug in extra computers into sockets on this board.


RE: What I'm wondering is
By Garreye on 3/2/2007 4:57:31 PM , Rating: 2
Find a way to make each atom act like 2 transistors... probably not possible either...


RE: What I'm wondering is
By surt on 3/2/2007 6:14:52 PM , Rating: 2
You start building multilayer (3D) chips. Plenty of room for 2^20 layers (maybe as many as 2^30) in a desktop-sized chip. That's good for at least another 30-45 years of Moore's law.


RE: What I'm wondering is
By masher2 (blog) on 3/2/2007 7:12:23 PM , Rating: 2
> "That's good for at least another 30 - 45 years of moore's law."

Moore's Law doesn't apply here. Fundamentally, it's merely a statement of geometry... that a linear decrease in feature size yields a quadratic increase in feature density. Or, in simpler terms, doubling "x" quadruples "x^2".

Now while 3D chipbuilding techniques are highly exciting and can yield some impressive gains...they won't be quadratic gains. And thus Moore's Law doesn't apply.


RE: What I'm wondering is
By surt on 3/2/2007 8:32:26 PM , Rating: 2
I meant Moore's law in the more conventional sense of 'we can double the number of transistors every year'.

quote:
The term Moore's Law was coined by Carver Mead around 1970.[4] Moore's original statement can be found in his publication "Cramming more components onto integrated circuits", Electronics Magazine 19 April 1965:


http://en.wikipedia.org/wiki/Moore's_law


RE: What I'm wondering is
By masher2 (blog) on 3/2/2007 9:18:52 PM , Rating: 3
> "I meant Moore's law in the more conventional sense of 'we can double the number of transistors every year'"

Right, but that statement is a direct result of quadratic growth. We move to new litho process nodes in linear time, which results in quadratic density increases (doubling of transistor counts in the same space, at the same cost).

3D circuit fabrication gives us a linear growth path, but not a quadratic one.
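The contrast drawn here, quadratic density gains from litho shrinks versus linear gains from stacking one layer per node, can be sketched numerically. The shrink factor and both growth models below are illustrative assumptions, not industry figures:

```python
def planar_density(node_index, shrink_per_node=0.7):
    # Each process node shrinks features by ~0.7x linearly,
    # so area density grows by ~(1/0.7)^2 ~= 2x per node: quadratic scaling.
    return (1 / shrink_per_node ** 2) ** node_index

def layered_density(node_index):
    # Adding one layer per node only adds a fixed increment: linear scaling.
    return 1 + node_index

# The two curves diverge quickly:
for n in range(6):
    print(n, round(planar_density(n), 1), layered_density(n))
```

After a handful of nodes, the planar (shrink-driven) curve has left the layered curve far behind, which is the sense in which stacking alone cannot sustain Moore's law.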


RE: What I'm wondering is
By surt on 3/3/2007 7:13:17 PM , Rating: 2
Just double the number of layers each year. 2 layers, 4 layers, 8 layers...
2^30 layers.


RE: What I'm wondering is
By masher2 (blog) on 3/4/2007 4:36:39 PM , Rating: 1
I'm sure you can see yourself why this doesn't work :)


RE: What I'm wondering is
By surt on 3/4/2007 5:35:56 PM , Rating: 2
It doesn't work at something in the neighborhood of 2^30 layers (what will fit within a conventional computer cube), which buys us another ~45-60 years of Moore's law, as I originally suggested (depending on just how fast we can paint layers, and how thick the layers have to be to provide insulation).

Then you really have to come up with something novel.


RE: What I'm wondering is
By masher2 (blog) on 3/4/2007 9:47:12 PM , Rating: 2
You're not getting it. Let's pretend it's the year 20xx and we can build circuits consisting of 10 layers. Then, in a couple of years, we can probably build them with 15 layers, then a couple of years after that, our limit will be 20, etc. That's a linear growth function.

There's no reason to expect exponential growth from a layering approach. Why would we? Each new layer adds a linear increment to our total circuit volume, but it doesn't affect the layers which came before it.


RE: What I'm wondering is
By Upset Nerd on 3/3/2007 9:18:03 AM , Rating: 2
Isn't Moore's law primarily dependent on the exponential, not linear, decrease in one-dimensional feature size, while the quadratic gain in feature density is merely an added "bonus" on top of that, so to speak?


RE: What I'm wondering is
By arazok on 3/2/2007 7:36:33 PM , Rating: 4
You lay off your engineers and replace them with marketing staff. They will rebrand the same old technology with 'must have' features for decades.

I can't wait for my 800-core CPU with integrated automatons inside.


RE: What I'm wondering is
By AlmostExAMD on 3/3/2007 1:47:58 AM , Rating: 2
Oh don't panic, When we get to that scale machines will become self aware and start plotting to take over the world! :)


RE: What I'm wondering is
By shady28 on 3/8/2007 11:31:38 PM , Rating: 2
Beyond miniaturization, I think the next thing will be design changes. That is to say, we are all still thinking about two-state transistor technology. Think of something that isn't a transistor and doesn't use binary: multi-state devices. This is basically what a neuron in your brain is; neurons can have many different states (it isn't known exactly how many, but the number is high, think millions).

Even a successful ternary (3) state machine would be revolutionary. Some work has been done on this, but the designing of such a machine is apparently too complex right now.

A machine capable of processing in base 3, 4, or whatever internally would theoretically be significantly faster than current binary machines; i.e., the number 10 in binary is 1010, while in base 3 it is 101.
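A quick Python sketch of that base conversion, showing how the representation shortens as the base grows (the `to_base` helper is hypothetical; note that decimal 10 is 1010 in binary and 101 in base 3):

```python
def to_base(n, base):
    """Digits of n in the given base, most significant first (n >= 0)."""
    if n == 0:
        return "0"
    digits = []
    while n:
        n, d = divmod(n, base)  # peel off the least significant digit
        digits.append(str(d))
    return "".join(reversed(digits))

print(to_base(10, 2))  # 1010
print(to_base(10, 3))  # 101
```

Fewer digits per number means fewer symbols to store and move, which is the intuition behind the comment, although building reliable multi-level logic is the hard part.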


RE: What I'm wondering is
By AntiM on 3/12/2007 11:09:49 AM , Rating: 2
Maybe you stop using transistors or their equivalents. Maybe you stop thinking in terms of binary 1s and 0s, and start thinking about switches that work like brain cells.
Circuits (I don't even know if you would call them circuits) that use photons instead of electrons, and switches that react to wavelengths of light. That's where I would put my money in the very long term.


interesting
By TheDoc9 on 3/2/2007 4:38:54 PM , Rating: 2
This could be the beginning of one of the biggest achievements in tech history.




RE: interesting
By JonB on 3/2/2007 5:07:48 PM , Rating: 5
Skynet will be here soon. All we need to work on is amorphous metal construction for our robot masters.


RE: interesting
By nanokompressor on 3/3/2007 3:00:54 AM , Rating: 3
You're terminated.


RE: interesting
By Visual on 3/5/2007 6:18:45 AM , Rating: 2
You've been erased!


RE: interesting
By Spivonious on 3/5/2007 10:28:13 AM , Rating: 2
Get down!!


RE: interesting
By Filibuster on 3/9/2007 10:20:29 PM , Rating: 2
My CPU is a neural net processor. A learning computer.


RE: interesting
By Visk on 3/3/2007 7:43:36 AM , Rating: 3
With this new processor, you'll be able to run Oblivion at MEDIUM settings!


RE: interesting
By arturus on 3/4/2007 8:54:29 AM , Rating: 2
heh, yea but with bloom lighting turned on!


Forget Sub-atomic processing...
By CollegeTechGuy on 3/5/2007 3:13:00 PM , Rating: 2
What we need to focus on is the 4th dimension of the universe. Create a CPU that has a Delorean attached to it and when you make a request to complete an instruction the CPU looks into the future to get the answer. So the process will be done the very instance, or even before you sent the instruction. Now that is the future, or going back to it, of computers.




RE: Forget Sub-atomic processing...
By Eris23007 on 3/6/2007 7:58:01 PM , Rating: 3
It better have a flux capacitor too... and consume "one point twenty-one jigawattts!!!" of power...


By Eris23007 on 3/6/2007 7:58:26 PM , Rating: 3
oh yeah, and a Mr. Fusion!


University of Manchester
By Zurtex on 3/2/2007 10:04:20 PM , Rating: 3
Yays, University of Manchester!

I have lots of fun being here, especially being closely tied in with a lot of physics students and helping them out with their problems. So many things keep getting discovered here :). My housemate programs particle physics engines to run on graphics cards, a fascinating field of research. When he gets paid for it this summer, the University of Manchester expects to make huge strides in the amount they're doing; it's never been possible to do it this fast before. A single 8800GTX outperforms some basic supercomputers, so it's well worth it on cost/performance.




which uni of manchester?
By otispunkmeyer on 3/5/2007 4:01:43 AM , Rating: 2
In the UK? Australia? US? Don't they all have places named Manchester?




RE: which uni of manchester?
By masher2 (blog) on 3/5/2007 8:17:34 AM , Rating: 1
The link in the article points to the (U.K. based) University of Manchester.


hmm.
By AntiV6 on 3/2/2007 4:31:23 PM , Rating: 2
Can/will Graphene based chips be used in Nanomedicine?




By jak3676 on 3/2/2007 5:59:45 PM , Rating: 2
The photo shows a label measuring 100nm. The connection alone looks to be about 200nm wide. Aren't we already past that, with the smallest components measuring 65nm and test-platform hardware at 45nm? Unless I'm missing something, I think the label is incorrect.




Electricity vs Photons?
By Mitch101 on 3/2/2007 8:25:54 PM , Rating: 2
Even if the photon switch were larger, wouldn't light still be faster than electrical signaling? Of course, somewhere you would have to convert one to the other.




Bugger :-)
By xphile on 3/3/2007 1:02:57 AM , Rating: 2
It's 2007 and I'm slowly getting my head around code changes that start making REAL use of two CPU cores in programs. At this size I now have a little under 18 years left to get my head around how to code for maybe 1048576 CPU cores, plus the integration of all the graphics cores into the CPU as well!

I hate you Gordon Earle Moore :-)




By nurbsenvi on 3/5/2007 1:18:45 PM , Rating: 2
At least tell me about these future technologies 5 years before commercialization not 18 years in advance cuz I'm not that patient of a person.

Besides I don't think we will be here in 2025.




...cost comparison?
By Schrag4 on 3/5/2007 2:04:37 PM , Rating: 2
I know it's projected for 2025 so I shouldn't waste my time even thinking about it (since I'm not involved in developing the technology), but how will this new technology stack up to silicon in terms of cost? Will home users still be using silicon chips while big corporations and governments use the new technology, due to cost restrictions? Just curious.




MAN URE DUMB!
By Fubar0606 on 3/2/07, Rating: -1
RE: MAN URE DUMB!
By Url on 3/3/2007 1:23:42 PM , Rating: 3
No!!! You go read a book!!


RE: MAN URE DUMB!
By Spivonious on 3/5/2007 10:30:08 AM , Rating: 3
take note: drugs + internet = stupidity.


RE: MAN URE DUMB!
By WhiteBoyFunk on 3/6/2007 8:57:45 AM , Rating: 1
Micah you newb look what you got yourself into...

