
Long-standing device gets a nanotechnology boost

It's taking a dive into nanotechnology, but the III-V tunneling field effect transistor (TFET) is finally creeping close to the widely used metal-oxide-semiconductor field effect transistor (MOSFET).

III-V TFETs are a three-terminal extension of the tunneling diode, a device invented in 1957 that earned its inventor, Leo Esaki, a Nobel Prize in Physics.  Nicknamed the "Esaki transistor/diode" in his honor, the device was largely overlooked due to the low drive currents achievable in most applicable materials.

But a team led by electrical engineering professor Sean Rommel at the Rochester Institute of Technology (RIT) has tuned the transistors to approach MOSFET performance.  A key to that tuning was the work of graduate researcher David Pawlik, who grew sub-120 nanometer vertical TFETs on a test chip that allows hundreds of diodes to be tested per sample.  That setup allowed multiple kinds of homojunctions and heterojunctions to be tested.

Working with fellow graduate researchers Brian Romanczyk and Paul Thomas, as well as collaborators at SEMATECH (a non-profit research consortium backed by top chipmakers) and Texas State University, the team measured a record peak current density of 2.2 MA/cm^2.
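For a sense of scale, that density can be turned into a per-device current with simple arithmetic.  The geometry below (a circular 120 nm mesa, the upper bound on device size quoted above) is an illustrative assumption, not a figure from the paper:

```python
import math

# Back-of-envelope: what does 2.2 MA/cm^2 mean for one device?
# The circular 120 nm mesa is an assumed geometry for illustration.
J_peak = 2.2e6                        # peak current density, A/cm^2
d_cm = 120e-7                         # 120 nm diameter expressed in cm
area = math.pi * (d_cm / 2) ** 2      # mesa area in cm^2

i_peak = J_peak * area                # current through a single diode, amps
print(f"{i_peak * 1e3:.2f} mA")       # prints 0.25 mA
```

Roughly a quarter of a milliamp through a single sub-120 nm diode, which is why the result is notable for such a small tunneling device.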

The benefit of III-V TFETs is that they operate at a much lower voltage than MOSFETs and thus consume less power.  The record-setting design ran at -0.3 V.
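A rough sketch of why the lower supply voltage matters so much: dynamic switching power in logic scales with the square of the supply voltage (P = C·V²·f).  The capacitance, clock frequency, and MOSFET supply voltage below are illustrative assumptions, not numbers from the study:

```python
# Illustrative only: first-order dynamic-power model for switching logic,
# P_dyn = C * V^2 * f.  C, f, and the 0.9 V MOSFET supply are assumptions.
def dynamic_power(c_farads, v_volts, f_hz):
    """Power dissipated charging/discharging the load capacitance."""
    return c_farads * v_volts**2 * f_hz

C = 1e-15   # 1 fF load per gate (assumed)
F = 1e9     # 1 GHz switching rate (assumed)

p_mosfet = dynamic_power(C, 0.9, F)   # assumed conventional supply voltage
p_tfet = dynamic_power(C, 0.3, F)     # the record TFET ran at |0.3| V

print(f"{p_mosfet / p_tfet:.1f}x")    # roughly 9x from V^2 scaling alone
```

Voltage-squared scaling alone accounts for most of the factor-of-10 power saving projected later in the article; leakage and frequency effects are ignored in this sketch.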

As its name implies, operating a tunneling FET is, voltage-wise, more like driving through a hill than down one, says Professor Rommel. [Image Source: NCWpics]

Professor Rommel likens the traditional MOSFET to driving down a hill, voltage-wise, while the TFET, driven by quantum effects, is more like digging a tunnel through the hill.  He comments on the record current levels, "The tunneling field effect transistors have not yet demonstrated a sufficiently large drive current to make it a practical replacement for current transistor technology, but this work conclusively established the largest tunneling current ever experimentally demonstrated, answering a key question about the viability of tunneling field effect transistor technology."

He suggests in the paper that a peak current density of 10 MA/cm^2 should be possible with high doping levels in indium-based heterojunctions.

The results could be applied in everything from smartphones to solar cells.  Professor Rommel suggests tuned TFETs could reduce processor power consumption by a factor of 10, allowing longer battery life for phones and other devices.

The work was presented in December at the International Electron Devices Meeting (IEDM) in San Francisco, Calif.  It was funded by the National Science Foundation (NSF), SEMATECH, and RIT's Office of the Vice President of Research.

Sources: RIT, ResearchGate [paper]

Comments


RE: Back in the dark ages
By DrizztVD on 2/1/2013 4:04:37 AM , Rating: 2
Shadowself, dunno if you're gonna read this, but what's your take on the future of photonics/optical computing?

As I understand it, the fundamental physics of photons allows for much faster signal propagation through a computing chip, giving rise to a significantly higher fundamental limit on processor speed. The question that needs to be asked is: can this physics be exploited economically to replace electronics in a large number of applications?

Added to this is the likelihood of using photonic research as a stepping stone to quantum computation. This is purely speculative though.

RE: Back in the dark ages
By ShieTar on 2/1/2013 5:34:42 AM , Rating: 1
I'm not working in the field, but I have the feeling that right now the only predicted real-world application of photonic computing is removing the electronic parts from the optical transmission systems used in telecommunications. Of course, if this is implemented in the network providers' infrastructure, it may trickle down into personal network electronics, especially once Fibre-to-the-Home becomes a mass product.

For purely computational purposes, the ongoing success of parallelisation makes single-module speed increasingly less relevant, while power efficiency becomes increasingly important, both in private mobile devices and in large-scale supercomputing installations. Since power is always stored electrically, the efficiency handicap of electrical-to-optical conversion makes a fully photonic CPU a much less promising concept than it was considered 20 years ago.

The same problem will probably always remain the case for quantum computation. While the computing atom/molecule itself may be quite fast and efficient, it always needs to be thermally and optically decoupled from its environment, so in all likelihood you will never be able to have a quantum computer without a high-cost, high-energy-consumption thermal vacuum chamber. I'd love to be proven wrong here, but right now I don't see even a theoretical concept of how to get there.

RE: Back in the dark ages
By 3DoubleD on 2/1/2013 1:43:22 PM , Rating: 2
I'm doing my PhD in a related field. While I don't work directly on transistors (I work on III-V nanowires), I am exposed to plenty of research by those who do, so I thought I'd weigh in here.

If I had to guess, I think large companies like Intel, IBM, AMD, etc. will be banking on the introduction of optical interconnects to somewhat extend Moore's Law as they run out of room at the bottom for silicon, which should happen ~2020. When they can no longer shrink their transistors and take advantage of traditional Moore's Law scaling, they will need to turn either to a different material than silicon or to enhancements not related to transistor scaling.

While a shift to a different material is inevitable, I think silicon photonics is a much more mature field than new emerging materials, like graphene. Furthermore, silicon photonics does not create the need for radically new manufacturing equipment or processes. If you think about it, companies like Intel have built their entire empire on silicon and the amount of expertise that has been developed around it will not be easily abandoned.

The 2020s will be a fascinating time for the integrated circuit industry - and while we all hope the transition from Moore's Law to whatever follows has been meticulously planned, I wouldn't be surprised if it is a rather disruptive time in the industry. If you think about it, if Intel is the first to run into what I like to call "Moore's Wall" (I'm sure I didn't invent that term), Intel could lose their oh-so-important manufacturing process lead less than a decade from now! Be the first to mass manufacture a modern CPU out of graphene and Intel's lithography advantage would mean squat!

Anyway, I imagine optical interconnects and perhaps more active optical components will be gradually introduced in the coming years; however, the silicon MOSFET will be the bleeding edge for the vast majority of electronics for at least another decade, if not several more. It would certainly be a healthy sign if upcoming designs like Haswell showed some sort of progress on the optical interconnect front.

The future is very exciting!

RE: Back in the dark ages
By DrizztVD on 2/2/2013 4:08:38 PM , Rating: 2
Thanks. Very useful information - I'm asking because I'm thinking of doing a master's in Photonic/Optronic Engineering. Naturally, to engineer something you need the fundamental research already done and sorted. So I'm hoping I'd have niche applications to work on at first, and then as time goes on it becomes a bit more mainstream.

Maybe in 15 years time they can build a simple all-optical processor running at like 500GHz. Who knows. Maybe not, Epic Fail.

RE: Back in the dark ages
By Stiggalicious on 2/1/2013 11:16:40 PM , Rating: 2
The advances in optical transceiver circuits, and the falling prices of the devices using them, make this look like a very promising technology for the mainstream.
As of right now, we're approaching the limits of copper as a transmission medium for data over any length greater than a couple of meters (I'm talking like 100 Gbit/s). Ethernet, which traditionally runs over copper, will soon move to optical links (and already has for building-to-building connections). As speeds increase further, the maximum distance for copper will shrink even more. You'll then see things like PCIe eventually replaced with optical interconnects. The CPU, however, will remain silicon for the foreseeable future. Photonic devices are still nigh-impossible to manufacture at a remotely reasonable price.
