
The University of Rochester with help from MIT pulls the wraps off the first true 3D processor

While quantum computers and fiber optic computers are certainly ideal candidates for a silicon PC replacement, they remain in the distant future.  In the meantime, one key unexploited domain, which may give silicon a stay of retirement, is 3D chip technologies.

Today virtually all chips on the market are flat, two-dimensional designs.  While this is reasonably efficient from a cooling perspective, it imposes definite limits on computing resources per given space.  A 3D chip could theoretically be much more compact while remaining equally efficient.  It would have the added perk of reducing defects, since larger dies typically suffer more of them.  It would also cut propagation delays by shortening interconnects, and make the chip harder to reverse engineer.
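The die-size/defect relationship follows the standard Poisson yield model; a quick sketch, using purely illustrative numbers (the defect density and die areas below are assumptions, not figures from the article):

```python
import math

def die_yield(defect_density: float, die_area: float) -> float:
    """Poisson yield model: probability that a die of the given
    area (cm^2) contains zero random fabrication defects."""
    return math.exp(-defect_density * die_area)

# Purely illustrative numbers, not from the article:
D = 0.5               # defects per cm^2
big_die = 2.0         # one large 2-D die, cm^2
layer = big_die / 4   # the same logic split across four stacked layers

print(f"monolithic die yield: {die_yield(D, big_die):.1%}")  # ~36.8%
print(f"per-layer yield:      {die_yield(D, layer):.1%}")    # ~77.9%
```

If the smaller layers can be tested individually and only the good ones bonded into a stack, the smaller per-die area is where the yield advantage comes from.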

While some chip designs have claimed to be "3D", most of these are merely stacked chips with a few communication interconnects rather than mass interoperation between stacked layers.  Now the University of Rochester has demoed perhaps the first true 3D processor design.  The chip is optimized in three dimensions and runs at a speedy 1.4 GHz.  Its unique design makes it the first chip to offer full functionality in three dimensions in tasks involving synchronization, power distribution, and long-distance signaling.

"I call it a cube now, because it's not just a chip anymore.  This is the way computing is going to have to be done in the future. When the chips are flush against each other, they can do things you could never do with a regular 2-D chip," stated Eby Friedman, Distinguished Professor of Electrical and Computer Engineering at Rochester and faculty director of the processor project.

Professor Friedman worked with engineering student Vasilis Pavlidis to develop the design.  He says that while Moore's Law, which holds that the number of transistors in a given chip area doubles over time, may come to a halt in a two-dimensional world, as some are suggesting, extending processors into three dimensions will allow it to continue as fast as ever.

The hardest part, according to the researchers, is getting the levels of the chip to interact properly.  Professor Friedman compares the problem to a scenario where a standard microprocessor is the U.S. traffic system, and the 3D processor is three or more U.S. traffic systems stacked atop each other and expected to coordinate traffic between levels.  He says the problem is even tougher because the layers are different, so it's more like stacking the U.S., China, and India, where traffic laws differ, atop each other.

The advantages, however, are clear: special-purpose processors designed for functions like MP3 encoding could each occupy a particular layer.  Professor Friedman predicts that a 3D processor in a device such as the iPod could be a tenth the current processor's size with ten times the speed.

While the chip uses many standard processor design tricks, it also uses new ones to account for the different impedances that might occur from chip to chip, different operating speeds, and different power requirements.  It was also uniquely manufactured at MIT, through a technique in which millions of holes were drilled in the insulation between layers, allowing virtually every transistor to be connected, if desired, to those above or below it.

The future, Professor Friedman says, is vertical.  He states, "Are we going to hit a point where we can't scale integrated circuits any smaller? Horizontally, yes.  But we're going to start scaling vertically, and that will never end. At least not in my lifetime. Talk to my grandchildren about that."


So what can this chip do?
By ChronoReverse on 9/16/2008 1:18:06 PM , Rating: 2
1.4GHz really doesn't tell us anything if we don't know the capabilities of the chip

RE: So what can this chip do?
By quiksilvr on 9/16/2008 1:19:14 PM , Rating: 2
I was thinking the same thing. Is it like 1.4 GHz with respect to Core 2 Duo processors or Pentium 4?

RE: So what can this chip do?
By bobsmith1492 on 9/16/2008 1:22:35 PM , Rating: 2
More like a 486, maybe?

RE: So what can this chip do?
By an0dize on 9/16/2008 1:36:15 PM , Rating: 1
I'm sure it would blow both of them away at that speed if they got it fully functional...

RE: So what can this chip do?
By Digimonkey on 9/16/2008 2:31:02 PM , Rating: 2
In real-world performance? Not likely; these demo chips usually come with a minimal instruction set. It still would've been nice to see how many FLOPS it could do.

RE: So what can this chip do?
By Pavelyoung on 9/19/2008 1:28:11 AM , Rating: 3
About as many as a politician. Which means that for a demo chip it's got to be pretty impressive.

By Shining Arcanine on 9/17/2008 7:36:59 PM , Rating: 2
1.4 GHz refers to the number of clock cycles per second. It says little about the performance.

RE: So what can this chip do?
By JasonMick (blog) on 9/16/2008 1:23:15 PM , Rating: 2
It can assimilate you :)

j/k. The press release was very vague, but I'm guessing each layer performs different basic math. The big deal is that there are more interconnects between the chips and no synchronization issues.

RE: So what can this chip do?
By Myg on 9/16/2008 2:54:28 PM , Rating: 2
Heh, actually you're not far off there.

As we offset more human work with CPU-based stuff, the more computers will embed themselves into our lives and take over many functions.

"Resistance is futile, we are progress... I mean the Borg"

RE: So what can this chip do?
By grath on 9/16/2008 9:11:49 PM , Rating: 5
...many standard processor design tricks, it also uses new ones to account for different impedances that might occur from chip to chip...

Resistance was futile, so we tried impedance, but they adapted.

[Rim Shot]

RE: So what can this chip do?
By mmntech on 9/16/2008 1:24:41 PM , Rating: 3
It's more of what it is (or its potential) than what it does right now. The process can squeeze more transistors into the same footprint. Therefore, you can take a processor using the same fab techniques as today but double performance by simply stacking transistors on top of each other. Cooling is going to be the big problem with this though.

RE: So what can this chip do?
By Bateluer on 9/16/2008 1:40:07 PM , Rating: 2
Interesting proof of concept design, but I still want my Quantum Computer.

RE: So what can this chip do?
By R0B0Ninja on 9/18/2008 7:54:41 AM , Rating: 2
Can't you just cope with an optical computer for now?

By Mojo the Monkey on 9/24/2008 12:45:32 PM , Rating: 2
Yeah, it'll have to wait until I can have my petri dish rat brain multi-core computer up and running.

...they can already control remote cars, why not my outlook schedule?

By Cobra Commander on 9/16/2008 2:28:39 PM , Rating: 5
It's a proof of concept in terms of design and fabrication, as far as I'm concerned.

Its performance is irrelevant.

RE: So what can this chip do?
By BladeVenom on 9/16/08, Rating: -1
RE: So what can this chip do?
By FaceMaster on 9/16/08, Rating: -1
RE: So what can this chip do?
By Flunk on 9/16/2008 5:23:54 PM , Rating: 3
The 1.4 GHz is irrelevant too; this is a proof of concept. It's not like you can just plug any processor into your current computer system and run Windows on it. The vast majority of processor designs are completely incompatible with each other.

What it does show us, though, is that this is fully possible, and that in itself is so fabulous it could allow us to have hundreds or thousands of cores easily. Now Intel and the other big chipmakers just need to figure out how to design and fabricate 3D chips and we will be able to buy them at the local computer store. Barring any show-stopping problems we could be using 3D chips within 10 years. Maybe less if they use stacked designs first (DRAM and NAND chips are already available stacked).

RE: So what can this chip do?
By slayerized on 9/16/2008 5:40:21 PM , Rating: 3
Professor Friedman predicts that a 3D processor in a device such as the iPod could be tenth the current processor's size with ten times the speed.

This is a very loose statement with very little insight. I am not sure what he means by 1/10th the size; does he mean Si footprint? I am not sure how he assumes there will be miniaturization, with the same amount of functionality, using the same process. I do understand they might be offloading some routing overhead to the vertical dimension, yet that doesn't answer the question of size. Speed will increase because of reduced interconnect latencies, but by 10 times? I am not quite sure.

Also, a device like the iPod is not necessarily the best example. The functionality in the iPod is achieved by board-level integration of functionalities, as opposed to chip-level integration. All this being said, I think what they are doing is cool, although I am not sure they are the first to come up with something like this. IBM has done some 3D ICs in the past as well.

RE: So what can this chip do?
By fic2 on 9/16/2008 6:24:12 PM , Rating: 2
IBM has done some 3D-ICs in the past as well.

I think IBM's have just been layers of the same chip with only an interconnect in the vertical direction. This seems to be talking about having transistors in the vertical direction, which would make it the only "true" 3-D processing cube.

RE: So what can this chip do?
By cheetah2k on 9/16/2008 10:28:07 PM , Rating: 2
I'd be interested to see how AMD can make this work with their Fusion concept.

I live for the day when GPUs integrate seamlessly with CPUs in a 3D stackable environment with ultra-low bus interconnect latencies.

By Sulphademus on 9/16/2008 1:53:00 PM , Rating: 5
Cooling this thing should be interesting.

Are they going to need to carve tunnels for heat pipes? Obviously the transistors at the center are going to get a lot hotter than those nearer fresh air. Could integrated CPU/heatpipe+heatsinks be needed here?

RE: Cooling
By Chadder007 on 9/16/2008 2:53:56 PM , Rating: 3
That's what I was wondering. Wouldn't the inner layers get uber hot?

RE: Cooling
By Myg on 9/16/2008 3:00:18 PM , Rating: 2
I suspect this technology will be held back for a good while until they figure out a sufficient way to cool the lot without having to create a heatsink inner-structure that would multiply the die area significantly.

RE: Cooling
By SiN on 9/16/2008 3:11:32 PM , Rating: 2
I suspect the stacking of the layers would be the first priority ;P

RE: Cooling
By Shining Arcanine on 9/20/2008 1:03:40 AM , Rating: 2
These are silicon cubes. I think you mean die volume.

RE: Cooling
By Fenixgoon on 9/17/2008 12:44:52 AM , Rating: 2
I think it was IBM that laid out a schematic for liquid cooling on a 3D chip. DT might even have run an article on it. Basically, it involved creating a network of electrically insulating channels that could run the cooling liquid around the transistors.

RE: Cooling
By Myg on 9/17/2008 3:25:07 AM , Rating: 2
It would have to be a pretty heavy liquid though? I'm sure in an environment like that, water would turn to steam pretty quickly.

Can it?
By GoodBytes on 9/16/2008 2:18:01 PM , Rating: 2
Soo... does that mean it will be able to not only show the solutions but also how to get there, so that I can recopy the work and get 100% on my math assignments? :)

RE: Can it?
By grath on 9/16/2008 9:05:44 PM , Rating: 4
Bah! We programmed our graphing calculators to do that almost two decades ago.

Welcome to 1990, now go do your homework!

RE: Can it?
By jlips6 on 9/17/2008 12:01:48 AM , Rating: 3
That "bah!" amused me to no end.

I actually have a friend who had to retake a semester of analysis because he was using a program he wrote to do his homework. He was pretty pissed about that, especially because his "if the program works, then it is proof that I learned the concept" argument didn't work.
Having a computer do your work can be looked at from both sides.
One side is especially prominent when you are taking your SATs.

The moral is: never tell anybody you don't do your work.

moore's law
By MRsnufalufagus on 9/16/2008 2:27:23 PM , Rating: 4
I hate when I see references to the idea that Moore's Law is going to come to a halt. The reason I hate it is that it means that people out there don't really understand what it is.

It is not a phenomenon of silicon. It is a phenomenon of economics, gently nudging technology forward. It comes from the fact that to win a marathon, you only have to come in first; you do not get any additional benefit from winning by a larger margin. Big computer R&D budgets are aimed at sufficient improvements in technology, not revolutionary ones, just as pharma R&D budgets are aimed at more effective AIDS medication, not a cure. (And who can blame them for being human and seeking attainable rewards? They are not gods, and we should not hold them to that standard.)

When 2D silicon is made obsolete by 3D silicon, or quantum computers, or whatever else comes along, there could be a little bump in the curve like we saw with the VAX, but in the long run the curve will continue in the same direction for as long as the economy of our species does.

RE: moore's law
By granulated on 9/16/2008 5:32:43 PM , Rating: 2
Ahhh, thanks for that.

I get so annoyed by the way we are drip-fed technology improvements.

RE: moore's law
By typo101 on 9/19/2008 10:37:20 AM , Rating: 2
Although I agree Moore's "Law" was based on the economics of the industry, can we really ignore the fact that there seem to be some physical limitations to the scaling of our current IC technology? If breakthroughs like this do not occur, all the economics in the world will not be able to keep up with Moore's Law.

Cooling not really an issue
By wetwareinterface on 9/16/2008 3:24:51 PM , Rating: 2
Cooling this wouldn't be a real big problem.
Just leave a gap in the horizontal plane of the transistors and lay down a conductive material to allow channeling heat via the Peltier effect to a "heatsinked" side of the cube. The Peltier channel could double as a power rail, too.

Integrate it further along the vertical axis to channel heat to areas that are unused at the moment. Heat will jump the transistors, so you could leave gaps in the Peltier material to allow voltage to flow across transistor arrays and still pull heat away to the outside edge.

The downside is the cost of the additional material laid down on silicon vs. pure silicon. It would probably need to be carbon nanotubes to work properly.

RE: Cooling not really an issue
By Tesseract on 9/16/2008 4:55:32 PM , Rating: 2
Except that the interesting point of this article was that they could have millions of interconnects between layers, which wouldn't be possible through your Peltier channel.

RE: Cooling not really an issue
By rudolphna on 9/16/2008 6:51:19 PM , Rating: 2
I think a better way to do it would be to have the socket on one side. Around the other five, have a heatsink that fits snugly over top of it and slides down over it. With thermal paste, of course. There would be like three times the surface area with which to conduct heat away from the processor.

RE: Cooling not really an issue
By Don Tonino on 9/17/2008 12:56:48 PM , Rating: 2
For maximum heat exchange, a thin plate is the best possible shape, as it allows the highest surface-to-volume ratio. Going 3D could be an extremely good thing for many reasons, but cooling surely won't be one of them (as long as we think in terms of traditional cooling...).

A possible way to achieve a very high surface-to-volume ratio, though it would make for somewhat longer connections, would be to arrange the whole structure as a Menger sponge and use some liquid cooling...
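The plate-versus-cube point checks out with a quick back-of-the-envelope calculation (the 20 × 20 × 1 mm die dimensions below are illustrative assumptions):

```python
def surface_to_volume_plate(w: float, d: float, t: float) -> float:
    """Surface-to-volume ratio (1/mm) of a w x d x t rectangular plate."""
    surface = 2 * w * d + 2 * (w + d) * t
    return surface / (w * d * t)

def surface_to_volume_cube(volume: float) -> float:
    """Surface-to-volume ratio of a cube holding the given volume."""
    side = volume ** (1 / 3)
    return 6 * side ** 2 / volume   # simplifies to 6 / side

plate = surface_to_volume_plate(20, 20, 1)   # thin 20x20x1 mm die
cube = surface_to_volume_cube(20 * 20 * 1)   # cube of equal volume
print(f"plate: {plate:.2f}/mm  cube: {cube:.2f}/mm")
```

For these dimensions the plate exposes roughly 2.7 times more surface per unit volume than an equal-volume cube, which is the crux of the cooling objection.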

What about getting the heat out?
By psychobriggsy on 9/16/2008 1:27:58 PM , Rating: 2
The first step might involve simple stacking at the wafer level, fabbing multi-core CPUs above each other, for example. Then the actual gate- and transistor-level stuff might become more vertical as well. The hardware description languages (Verilog, etc.) would need to be updated to understand and generate 3D designs, and to build upon core libraries of 3D components.

Heat is the biggie, however. If a 2D chip can generate so much already, then layering them on top of each other is a recipe for an inferno, even if you cut speeds and voltages because you are gaining in implementation area.
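A rough sketch of that stacking-heat arithmetic, using the usual dynamic CMOS power relation P ∝ C·V²·f (the layer count and voltage/frequency cuts below are made-up illustrative values):

```python
def dynamic_power_scale(v_scale: float, f_scale: float) -> float:
    """Relative dynamic CMOS power after scaling supply voltage
    and clock frequency: P ~ C * V^2 * f."""
    return v_scale ** 2 * f_scale

layers = 4                                  # hypothetical stack depth
per_layer = dynamic_power_scale(0.8, 0.7)   # cut V by 20%, f by 30%
stack_vs_single = layers * per_layer

# Even after the cuts, ~1.79x the heat of one full-speed layer now
# has to escape through the same 2-D footprint.
print(f"{stack_vs_single:.2f}x")
```

Even aggressive voltage and frequency cuts leave the power density through the footprint well above a single 2D layer, which is why the comments here fixate on cooling.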

A two-layer chip could rotate a CPU so that cores were above cache, and vice versa. Maybe there would be a way of including 'cooling rods' in the manufacturing process, so that the 'cube' would have metal pins on the side for cooling purposes.

What does this save, though? Each layer of fabbing costs, although presumably the wafer itself is the major cost, so reducing the number of wafers required by fabbing multiple times on top could lead to significant savings, in theory.

In addition, you need to handle the case where certain layers are flawed while still keeping some value from the non-flawed layers. Probably not the most difficult issue with this technology, however.

RE: What about getting the heat out?
By theapparition on 9/16/2008 3:39:14 PM , Rating: 2
This doesn't sound all that much different from what's being done now. If my memory serves me correctly, the Athlon64 design was something like 9 layers. Interconnect between the layers was limited, though. I think the only advance here is that they are able to significantly increase the interconnects between layers.

It will truly be a 3D chip when they're able to dope silicon in three axes.

By psychobriggsy on 9/17/2008 10:51:07 AM , Rating: 3
It actually turns out to be better to connect separately fabricated dies vertically using through-silicon vias. Nothing 3D at all in the fabrication stage; it's all packaging and alignment, and presumably some logic to control access to the vias.

By Polynikes on 9/16/2008 1:51:57 PM , Rating: 1
They're talking about using specific "levels" of the chip for certain tasks, much like different cores can be used for different processes, or having multiple threads per core.

We still haven't gotten multi-core and multi-threading into the mainstream for many applications; do we really need another complication thrown in just yet?

Maybe not, but I'm looking forward to how this tech develops.

It's funny, I'm from Rochester, and U of R is just another school around here to me, but every now and then they're in the news for something big like this and I gain new respect for the school. Guess they're not "just another school."

RE: Wow...
By Silver2k7 on 9/22/2008 1:20:22 AM , Rating: 2
That's where multicore will end up anyway, with probably at least one or several special-purpose cores.

critical ground being made
By eyebeeemmpawn on 9/16/2008 1:53:32 PM , Rating: 2
This is a very critical area for the future of semiconductors. Newer, smaller devices are at the point where designs are constrained by wiring. Opening a whole new axis for interconnection will be welcomed, I'm sure.

By Gzus666 on 9/16/2008 2:23:57 PM , Rating: 2
Hey, that is my hometown. Sounds like a novel idea, guess we will see when they get this off the ground.

My CPU...
By shin0bi272 on 9/16/2008 7:18:11 PM , Rating: 2
is a neuro-net processor... a learning computer.

Let me know when they hook 15 or 30 of these up to a single interface and get it to be superconducting at room temperature. See image below ;)

By kylegtheassman on 9/16/2008 7:58:36 PM , Rating: 2
It's going to be a good future and a very fast one; in time, it will get faster and faster and better.

By dflynchimp on 9/17/2008 5:59:44 AM , Rating: 2
How long until they start making computer chips the shape of human brains? I could really use an upgrade one of these days...

About cooling
By JonConstantine on 9/19/2008 6:47:29 AM , Rating: 2
Maybe a structure similar to a menger (Menger-Sierpinski) sponge ?

why not build it in 3d..
By Silver2k7 on 9/22/2008 1:15:26 AM , Rating: 2
If you just add all the parts you want to have, then think about how you want to build it, you don't have *different traffic systems*. Think of a special-purpose circuit as a sphere somewhere in the cube; it's spread out among the different layers.

tater chip
By RoberTx on 9/22/2008 6:05:08 AM , Rating: 2

IBM 3D cooling
By catavalon21 on 9/22/2008 9:18:11 PM , Rating: 2
They claimed they used water...

Good start out of the gate.
By Innocent Hawk on 9/16/2008 11:34:39 PM , Rating: 1
1.4 GHz is actually enough to impress me. It seems like many times a new technology reaches proof of concept and makes headlines, only to reveal that it lags far behind what is possible today. An important stepping-stone for the future, definitely, but the distance these technologies need to go to close the gap with current tech (which is still advancing further and further ahead) can seem gargantuan. 20 years away minimum.

This 3D processor actually has an impressive start at 1.4 GHz. That's 400 MHz faster than the minimum requirement to run Vista Premium. Given the speed at which technology advances (see the various "Moore's Law" comments above), 3D processors could catch up to or even surpass the speeds of current processors in under 10 years. It all depends on how willing manufacturers and the public are to adopt this new technology.

Grammar Police
By ajdavis on 9/17/2008 9:41:48 AM , Rating: 1
The number of grammar mistakes/typos in this post is atrocious.

Gr8... but
By swizeus on 9/16/08, Rating: -1
By qrhetoric on 9/16/08, Rating: -1
RE: wow
By icanhascpu on 9/16/08, Rating: -1
RE: wow
By codeThug on 9/23/2008 12:49:03 AM , Rating: 1
