


IBM's new optical interconnect may shape the future of supercomputers -- and all the other ones, too

A growing issue with the new gamut of multi-core processors, from dual-core to quad-core and beyond, is shuttling data between the cores themselves. Problems arise as the bleeding edge of electrical and thermal physics is pushed by the vanishing space between transistors, and now between cores. IBM may have something of a solution, it recently announced, in the form of an optical coupler.

Due to the wholly unfair restrictions of physics, there is only so much information that can be sent through copper interconnects. Heat and leakage both contribute to this, as smaller structures are much more prone to self-destruction caused by thermal disintegration and interference from their neighbors in the silicon sea.

Light, it turns out, doesn't have much of an issue with either. IBM thinks it will provide a means to safely connect the growing number of cores being packed onto a single chip.

The optical modulator, known as a Mach-Zehnder electro-optic modulator, is actually much simpler than it sounds. If you're familiar with Digital Light Processing (DLP) television technology, you already understand the basics. In IBM's modulator, a laser is fed into the device, which is also connected to the copper side of the data bus. Rather than relying on reflection the way DLP does, the modulator splits the incoming light into two paths and uses the electrical binary signal to shift the phase of one of them; when the paths recombine, the light either passes or cancels out, acting as a fast shutter that turns the input laser into pulses. The pulses are then sent along a waveguide to a receiver and translated back into an electrical signal.
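To make the shutter analogy a little more concrete, here is a minimal toy model in Python of the principle at work, not IBM's actual design: the electrical bit sets the phase difference between the modulator's two arms, and the interference when the arms recombine either passes or cancels the light.

import math

def mach_zehnder_output(bit, input_power=1.0):
    # The electrical bit sets the phase difference between the two arms.
    # bit 1 -> arms in phase (delta_phi = 0), light passes
    # bit 0 -> arms out of phase (delta_phi = pi), light cancels
    delta_phi = 0.0 if bit else math.pi
    # Recombined output intensity follows cos^2(delta_phi / 2).
    return input_power * math.cos(delta_phi / 2) ** 2

# An electrical bit stream becomes a train of optical pulses.
bits = [1, 0, 1, 1, 0, 1]
pulses = [mach_zehnder_output(b) for b in bits]
print(pulses)  # [1.0, ~0.0, 1.0, 1.0, ~0.0, 1.0]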

The modulator's waveguide structure and the overall size of the device are both limited by the wavelength of the light used. IBM's current-generation coupler has a 500 nm waveguide, narrower than the wavelength of the laser light it carries, and the device itself is a mere 200 micrometers long. This makes it easily small enough to fit between the cores of a current-generation multi-core processor.
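As a rough sense of scale, here is a quick unit check on those figures; the laser wavelength is not given in the article, so the 1,550 nm telecom-band value below is purely an illustrative assumption:

# Rough unit check on the quoted dimensions (wavelength assumed for illustration).
wavelength_nm = 1550            # assumed telecom-band laser, not stated in the article
waveguide_nm = 500              # quoted waveguide dimension
device_length_nm = 200 * 1000   # 200 micrometers expressed in nanometers

print(waveguide_nm / wavelength_nm)       # ~0.32, i.e. a sub-wavelength waveguide
print(device_length_nm / wavelength_nm)   # ~129, i.e. many wavelengths long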

The device is many times more efficient than current bus technology. IBM bills its speed at 100 to 1,000 times that of electrical interconnects, with energy consumption of 50 milliwatts or less. The theoretical speed of the interconnects could virtually eliminate anything resembling an information bottleneck as we know it now. And as any student of science can tell you, drawing less power means generating less heat, which in turn means less energy wasted.
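For a back-of-the-envelope feel for that power budget, here is a quick energy-per-bit estimate; the article gives no line rate, so the 10 Gb/s figure below is a hypothetical assumption rather than an IBM number:

# Energy-per-bit estimate under an assumed line rate (not an IBM figure).
power_watts = 0.050       # quoted power budget: 50 mW or less
line_rate_bps = 10e9      # hypothetical 10 Gb/s line rate, for illustration only

energy_per_bit = power_watts / line_rate_bps
print(energy_per_bit)     # 5e-12 J, i.e. about 5 picojoules per bit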

The device, as IBM sees it, will in time put the power of a supercomputer on a chip that feels at home in a laptop. Reduced heat production and electricity costs will benefit not only laptops and other micro-devices, but businesses with giant servers as well. Less space and less heat equate to less money spent on maintaining expensive climate-controlled server environments.

The technology is nowhere near ready for integration, unfortunately. IBM projects a 10- to 15-year development period for the system. Eventually, rather than the racks full of dual- and multi-core processors sported by supercomputers like Blue Gene/L, we could see a computer many times more powerful housed in something the size of a current mid-tower ATX case.


Comments



By KingstonU on 12/9/2007 5:58:21 PM , Rating: 2
They are currently at 45nm, and 32nm is being developed now to be ready for a 2009-2010 release. Then it's 22nm around 2011-2012.

I remember reading that the smallest transistor node possible is 18nm, because any smaller and electron transfer begins to produce errors as the electrons interfere with each other, or something.

So we will need this new optical technology to be the next step in advancing computers.




By Master Kenobi (blog) on 12/9/2007 6:09:41 PM , Rating: 4
I think Intel already figured that out. If you take a look at the technical details of their new QuickPath board architecture, it works over electrons or photons. It seems like the engineers went out of their way to accommodate both, more than likely for this exact reason. They see the limitations coming and are already maneuvering themselves to work past them.


By cheetah2k on 12/9/2007 10:16:38 PM , Rating: 2
You make it sound like Intel are using "The Force"?

Yoda: "Always in motion is the future"


By Master Kenobi (blog) on 12/10/2007 9:34:28 AM , Rating: 4
It's "Always in motion, the future is..." Learn your quotes boy!


By masher2 (blog) on 12/10/2007 11:40:48 AM , Rating: 2
Always good to get in the day's dosage of anastrophe.


By AntiM on 12/9/2007 6:25:53 PM , Rating: 2
I wonder if we'll ever see some sort of optical transistor?
Unlike today's transistors that only have two states (on or off), this device could have many states. It could output a certain wavelength of light depending on what wavelength was at its input. It could have thousands of states, just like we have thousands or millions of colors on our displays. Binary computing will seem primitive with such a device.


By Master Kenobi (blog) on 12/9/2007 6:35:18 PM , Rating: 1
They refer to such a device as a "Quantum Computer"...

Read, and be enlightened.
http://en.wikipedia.org/wiki/Quantum_computer


By AntiM on 12/9/2007 8:24:25 PM , Rating: 4
A Quantum Computer operates on a different principle than what I'm talking about. I'm talking about using wavelengths of light instead of ones and zeros. A quantum computer works on the principle that an atom can exist in many different states at the same time.


By mars777 on 12/10/2007 4:48:33 AM , Rating: 2
A particle can have a quantum couple somewhere in space; that doesn't mean it exists in different places. If we go down that road, we should say that a particle exists at every point of space at the same time :D


By RyanHirst on 12/10/2007 4:05:41 PM , Rating: 2
quote:
If we go down that road, we should say that a particle exists at every point of space at the same time :D

It's worth noting that this apparently absurd idea (a single-photon universe) is nevertheless a valid interpretation of the mathematics of relativity.
It's a pretty idea that has no functional relevance. E.g. 'is a plane really just a single point because mapping it onto the Riemann sphere implies the point at infinity?' The question has no meaning. The fact that we can map a plane in such a way that its extension to infinity is represented by a point doesn't turn the PLANE at infinity into a point.

Still, it's fun to think about. Everything we see, just one photon everywhere at once.


But the software
By kyleb2112 on 12/10/2007 1:58:07 AM , Rating: 4
I hope software designers catch the multi-core bug soon, or we're going to have a lot of cores looking for work.




RE: But the software
By DeepBlue1975 on 12/10/2007 12:53:58 PM , Rating: 2
Maybe something like a "core vacation programme" could be introduced, so that unemployed, or partially employed cores can go and enjoy themselves in their spare time.

You know, something like playing soccer with electrons, getting a bath of low-amperage DC current, taking the BUS to travel down memory lane, writing a binary postcard to their friend devices that live on the other side of the board, and so on.

Could be fun. Wish I were a multicore processor to have such kind of holidays.


shrinkage
By melgross on 12/9/2007 5:28:14 PM , Rating: 4
No matter how much the circuits may shrink, so that we can have that supercomputer in a box, there will always be a need for massive machines.

When we get a Petaflop on the desktop, or even in our phones, the TRUE supercomputers will be many thousands, or millions, of times faster still, and far bigger.




finally!
By nerdye on 12/10/2007 11:07:18 AM , Rating: 2
I have been waiting a long time for this type of announcement involving optics used as interconnects for CPUs and other hardware. Thank you, IBM!




RE: finally!
By initialised on 12/10/2007 11:45:53 AM , Rating: 2
This idea is not new, just on the threshold of being necessary. I looked into doing clocking with optics for my degree project in 2001. I think it's great for the system bus but not good enough for intra-chip communications. The main limitation is the size of the laser and receiver.

Just think how the current PC layout limitations could be stripped away if all the components were on a single optical bus.


LMAO
By Fnoob on 12/10/2007 11:23:08 AM , Rating: 2
"Due to the wholly unfair restrictions of physics"

So true. Gravity pisses me off all the time.




RE: LMAO
By Dfere on 12/11/2007 9:29:27 AM , Rating: 2
Wait till you get older.


When oh when will I get my mini-itx?
By bupkus on 12/9/2007 11:35:00 PM , Rating: 3
quote:
we could see a computer many times more powerful housed in something the size of a current mid-tower ATX case.


When oh when will I get my mini-itx?




Cool
By Runiteshark on 12/10/2007 12:09:12 PM , Rating: 2
Now IBM can buy AMD and teach them how not to suck. Seriously though, they can afford it.

I don't get why everyone thought Ruiz was so great during the 939 days either. The research and the direction the company was set in were the work of Jerry Sanders, not Ruiz.




Wavelength
By PlasmaBomb on 12/10/2007 1:54:15 PM , Rating: 2
You do realise that 200 microns isn't half of 500 nanometers, right?




Intel is First?
By baldun on 12/11/2007 12:56:15 AM , Rating: 2
How is this different from Intel's High-Speed Silicon Modulation work published back in 2004?

http://www.intel.com/technology/magazine/silicon/s...




Amusing tangent thought
By eman 7613 on 12/9/07, Rating: 0
"We’re Apple. We don’t wear suits. We don’t even own suits." -- Apple CEO Steve Jobs













