



a) Cleaved coupling section of a bent waveguide bus and a resonator; b, c) delay line consisting of several ring resonators
Researchers have effectively delayed light's travel on a silicon chip

IBM has announced that its researchers have built a device capable of delaying the flow of light on a silicon chip, which could lead to the further development of using light instead of electricity to transfer data. Researchers have long known that using optical rather than electrical signals to transfer data within a computer chip could yield significant performance gains, since light signals can carry more information, faster. The engineering challenge is buffering data on the chip, which is difficult given light's speed. Thus, one means of using light effectively is to delay its travel.

Long delays can be achieved by passing light through optical fibers. IBM scientists were able to delay light by passing it through a new form of silicon-based optical delay line made up of as many as 100 cascaded "micro-ring resonators," fabricated using current silicon complementary metal-oxide-semiconductor (CMOS) tools. When an optical waveguide is curved to form a ring, light is forced to circle multiple times, delaying its travel. The optical buffer device based on this simple concept can briefly store 10 bits of optical information within an area of 0.03 square millimeters. This advance could eventually allow hundreds of such devices to be integrated on a single computer chip, an important step toward on-chip optical communications.
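To get a feel for the scale, the total delay of such a cascade is roughly the number of rings multiplied by the optical path light accumulates inside each one. Here is a back-of-the-envelope sketch; the ring diameter, group index, and round-trip count are illustrative assumptions, not figures from IBM's paper:

```python
import math

# Back-of-the-envelope estimate of the delay from cascaded micro-ring
# resonators. All parameters are illustrative assumptions, not the
# actual figures from IBM's device.

C = 299_792_458.0  # speed of light in vacuum, m/s

def cascade_delay_s(n_rings, diameter_um, group_index, round_trips):
    """Total group delay if light circles each ring `round_trips` times
    before hopping to the next ring in the cascade."""
    circumference_m = math.pi * diameter_um * 1e-6
    per_ring_s = round_trips * circumference_m * group_index / C
    return n_rings * per_ring_s

# 100 rings, 10 um diameter, group index ~4 (silicon), ~25 round trips:
delay = cascade_delay_s(100, 10.0, 4.0, 25)
print(f"total delay ~ {delay * 1e12:.0f} ps")  # ~1000 ps
# At 10 Gb/s (100 ps per bit), ~1000 ps buffers on the order of 10 bits,
# the same order as the figure quoted above.
```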

"Today's more powerful microprocessors are capable of performing much more work if we can only find a way to increase the flow of information within a computer," said Dr. T.C. Chen, vice president of Science and Technology for IBM Research. "As more and more data is capable of being processed on a chip, we believe optical communications is the way to eliminate these bottlenecks. As a result, the focus in high-performance computing is shifting from improvements in computation to those in communication within the system."

The report on this work, "Ultra-compact optical buffers on a silicon chip," is published in the premiere issue of the journal Nature Photonics. This work was partially supported by the Defense Advanced Research Projects Agency (DARPA) through the Defense Sciences Office program "Slowing, Storing and Processing Light."






There is more to this than meets the eye
By Nyne on 12/27/2006 4:33:48 PM , Rating: 2
I can't say much, since this is my field and it would break certain agreements I have signed, but I can hint, so listen and follow the clues.

Look at the current tech being developed right now: in late '07 and early '08, laser television. Material that can bend light, giving the illusion of invisibility. Controlling the speed of light and its density.

I wish I could say more, but the rest must be left to the imagination. Well, here is one more clue: what do Star Wars, Star Trek and Halo have in common? This advancement is but a mere beginning to the next decade. By 2017 we will be laughing at what used to be now.




RE: There is more to this than meets the eye
By nurbsenvi on 12/28/2006 8:30:54 AM , Rating: 3
quote:
In late '07 and early '08, laser television. Material that can bend light, giving the illusion of invisibility. Controlling the speed of light and its density.


OK, summing up all the clues you leaked here, I came to a very important conclusion...




mmm...I want donuts....

just kidding

The real answer is:

In 10 years we will be watching a laser TV on the couch, wearing an invisibility suit, through a scope that delays and darkens your vision... so when your wife comes into the house she won't know if you are there; the only thing she will notice is a scope floating in thin air and a TV that's on while no one is watching.

You haven't said jack there, mate... just spill the beans like The Inquirer.






RE: There is more to this than meets the eye
By Nyne on 12/28/2006 9:15:01 AM , Rating: 2
This is stupid of me and I could get fired, but I will let DailyTech and you guys know.

We are currently working on a project where we take a cube of light-bending material and are experimenting with laser-guided paths that will result in a third-dimensional television. The result looks like Star Wars chess for now, or like the computer holograms in Halo. We believe that one day, with light-slowing technology, it will be possible to design something like Star Trek's holodeck; our research team is currently working on that. You should have 3D TV as soon as late '09, mostly for gaming though, because we can easily display a 3D animation in 3D. We are also experimenting with 2D-to-3D for games and such, but right now anything 2D is displayed like a rear projector on a clear screen. It will be amazing, and hopefully by twenty-oh-nine we will have the price down to 3 or 4 thousand.


By nurbsenvi on 12/28/2006 9:41:10 AM , Rating: 2
Thanks for sharing that info;
very generous and brave of you.

Now, where did I put the phone number for the US Patent Office...

Well, I think the technology is good, but I can't see it as a domestic application; more of a military or 3D CAD modeling application, maybe? Industrial design will benefit quite a lot, I think.

But it would be cool to have musicals like Cats played out on a 300-inch 3D holographic projector in your house.


By Ringold on 12/29/2006 4:12:50 AM , Rating: 2
As a Star Wars geek... for 3-4 grand, in "late 09" dollars, assuming I don't find myself unemployed or otherwise set back, I think I would pick me up one of these boondoggles. It'd make for an interesting piece! It could run that SW chess game, or the game from the Battle School in the book Ender's Game, or plain old chess.

Or, of course, it would probably do a fine job of looping a scene of a woman pole dancing. I'll save that app for when the "guys" come over.

I'd love the holodeck thing if not for the whole not-being-able-to-touch-anything problem. Unless, of course, you've really got some brilliant guys and have figured out how replicators work, too.


By gibletsqueezer on 12/29/2006 5:35:23 AM , Rating: 2
3D displays for games by 2009, eh? Gee, these guys (plus countless others) must be ahead of the curve: http://www.ddd.com/index-2.html

But keep up the holodeck work; just don't let your mother see what you're doing. She just might stop you playing with your lightsaber.


By cocoman on 1/19/2007 11:27:39 AM , Rating: 2
Read the Best of CES 2007 coverage at AnandTech. They already have your 3D TV working at the show.


Human Beings are entering the Light Era
By Senju on 12/27/2006 7:11:39 PM , Rating: 5
Look at the breakthroughs of the last year: bending light, controlling its density, controlling its speed and direction. We are entering a new era, moving from electronic to light-based control systems. We will not see the effects tomorrow, but exciting new applications will come to market in the next ten years and will change our lives even more than we can imagine. I think I will save this entry and read it ten years from now... :D




RE: Human Beings are entering the Light Era
By Sharky974 on 12/27/2006 9:53:16 PM , Rating: 1
Yeah, sure. None of that works, and you can pretty much always count on science to fail.

I thought we were supposed to have the common cold (let alone much worse diseases) cured by now, too. That's what the papers told me...


By evildorf on 12/27/2006 11:32:54 PM , Rating: 4
"...and you can pretty much always count on science to fail."

Yes, that makes perfect sense. I look through history, and all I see is science failing. This whole Internet thing? Created in a day by a series of happy coincidences. Good thing science didn't have anything to do with it. Might have mucked it up.


By zsdersw on 12/28/2006 9:47:55 AM , Rating: 2
quote:
Yeah, sure. None of that works, and you can pretty much always count on science to fail.


You're either a complete idiot.. or you were being sarcastic/facetious.

To everyone else: Any bets on which it is?


By gibletsqueezer on 12/29/2006 5:42:09 AM , Rating: 2
the "common cold" is not a disease you moron. Stop listening to papers and read something


wow
By slickr on 12/27/2006 11:35:56 AM , Rating: 1
Still, it would be 2012 before this technology is actually capable of doing something and, more importantly, ready for sale!
Hah, how did I miss the chance to buy IBM stock when it was low? Instead I bought 9% in Intel, which would have been around 20% at IBM.




RE: wow
By rushfan2006 on 12/27/2006 12:08:47 PM , Rating: 2
LOL... I think you should clarify what you mean when you said "I bought 9% in Intel"... that statement would imply, at first glance, that you own 9% of Intel's stock... that would/should/is worth MILLIONS of dollars.

Unless you are in fact very wealthy already... then my point would just be "GOOD GOD MAN!!!"... :)



RE: wow
By KristopherKubicki (blog) on 12/27/2006 12:11:32 PM , Rating: 2
quote:
that would/should/is worth MILLIONS of dollars.

Billions


RE: wow
By deeznuts on 12/27/2006 1:09:50 PM , Rating: 2
quote:
Billions


$10.5B, to be more precise.


RE: wow
By Samus on 12/27/2006 7:08:51 PM , Rating: 2
I own 0.00034% of Intel's volume, and that's still a few thousand. I wish I had a full percent. ;)


RE: wow
By Nyne on 12/27/2006 4:36:17 PM , Rating: 1
No, you will see products using this as soon as the end of '07, and computers using it by late 2009.


haha
By sprockkets on 12/27/2006 11:42:43 AM , Rating: 1
Love the picture you put there from Star Wars. Classic.




RE: haha
By starcutter on 12/27/2006 1:44:36 PM , Rating: 2
I think another key point here that cannot be missed is what this will mean for improving thermal efficiency over current processor technology. Today's top processors dissipate in excess of 120 watts.

Low-k dielectrics, SOI, and ever-shrinking lithographic processes all help to address current leakage and other drivers of inefficiency, but electron/hole current flow through silicon is (at this point) reaching its limits. Using photons instead of electrons should open a whole new methodology for logical processing. Speeds should improve and, more importantly, thermal limitations should be drastically reduced.
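For a sense of why switching is the thermal wall, the usual first-order relation is that dynamic power scales as activity times capacitance times voltage squared times frequency. A minimal sketch with made-up but plausible numbers:

```python
# First-order dynamic (switching) power: P ~ activity * C * V^2 * f.
# The numbers below are made up but plausible, purely for illustration.

def dynamic_power_w(activity, switched_capacitance_f, vdd_v, freq_hz):
    """Classic CMOS dynamic-power estimate."""
    return activity * switched_capacitance_f * vdd_v ** 2 * freq_hz

# ~20% switching activity, ~100 nF effective capacitance, 1.3 V, 3 GHz:
print(f"{dynamic_power_w(0.2, 100e-9, 1.3, 3e9):.0f} W")  # ~100 W
```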


RE: haha
By ChipDude on 12/27/2006 6:32:56 PM , Rating: 1
Ha ha is right.

We're decades away from leveraging on-chip optical interconnect, if it ever happens. At the very best we'll see board-level routing. After you add in the conversion from electrical to optical and back, much of the benefit in speed will evaporate. You'll still see significant benefit in impedance matching and in power conservation when driving long board-level RC lines. But I have a sad fact for you: the majority of the power currently burned in a chip isn't for I/O signal driving but for on-chip signal switching, and this will do nothing for that.

Lots of hype from IBM that I'm sure has burned billions in R&D and gotten them lots of papers, but not one dollar of income. You're better off avoiding IBM stock.


RE: haha
By mino on 12/28/2006 6:39:08 AM , Rating: 4
R&D is what IBM is(was?) about for the better half of a century.

Also one thing to remember. Those generous R&D budgets of IBM, HP, TI and the likes is the stuff by which US of A pays for the oil and consumables it gets from the whole outside world.

Go on, go for your profits byt cutting "unneccesary" R&D. A few decades on that route and China will have a nice laugh at the "glory" of the US of A.

Consider that every single moment you bash anyone for having too big of an R&D fo you to swallow.

Of course, in case you not US, all that moral stuff goes into closet for you.

/me not US based BTW...


DailyTech Physicists Strike Again!
By dcollins on 12/27/2006 9:57:55 PM , Rating: 2
Once again, I'm flabbergasted at the "well, this is still X years away from a real product, who cares?" comments. Computing devices were first theorized in the 1890s, and it wasn't until the advent of silicon devices in the '70s that this work was used to create a usable product. And it was another 20 years before consumers saw the fruits of these labors.

Light-based computing may be 10-20 years out and may never materialize, but if no one performed such experimental R&D, we'd never have the hard-working, practical computing devices we have now.

Don't quote me on the dates, I didn't feel like wikipediaing the specifics. :)

--Dac




RE: DailyTech Physicists Strike Again!
By masher2 (blog) on 12/28/2006 5:53:24 AM , Rating: 1
Actually, computing devices were first theorized in the late 1700s. Babbage built a (nonworking) computing device in the early 1800s...20 years later, working models were being sold. And electronic programmable computers were first built in the 1940s, not the 1970s.


By fumar on 12/29/2006 3:14:31 AM , Rating: 2
You are correct about the first working computers. I believe one was used for ULTRA (the project dedicated to decoding the German Enigma code).


How big is the film(?) that is used to create CPUs?
By nurbsenvi on 12/28/2006 8:10:00 AM , Rating: 2
I've always wondered about this:

How big is the lithography film (mask?) that is used to create CPUs?

Does one film contain an image covering the whole wafer, or just one chip?

Anyone?




By Hawkido on 12/28/2006 10:46:14 AM , Rating: 2
I believe the litho MASK is huge, like 5 to 10 times the size of the wafer, and there are multiple masks that the light is filtered and focused through to etch the wafer. The entire wafer is etched at the same time; any other way would result in more botched chips due to contaminants created during the etching process. I am not a chip engineer, but maybe there is one on here who can correct anything I said that is wrong; the bits and pieces I have read suggest this is close. I hope someone posts a more accurate description, as I would like some confirmation or more/better info myself.
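To put rough numbers on it, one standard description (offered here as context, not a claim about any particular fab) is a step-and-repeat tool: the mask, or reticle, holds the pattern for one exposure field drawn about 4x larger than the printed image, and the tool steps that field across the wafer. A quick sketch with typical assumed dimensions:

```python
import math

# Rough sketch of step-and-repeat lithography coverage. The wafer and
# field dimensions below are typical assumed values, not taken from
# any specific fab or from the comment above.

WAFER_DIAMETER_MM = 300            # a common modern wafer size
FIELD_W_MM, FIELD_H_MM = 26, 33    # a typical maximum exposure field
REDUCTION = 4                      # reticle pattern drawn 4x larger

wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
field_area = FIELD_W_MM * FIELD_H_MM

print(f"~{wafer_area / field_area:.0f} exposure fields per wafer "
      f"(ignoring edge loss)")
print(f"reticle pattern: {FIELD_W_MM * REDUCTION} x "
      f"{FIELD_H_MM * REDUCTION} mm")
```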


By xbdestroya on 12/27/2006 12:15:26 PM , Rating: 3
Good article about Japanese research into quantum computing, and their own recent achievements with light:

http://www.theregister.co.uk/2006/12/21/japanese_l...




Here's the original...

You forgot, or considered unnecessary, to include this-

Quote-


About IBM Research Division

IBM Research is the world's largest information technology research organization, with about 3,000 scientists and engineers in eight labs in six countries. IBM has produced more research breakthroughs than any other company in the IT industry. For more information on IBM Research, visit http://www.research.ibm.com

Photos available at:

http://domino.research.ibm.com/comm/pr.nsf/pages/n...

Additional information on silicon nanophotonics available at: http://www.research.ibm.com/photonics
posted by SQL at 12:53 PM

Unquote-

This post was made by SQL on Friday, December 22, 2006

Quote-

IBM Milestone Demonstrates Optical Device to Advance Computer Performance

IBM (NYSE: IBM) today announced its researchers have built a device capable of delaying the flow of light on a silicon chip, a requirement to one day allow computers to use optical communications to achieve better performance.

Researchers have known that the use of optical instead of electrical signals for transferring data within a computer chip might result in significant performance enhancements since light signals can carry more information faster. Yet, "buffering" or temporarily holding data on the chip is critical in controlling the flow of information, so a means for doing so with light signals is necessary. The work announced today outlines just such a means for buffering optical signals on a chip.

"Today's more powerful microprocessors are capable of performing much more work if we can only find a way to increase the flow of information within a computer," said Dr. T.C. Chen, vice president of Science and Technology for IBM Research. "As more and more data is capable of being processed on a chip, we believe optical communications is the way to eliminate these bottlenecks. As a result, the focus in high-performance computing is shifting from improvements in computation to those in communication within the system..............

IBM scientists were able to meet this size restriction and achieve the necessary level of control of the light signal by passing it through a new form of silicon-based optical delay line built of up to 100 cascaded "micro-ring resonators," built using current silicon complementary metal-oxide-semiconductor (CMOS) fabrication tools. When the optical waveguide is curved to form a ring, light is forced to circle multiple times, delaying its travel. The optical buffer device based on this simple concept can briefly store 10 bits of optical information within an area of 0.03 square millimeters. That's 10 percent of the storage density of a floppy disk, and a great improvement compared to previous results. This advancement could potentially lead to integrating hundreds of these devices on one computer chip, an important step towards on-chip optical communications
http://technology-news-earnings.blogspot.com/

Unquote-
Hey, this is not your article - you did not write it.




Duh
By Fnoob on 12/28/2006 7:38:07 PM , Rating: 2
""Today's more powerful microprocessors are capable of performing much more work if we can only find a way to increase the flow of information within a computer," said Dr. T.C. Chen, vice president of Science and Technology for IBM Research"

Ummm, duh. Anyone else wonder why we have 3+Ghz processors bottlenecked with a 66-133Mhz PCI bus?
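The arithmetic behind that complaint is easy to check: a parallel bus's peak theoretical bandwidth is its width in bytes times its clock rate. A minimal sketch using the standard published PCI figures:

```python
# Peak theoretical bandwidth of a parallel bus: width (bytes) x clock.
# The widths and clocks below are the standard published PCI figures.

def peak_mb_per_s(width_bits, clock_mhz):
    return width_bits / 8 * clock_mhz

for name, width, clock in [
    ("PCI 32-bit / 33 MHz", 32, 33),
    ("PCI 64-bit / 66 MHz", 64, 66),
    ("PCI-X 64-bit / 133 MHz", 64, 133),
]:
    # This peak is shared by every device on the bus.
    print(f"{name}: ~{peak_mb_per_s(width, clock):.0f} MB/s")
```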





Slow Light
By immortalstarmaster on 12/29/2006 12:40:29 PM , Rating: 2
OK, more information. That's great! But when are we going to stop using the two-wheeled cart and come up with a four-wheeled cart? The Chinese went a couple of thousand years without thinking, "Hey, let's add two more wheels and we can carry a much larger load!" We're now guilty of the same mentality. The computer model we are using is still a Univac with a few more ports. When are we going to add two more wheels to the cart and, for good measure, try "independent suspension" to protect the axles? Or are we still afraid of Dr. Strangelove?




"The Space Elevator will be built about 50 years after everyone stops laughing" -- Sir Arthur C. Clarke










