


High-performance light-based computers on the horizon

Researchers at Intel and the University of California, Santa Barbara have announced the world's first Hybrid Silicon Laser, or HSL -- a silicon-based, laser-emitting device. According to Intel, creating a laser-emitting silicon chip is a breakthrough that will propel the world of computers into the light-based transmission era.

The key material, Indium Phosphide, has properties that allow it to emit light when voltage is applied. Intel researchers were able to integrate Indium Phosphide into traditional silicon chip manufacturing techniques, thereby creating a silicon-Indium Phosphide hybrid chip -- one that can process traditional electrical signals and emit laser light. The laser light generated by an HSL chip could be used to transmit data and thus drive other silicon photonic devices, Intel said.

“Silicon Photonics is a critical part of tera-scale computing, as we need the ability to move massive amounts of data on and off these very high performance chips,” said Intel Chief Technology Officer Justin Rattner. Intel said that HSL could lead to terabit-capable transmission processors that are low cost and easy to produce, making computers many times more powerful than those we use today. The technology, however, is still a number of years off.

Currently, silicon chips can detect light, route light and even modulate light, Intel said, but the problem is getting silicon chips to produce light. Intel is taking Indium Phosphide lasers commonly used in other industries and putting them to new use. Voltage is first applied to the HSL; the Indium Phosphide element then produces light, which enters a silicon waveguide to create continuous laser light. This technique also keeps production costs of HSL devices low. According to Intel:

The hybrid silicon laser is a key enabler for silicon photonics, and will be integrated into silicon photonic chips that could enable the creation of optical “data pipes” carrying terabits of information. These terabit optical connections will be needed to meet the bandwidth and distance requirements of future servers and data centers powered by hundreds of processors.
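To put the terabit claim in rough numbers -- a back-of-the-envelope sketch using illustrative figures, not specifications from Intel's release -- aggregate bandwidth scales with the number of lasers times the data rate each one carries:

# Illustrative assumption only: 25 on-chip lasers each carrying a 40 Gbit/s
# channel is one plausible way to reach a terabit-class optical "data pipe".
lasers = 25
rate_per_laser_gbps = 40
aggregate_gbps = lasers * rate_per_laser_gbps
print(aggregate_gbps / 1000, "Tbit/s")   # 1.0 Tbit/s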

The application potential for HSL chips is truly exciting. The industry has been talking about laser- or light-based electronics for a number of years already. With development from a company like Intel -- and hopefully others like AMD -- the industry is getting the push it needs. With multi-core processors now mainstream, computers will only get faster. HSL devices will drive the future of computing, Intel said, and things are looking brighter still. Communications technology already uses a fair amount of laser electronics, and as the technology is refined, desktop computers and notebooks could adopt it within the next few years as the limits of traditional silicon are reached.


Comments



whew
By caater on 9/18/2006 6:02:55 PM , Rating: 2
optical interconnects on the motherboard.. gotta love that.
future looks promising.




RE: whew
By retrospooty on 9/18/2006 6:17:21 PM , Rating: 2
"future looks promising"

what we need are apps that take advantage of the speed. As of now there are very few games/apps that truly utilize the CPUs that we have now. This looks like WAY overkill unless something drastically changes.


RE: whew
By Zandros on 9/18/2006 6:51:54 PM , Rating: 2
Perhaps one day, we could have just one computer in each house, running, say, five virtualized operating systems, so each family member can have their own computer without actually having their own machine.

Or we'll just buy a bunch and take some world records in F@H. :)


RE: whew
By Gatt on 9/19/2006 12:13:07 AM , Rating: 2
I've been thinking that for a while. An octo-core server supporting displays throughout the house for anything from on-demand TV to gaming simultaneously.

The death of consoles that are nearing their physical limits, and PCs morphing into an invisible server that handles a variety of chores.

It's coming, even MS is banking on it, which is why they're spending obscene amounts of money to push Sony to release things before Sony's ready, as they have with PS3 and BR. Sony was the only one that could've beat them to it.


RE: whew
By Tyler 86 on 9/18/2006 7:00:46 PM , Rating: 5
The demand is already there for supercomputing...

As for apps that take advantage of the speed, you should take note:
Graphics, audio, and general purpose processors all have a limit in size, scale, and speed on typical electricity-conducting silicon.

3D graphics in particular has yet to hit a wall in utilization.
In fact, it's becoming increasingly easy to feed these increasingly parallel, massive, and compressed architectures enough juice/data to accomplish what a designer expresses...


General purpose CPUs, however, are not best suited when designed to be used 'fully' - that may seem strange, but take the Pentium 4 or D for example.
Its streamlined NetBurst architecture allowed for very impressive specialized feats at high clock speeds and reasonable prices.
However, it absolutely neglected its "general purpose" role, and thus, frankly, it sucked.

Intel has since changed their ways; now, instead of trying to streamline and specialize specifically what can be single-tracked, streamlined, and specialized, similar to (early generation) graphics processors, they're putting more effort into abstraction, generalization, general purpose prediction, and dynamic resource/capability management.

Lasers instead of electrons might be a cheap path to clock speed, so long as they keep their current, very objective view on general purpose processing - but I'm sure they're not preaching optical inter-chip communication for the absolute immediate future for a reason. Nonetheless, the future does look promising; after all, light is much faster, and more stable (or rather, coherent), than electrons.

As you probably know, on-chip memory, or cache, takes up a sizable portion of most modern processors - sometimes as much as half to two-thirds, or in the case of the Itaniums, three-quarters.

If on-chip memory and cache could be taken off-chip, the die size would be smaller, and thus each individual die would be cheaper, because more dies could be produced per wafer.
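(A rough sketch of that cost argument, with made-up die and wafer sizes rather than real Intel figures: gross dies per wafer scales inversely with die area, so halving the die roughly doubles the dies.)

import math

# Illustrative numbers only: compare a die with large on-chip cache against a
# hypothetical die that moves that cache off-chip.
wafer_diameter_mm = 300
wafer_area_mm2 = math.pi * (wafer_diameter_mm / 2) ** 2

for die_area_mm2 in (200, 100):   # full die vs. die with cache moved off-chip
    gross_dies = int(wafer_area_mm2 // die_area_mm2)   # ignores edge loss and yield
    print(die_area_mm2, "mm^2 die ->", gross_dies, "gross dies per wafer")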

The main issues with off-chip memory are latency and bandwidth - that is why cache and on-chip memory exist.
Another issue with off-chip memory is the scale and complexity to which the memory bus has become accustomed.

RDRAM, as a serial interface (versus DDR(2,3...) SG/SD-RAM as parallel), has overcome that complexity obstacle by not requiring all traces to be timed absolutely perfectly.
Alternate serial technologies reduce the obstacle by transmitting over a minimal number of densely defined traces, with more effort in implementation - a la HyperTransport.

An optical memory bus could allow fairly cheap, parallel, perfectly timed, massive amounts of data to be transferred to off-chip memory, and other destinations.

Yes, it's overkill for now, but something drastic happening is hardly 'new' in the electronics industry.

We will get more applications that take advantage of the speed and possibilities, but we already have a few applications that 'truly utilize' even the latest processors - even if you don't think much of them; Folding@Home-like applications, for example.


To reiterate the previous poster, the future looks promising.

... and the new post coming in during my entry, yes, F@H...


RE: whew
By Xavian on 9/18/2006 7:23:12 PM , Rating: 2
true, however imagine if this technology could be applied to GPUs; lightspeed GPUs and interconnects could transfer massive amounts of information in a very short amount of time. Combine this with flash or holographic based hard drives and a lot of bottlenacks that exist in current PCs will be eliminated.

The future is bright, the future is lightspeed :P


RE: whew
By Xavian on 9/18/2006 7:26:57 PM , Rating: 2
bottlenecks not bottlenacks :)


RE: whew
By kamel5547 on 9/18/2006 10:05:56 PM , Rating: 2
Um... tell that to our modelers who run their models for 3 days currently. Conroe (e6600 model) provides about a 40% cut in times over a 640.

Anyhow, the particular app is single-threaded and cannot be multi-threaded; each calculation depends on the previous calculation, which means only increases in CPU performance will help. Basically the CPU is pegged at 100% for the entire time, memory usage is tiny (100 MB or so), so what really matters is CPU speed.
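(For readers wondering why such a workload can't be spread across cores, here is a minimal sketch of a serially dependent loop; the update function is invented for illustration, not taken from the modeling package being described.)

# Each iteration needs the previous iteration's result, so step N cannot start
# until step N-1 finishes -- extra threads would just wait on each other.
def next_state(x):
    return 0.5 * x + 1.0   # stand-in for one expensive model timestep

state = 1.0
for step in range(1_000_000):
    state = next_state(state)
print(state)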

There are tons of specialized programs that can use additional processing power from the CPU. Most commercial applications (and by commercial I mean things 90% of consumers wouldn't buy) do use a lot of processing power.

True, consumers don't have much, but then it's not all about the consumers, is it?


RE: whew
By Tyler 86 on 9/19/2006 6:45:18 PM , Rating: 2
Single threaded and cannot be multithreaded? Hogwash. 2+2+2 can be (unnecessarily) multithreaded... and with a large enough application, such as 3D modeling, such multithreading can actually accumulate benefit (unlike the 2+2+2 example)...

Automatic parallelization in the likes of the Intel compiler may benefit you significantly.
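(A minimal sketch of the kind of decomposition being described: an associative reduction split into independent chunks that can run on separate cores. The chunking and worker count are illustrative, not tied to any particular modeler.)

from concurrent.futures import ProcessPoolExecutor

def chunk_sum(chunk):
    # each chunk is independent of the others, so chunks can run in parallel
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(4_000_000))
    workers = 4
    size = len(data) // workers
    chunks = [data[i * size:(i + 1) * size] for i in range(workers - 1)]
    chunks.append(data[(workers - 1) * size:])   # last chunk takes the remainder
    with ProcessPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(chunk_sum, chunks)
    print(sum(partials))   # combining the partial sums matches the serial answer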

Which application is this you speak of? I cannot think of any graphical rendering application that is single-threaded that isn't outdated...

Perhaps you're just modeling under budgeted circumstances?

'Most' commercial applications do not use much processing power.

Most, excluding the major applications, such as SQL servers, web servers, mail servers, graphical rendering applications, mathematical applications, accounting applications, and computer-aided design applications...

"True consumers" don't have much? "Not about the consumers"? What?


RE: whew
By Calin on 9/19/2006 4:38:49 AM , Rating: 2
I vote for faster components/interconnects on the mainboard. Removing all the electrical interference on a mainboard could do wonders for the speed at which you could run processor-to-memory or processor-to-northbridge or even processor-to-processor lines.
Great news


RE: whew
By rushfan2006 on 9/19/2006 11:24:44 AM , Rating: 2
quote:
what we need are apps that take advantage of the speed. As of now there are very few games/apps that truly utilize the CPUs that we have now. This looks like WAY overkill unless something drastically changes.


I hear what you are saying. However, don't wish for hardware advances to ever ease up -- I say keep the pace going, make the huge leaps in technology... increase speeds constantly. Personally I'd like to see wireless video technology, and then the ultimate: either wireless power, or somehow shrinking the thickness of a power cable to the diameter of a mouse cord. My goal/point with this -- less clutter and more "freedom" to move/place our ENTIRE system anywhere... not just the mouse and keyboard.

Anyway...software makers (at least I know for games -- I'm kind of a pc gaming junkie..lol)...they INTENTIONALLY develop on the most widespread base of hardware that meets minimum performance requirements of the game while also allowing the largest audience to have access to the product (said game).

It's not like the developers aren't "smart" enough or advanced enough to make use of the technology. It's all about the bottom line... and they do demographics research all the time to find out "what is the average gaming consumer's rig like?"

Despite all the folks on these boards who seem like they are into the latest hardware... the VAST majority of people do not rush out and buy the latest vid cards, CPUs or mobos.



RE: whew
By Tyler 86 on 9/19/2006 7:13:17 PM , Rating: 2
Ergonomics are nice, at times, even essential.

However, 'wireless power' is already here in several forms, from plasma arcs to Tesla's work - the problem is power without conduction is unpredictable, and potentially destructive.

Nanites may some day form a dynamic airborne network of insulators and conductors to channel 'wireless' power, but it'll have very little to do with optics, I believe.

Power cables can be the diameter of a mouse cord easily; just strip the plastic insulation off - but be warned, it is there for a reason.

While freedom is a definite plus for human interface components, and lasers may have a few contributions in the area, none of the goals or points you specify have much to do with this particular development.


Software developers develop upon whatever they want to develop software upon - profit driven ones tend to orient themselves towards the market, naturally.

I agree wholeheartedly that developers can make use of the technology, and that it is all about the bottom line, and that the research you mention is indeed significant in their decision upon the maximums and minimums of their products - It's only logical.

Nonetheless, a large portion, possibly even the majority, of the people that play the games the major video game developers make have high-end hardware, despite your claim to the contrary.

Here's an example on videocards alone;
338,317 out of 624,014, 54.21% ( 12.27% margin of error ), have significantly high-end videocards. ( from http://steampowered.com/status/survey.html )

People that do not play high-end videogames with their lesser videocards do not tend to purchase many high-end videogames. I wonder why... sarcastically...


RE: whew
By Tyler 86 on 9/19/2006 10:51:17 PM , Rating: 2
I take back that bit about having nothing to do with optics; I just remembered an ionizing laser experiment... it is possible to conduct electricity using such a laser... however, it's still a mite unpredictable.


RE: whew
By subhajit on 9/19/2006 3:05:17 AM , Rating: 3
I don't understand one thing though. Even if we use light as the medium for data transmission, shouldn't the transmission speed/rate still be dependent on the switching speed of the silicon transistors? So unless we do something about the switching speed of silicon transistors (which has almost reached its limit), how can we expect a great leap in performance? Perhaps I am missing something.


RE: whew
By Tyler 86 on 9/19/2006 7:21:11 PM , Rating: 2
Transmission speed is dependent on bandwidth, while transmission resolution is dependent on switching speed.

You can potentially pack much more into a beam of light than you can into a column of electrons.

Instead of merely increasing the transistor speed to keep up with the rate at which data is received by such a significantly high-bandwidth bus, also increase its transistor count.

Easily, double, triple, quadruple, etc... the performance.

This is already done without optical interconnects to meet the demands of powerful processors - it can easily be adapted to an optical bus, with a significant plus to performance.
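(Rough arithmetic behind that point, with invented figures: hold the switching speed constant and widen the interface, and throughput grows with the lane count, so an optical bus's extra bandwidth can be absorbed by more parallel transistors rather than faster ones.)

clock_ghz = 2.0                       # transistor switching speed held fixed
for lanes in (64, 128, 256):          # progressively wider interface
    throughput_gbps = lanes * clock_ghz    # assume 1 bit per lane per cycle
    print(lanes, "lanes @", clock_ghz, "GHz ->", throughput_gbps, "Gbit/s")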


Star Trek here we come
By dgingeri on 9/18/2006 6:38:57 PM , Rating: 3
The next step is to get chips that run off the light we send and/or are able to store/retransmit the light as memory functions. Then we're well on our way to the isolinear chips of Star Trek and 10k times the performance.

Transistors will be a thing of the past, an oddity like we think of coal powered steam engines in horseless carriages or vacuum tubes of old electronics.

Hopefully, we'll be able to do this with less power and be able to run fanless computers, finally. I'd love my power bill to actually go down for once.




RE: Star Trek here we come
By Tyler 86 on 9/18/2006 7:05:48 PM , Rating: 3
Light produces heat.
How much heat is produced during light production?
How much electricity is consumed to produce light?

I don't know about your power bill going down, but would you settle for the same 10k performance boost?


RE: Star Trek here we come
By Knish on 9/18/2006 7:15:59 PM , Rating: 3
quote:
Light produces heat.

All radiation produces heat, but this is hardly tangible compared to the heat produced by, say, the oscillation of a transistor a few million times per second.


RE: Star Trek here we come
By Tyler 86 on 9/19/2006 10:46:03 PM , Rating: 2
"The researchers believe that with this development, silicon photonic chips containing dozens or even hundreds of hybrid silicon lasers could someday be built using standard high–volume, low–cost silicon manufacturing techniques."


Dozens or hundreds of lasers, routed by silicon...

It might produce less heat than typical processor solutions, even to the point of being fanless for 1:10 or even 1:50 scale performance... but at 1:10,000 (10K) performance...

Now, literally, do you see where I'm coming from?


RE: Star Trek here we come
By Tyler 86 on 9/19/2006 11:25:45 PM , Rating: 2
I don't know if I made that clear enough...

10,000 electrical processors produce enough net heat to... well, the world may never know, because every such arrangement of processors is cooled excessively... but I assume, provided the processors themselves did not cook off in the process of cooking... they could potentially melt a house or several houses...

To be able to perform the work of 10,000 processors using light... Even if the heat produced is negligible 1:1, at 1:10,000, you're most likely going to need a fan or two... :)
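(The arithmetic behind that, with an assumed efficiency figure rather than anything from Intel: even if light-based logic were 100x more efficient per unit of work, a 10,000x performance target still leaves roughly 100 conventional chips' worth of heat to get rid of.)

electrical_watts = 100          # one conventional processor, say
efficiency_gain = 100           # assume optical logic is 100x more efficient per op
performance_target = 10_000     # the "10K" performance figure from the thread

optical_watts = electrical_watts / efficiency_gain * performance_target
print(optical_watts, "W")       # 10000.0 W -- about 100x one chip's heat output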


So does this mean
By hellokeith on 9/18/2006 6:14:18 PM , Rating: 5
that if you took one of these laser silicon chip computers on a space craft, the faster you go the slower the computer is? (because of relativity) ;)




RE: So does this mean
By dnd728 on 9/18/2006 6:25:58 PM , Rating: 3
not 4 u


RE: So does this mean
By johnsonx on 9/19/2006 12:03:39 PM , Rating: 2
Why did that post get modded up? It makes no sense. Or am I just lacking a sense of humor (it sure doesn't seem funny)?

The whole point of relativity is that everything is relative to your frame of reference (or point of view, if you like). So even if you could achieve relativistic speeds in a space craft, which you can't as of yet, the apparent speed of everything on the craft would be the same from a point of view on the craft.

Second, relativity affects everything, not just light. Indeed, electricity and light are both just different forms of the same thing.

So from a frame of reference outside a spacecraft moving past at a relativistic speed (relativistic speed being a speed above about 0.5c, where effects of relativity become obvious), EVERYTHING on the craft would appear slowed and length-contracted, whether it be man, machine or light-based computer chip.
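(For the curious, the dilation factor in question -- a standard textbook formula, not something from the article -- works out as follows:)

\[
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad
\gamma(0.5c) = \frac{1}{\sqrt{1 - 0.25}} \approx 1.15, \qquad
\gamma(0.9c) \approx 2.29
\]

So to an outside observer, clocks (and chips, optical or electrical alike) on a craft at 0.5c run about 15% slow, and at 0.9c they run at less than half speed.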


RE: So does this mean
By Tyler 86 on 9/19/2006 7:32:53 PM , Rating: 2
You're right, it makes no sense.

It's not relative to relativity, that would break reality.
It is relative to its source and destination.

Yes, it would appear slower than it is, much to the perceiver's chagrin.

It would not truly slow down, at all - the entire craft would merely block, reflect, and refract more light... among other consequences - its contents (provided some form of gravitational and relativistic separation from the outside, 'inertial dampeners') would be unaffected.

However, if affected by the pull of gravity, I presume you would hit a 'frame rate' of sorts, causing uncertain, destructive, or at least hilarious, 'bendiness' as a result...

Example: Am I talking out of my ass? Perhaps I went too fast...


Exciting!
By sceptus on 9/18/2006 7:23:00 PM , Rating: 2
I'm not sure if this will be the eventual technology of future computers, but this sure is still exciting.

Note that powerful hardware has far more widespread and better uses than gaming - labs, for example, would benefit from faster and more powerful computers. For example, I just bought a new computer with new, fast hardware, but it isn't for gaming - it's so I can run my computer algebra systems more quickly, so that Photoshop can operate more quickly, so I can render 3D faster, etc. Of course I might use it for gaming too :)




RE: Exciting!
By PurpleJesus on 9/18/2006 10:13:02 PM , Rating: 2
Imagine hooking these things up between CPUs and holographic memory chips.


Tuan, Spellcheck...
By jskirwin on 9/19/2006 9:47:34 AM , Rating: 2
It's Indium Phosphide, not Indium Phoshide x2.

That said, this is a very significant advance. I wonder if the first appearance would be at the bus, or between the processor and its cache.




RE: Tuan, Spellcheck...
By Tyler 86 on 9/19/2006 11:06:03 PM , Rating: 2
I passed over it previously in blatant disregard.

Nice catch, I did a double take on this one.

To make with the cheeze;
When I pull out my gat, you see my phos-hide.
It's got a laser sight, emitted by indium phos-phide.


Not counting on it
By darkfoon on 9/19/2006 4:59:04 AM , Rating: 1
I'm not counting on this sort of thing happening.

I was hoping something like this would put an end to things like TEMPEST. But the technology probably won't appear for a very long time, or at all. A scientist running the project will have a "fatal accident" and that's all we'll hear about it. The gov't wouldn't want to lose one of the best ways to spy on citizens.

Don't know what TEMPEST is? Here's a practical example of it:
http://www.erikyyy.de/tempest/




RE: Not counting on it
By Bladen on 9/19/2006 5:37:39 AM , Rating: 2
Actually, I think that your government would probably just forcibly or stealthily take info through other, much easier means.

I.e. through your ISP.


sounds hard
By sieistganzfett on 9/18/2006 5:51:55 PM , Rating: 2
hey this sounds hard...




ummm
By Lazarus Dark on 9/18/2006 5:55:59 PM , Rating: 2
wow, I'm as technically proficient as most on here, but that one hurt my brain. Mostly I got that superfast chips produce their own light to communicate; the rest kinda blurs.
Maybe I'm tired.




New?
By Goty on 9/19/2006 3:42:44 PM , Rating: 2
So, is everyone amazed at the fact that someone created a laser on a silicon chip, or the fact that it's a microprocessor using light instead of electrons as the medium for transmission? Because making a laser in that fashion isn't really all that novel. It's basically just the photoelectric effect in reverse.




DLP
By Eris23007 on 9/19/2006 6:44:50 PM , Rating: 2
Apparently no one has considered how to leverage this ability in combination with the already-commercialized ability to switch light, as demonstrated by Texas Instruments' DLP chips.

This actually could be a real first step toward an entirely optical computer.

That said, miniaturizing such a processor to the point where it could do as much as a conventional CPU would take decades. We'll see if anyone bothers to pursue such a course - but it sure is intriguing!




Beauty
By Jesse Taylor on 9/20/2006 11:10:44 AM , Rating: 2
This is a thing of beauty, I just love its crisp lines and it looks so solid. Beautiful.




Intel Cheating?
By pmouse on 9/22/2006 11:16:54 AM , Rating: 2
The title of their press release suggested that they have finally fabricated a silicon based laser.

But in fact, the device is still an InP-based light source, "glued" together with a waveguide.

The point of Si photonics is to make everything with Si in one integrated process, rather than combining all kinds of extra processes to get different materials onto one chip - you run into all sorts of problems doing this, which adds cost.

The way things were done before was to use InP lasers and then couple the light onto waveguide chips. Now Intel has removed the need for coupling (which is tricky too); that's nice, but they are far away from making a real Si-based laser.

Si, as we know it, has an indirect band gap - it can't be used to efficiently generate light. Recent developments in nanotechnology have allowed us to trick silicon into thinking it has a direct band gap, by forming so-called Si nanoclusters. This effect is known as quantum confinement, and the device (mostly when applied to materials other than Si) is called a "Quantum Dot Device". However, this technology still requires extensive research and development. Making Si lasers now is not really a realistic goal; single-wavelength Si LEDs are the next logical step.
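(Brief background on why an indirect gap makes light emission so inefficient -- standard semiconductor physics rather than anything from the press release: an emitted photon carries almost no crystal momentum, so both conditions below must hold, and in silicon the second one requires a phonon.)

\[
E_{\text{photon}} \approx E_g \;(\approx 1.1\,\text{eV in Si}), \qquad
\hbar k_{\text{photon}} \approx 0 \;\Rightarrow\; \Delta k = k_c - k_v \text{ must be supplied by a phonon}
\]

Because silicon's conduction-band minimum and valence-band maximum sit at different k, radiative recombination is a three-body (electron, hole, phonon) event and loses out to non-radiative paths; in direct-gap materials like InP or GaAs, the momentum mismatch is essentially zero and emission is efficient.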

I was really surprised by the title of this article, thinking that Intel had just leaped 20 years forward in time - only to be disappointed after reading their press release.




I've got one better.
By jemtec on 9/23/2006 8:00:24 AM , Rating: 2
Although I didn't actually figure out the way Intel's engineers were able to produce the light beam breakdown pattern, there is another way to improve on this tech.
I came up with a similar idea almost 15 years ago, and someone has already basically come up with the same layout. :(
They need to make a "light spectrum spread array" that can take millions of colors and turn them into individual pathways for information.
Think of it more as a solar panel that absorbs light beams and converts them into data "bits".
I was using this idea for flash memory as well (I called it bubble memory), turning each light color point into a bit transmitter for data storage.
Let's just say today's memory storage is a joke.
Now, apply it to processors, and you will have it made.
Additionally, it can be applied to a new generation of optical data storage that can handle millions of times the storage area of the current best tech.
Imagine the applications for AI?
Food for thought.
Not to sound wacko, but I think of this stuff all the time.
Just wish I'd get a patent on this stuff.
:(





Small weapons grade lasers...
By wingless on 9/25/2006 12:02:52 PM , Rating: 2
It is my dream that one day this technology will be powerful enough to make small, portable, weapons-grade lasers that pack a punch and have great performance-per-watt ability.




the prospect of a new era
By Crazyeyeskillah on 9/19/2006 12:44:34 PM , Rating: 1
I think what we all want to know about this device is whether or not it will be able to accurately replicate the shadow puppets we all embraced in our youth.




Upgrade??
By mac noob on 9/18/06, Rating: -1
RE: Upgrade??
By sceptus on 9/18/2006 7:25:09 PM , Rating: 2
Sorry, you can't. First, the microprocessor for commercial use won't be available for a little while. So there's no point in waiting. And, more importantly, it simply won't be compatible. You would need a new computer anyways.


RE: Upgrade??
By mac noob on 9/18/06, Rating: -1
RE: Upgrade??
By GhandiInstinct on 9/18/06, Rating: -1
"It seems as though my state-funded math degree has failed me. Let the lashings commence." -- DailyTech Editor-in-Chief Kristopher Kubicki











botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki