
High performance light-based computers on the horizon

Researchers at Intel and the University of California, Santa Barbara have announced the world's first Hybrid Silicon Laser, or HSL -- a silicon-based laser-emitting device. According to Intel, creating a laser-emitting silicon chip is a breakthrough that will propel the world of computers into the light-based transmission era.

The key material, indium phosphide, emits light when voltage is applied. Intel researchers were able to integrate indium phosphide into traditional silicon chip manufacturing techniques, thereby creating a silicon-indium phosphide hybrid chip -- one that can process traditional electrical signals and transmit laser light. The laser light generated by an HSL chip could be used to transmit data and thus power other silicon photonic devices, Intel said.

“Silicon photonics is a critical part of tera-scale computing, as we need the ability to move massive amounts of data on and off these very high performance chips,” claimed Intel Chief Technology Officer Justin Rattner. Intel said that HSL could bring about terabit-capable transmission processors that are low cost and easy to produce, making computers many times more powerful than those we use today. The technology, however, is still a number of years off.

Currently, silicon chips can detect, route and even modulate light, Intel said, but the problem is getting silicon chips to produce it. Intel is taking indium phosphide lasers, commonly used in other industries, and opening up new types of applications. Voltage is first applied to the HSL; the indium phosphide element then produces light, which enters a silicon waveguide to create continuous laser light. This technique also keeps the production cost of HSL devices low. According to Intel:

The hybrid silicon laser is a key enabler for silicon photonics, and will be integrated into silicon photonic chips that could enable the creation of optical “data pipes” carrying terabits of information. These terabit optical connections will be needed to meet the bandwidth and distance requirements of future servers and data centers powered by hundreds of processors.
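To put "terabits of information" in perspective, here is a rough back-of-envelope sketch (my own illustrative numbers, not Intel's) of how quickly an idealized 1 Tbit/s optical data pipe could move a payload:

```python
# Back-of-envelope check: how long to move a payload over a 1 Tbit/s
# "data pipe"? Idealized -- ignores protocol overhead and latency.

def transfer_time_seconds(payload_bytes, link_bits_per_sec=1e12):
    """Time to push payload_bytes through an ideal link."""
    return payload_bytes * 8 / link_bits_per_sec

# Moving a 4 GiB payload over a 1 Tbit/s link:
payload = 4 * 2**30          # 4 GiB in bytes
t = transfer_time_seconds(payload)
print(f"{t * 1000:.1f} ms")  # roughly 34 ms
```

At those rates, feeding hundreds of processors from shared memory or storage stops being a link-speed problem and becomes a latency and protocol problem.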

The potential applications for HSL chips are truly exciting. The industry in general has been talking about laser- or light-based electronics for a number of years already. With development from a company like Intel -- and hopefully others like AMD -- the industry is getting the push it needs. With multi-core processors now mainstream, computers will only get faster. HSL devices will drive the future of computing, Intel said, and things are only looking brighter. Communications technology already uses a fair amount of laser electronics, and as the technology is refined, desktop computers and notebooks will adopt it in the next few years as the limits of traditional silicon are reached.

Comments

RE: whew
By retrospooty on 9/18/2006 6:17:21 PM , Rating: 2
"future looks promising"

what we need are apps that take advantage of the speed. As of now there are very few games/apps that truly utilize the CPUs that we have now. This looks like WAY overkill unless something drastically changes.

RE: whew
By Zandros on 9/18/2006 6:51:54 PM , Rating: 2
Perhaps one day we could have just one computer in each house running, say, five virtualized operating systems, so each family member can have their own computer -- without really having one each.

Or we'll just buy a bunch and take some world records in F@H. :)

RE: whew
By Gatt on 9/19/2006 12:13:07 AM , Rating: 2
I've been thinking that for a while. An octo-core server supporting displays throughout the house for anything from on-demand TV to gaming, simultaneously.

The death of consoles that are nearing their physical limits, and PCs morphing into an invisible server that handles a variety of chores.

It's coming; even MS is banking on it, which is why they're spending obscene amounts of money to push Sony into releasing things before Sony's ready, as they have with the PS3 and BR. Sony was the only one that could've beaten them to it.

RE: whew
By Tyler 86 on 9/18/2006 7:00:46 PM , Rating: 5
The demand is already there for supercomputing...

As for apps that take advantage of the speed, you should take note:
Graphics, audio, and general purpose processors all have a limit in size, scale, and speed on typical electricity-conducting silicon.

3D graphics in particular has yet to hit a wall in utilization.
In fact, it's becoming easier to feed these increasingly parallel, massive, and compressed architectures enough juice/data to accomplish what a designer expresses...

General purpose CPUs, however, are not best suited to being designed for 'full' utilization -- that may seem strange, but take the Pentium 4 or D for example.
Its streamlined NetBurst architecture allowed for very impressive specialized feats at high clock speeds and reasonable prices.
But it absolutely neglected its "general purpose" role, and thus, frankly, it sucked.

Intel has since changed their ways. Instead of streamlining and specializing whatever can be single-tracked -- similar to (early generation) graphics processors -- they're putting more effort into abstraction, generalization, general purpose prediction, and dynamic resource/capability management.

Lasers instead of electrons might be a cheap path to clock speed, so long as they keep their current, very objective view on general purpose processing -- but I'm sure they're not preaching optical inter-chip communication for the absolute immediate future for a reason. Nonetheless, the future does look promising; after all, light is much faster, and more stable (or rather, coherent), than electrons.

As you probably know, on-chip memory, or cache, takes up a sizable portion of most modern processors -- sometimes as much as half to 2/3rds, or in the case of the Itaniums, 3/4ths.

If on-chip memory and cache could be taken off-chip, the die would be smaller, and each individual die would be cheaper because more dies could be produced per wafer.
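The die-size-to-cost link can be sketched with a common first-order dies-per-wafer estimate (a textbook approximation with made-up die areas, not Intel figures):

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """First-order estimate of usable dies on a round wafer:
    gross area divided by die area, minus an edge-loss term.
    (A standard approximation; the numbers below are illustrative.)"""
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

# Halving die area (e.g. by moving cache off-chip) on a 300 mm wafer:
print(dies_per_wafer(300, 140))  # larger die
print(dies_per_wafer(300, 70))   # smaller die: more than double the dies
```

Because the edge-loss term shrinks along with the die, halving the die area yields slightly more than twice as many candidate dies, before yield effects are even counted.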

The main issues with off-chip memory are latency and bandwidth -- that is why cache and on-chip memory exist.
Another issue is the scale and complexity to which the memory bus has become accustomed.

RDRAM, as a serial interface (versus parallel DDR(2,3...) SG/SD-RAM), overcame that complexity obstacle by not requiring all traces to be timed absolutely perfectly.
Alternate serial technologies reduce the obstacle by transmitting over a minimal number of densely defined traces, with more effort in implementation -- a la HyperTransport.

An optical memory bus could allow fairly cheap, parallel, perfectly timed, massive amounts of data to be transferred to off-chip memory and other destinations.

Yes, it's overkill for now, but something drastic happening is hardly 'new' in the electronics industry.

We will get more applications that take advantage of the speed and possibilities, but we already have a few applications that 'truly utilize' even the latest processors -- even if you don't think much of them; Folding@Home-like applications, for example.

To reiterate the previous poster, the future looks promising.

... and the new post coming in during my entry, yes, F@H...

RE: whew
By Xavian on 9/18/2006 7:23:12 PM , Rating: 2
true, however imagine if this technology could be applied to GPUs; lightspeed GPUs and interconnects could transfer massive amounts of information in a very short amount of time. Combine this with flash- or holographic-based hard drives and a lot of bottlenacks that exist in current PCs will be eliminated.

The future is bright, the future is lightspeed :P

RE: whew
By Xavian on 9/18/2006 7:26:57 PM , Rating: 2
bottlenecks not bottlenacks :)

RE: whew
By kamel5547 on 9/18/2006 10:05:56 PM , Rating: 2
Um... tell that to our modelers who currently run their models for 3 days. Conroe (the E6600 model) provides about a 40% cut in times over a 640.

Anyhow, the particular app is single-threaded and cannot be multi-threaded; each calculation depends on the previous calculation, which means only increases in CPU performance will help. Basically the CPU is pegged at 100% for the entire time, and memory usage is tiny (100 MB or so), so what really matters is CPU speed.
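The loop-carried dependence described here can be sketched in a few lines (my illustration, not the poster's actual model):

```python
# Each iteration consumes the previous result, so there is no independent
# work to hand to other threads -- only a faster core shortens the run.

def iterate(x0, steps):
    """Sequentially apply the logistic map; step i needs step i-1."""
    x = x0
    for _ in range(steps):
        x = 3.9 * x * (1 - x)   # x[i] depends entirely on x[i-1]
    return x

print(iterate(0.5, 1_000_000))
```

No amount of extra cores helps a chain like this; the serial dependency is the whole workload.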

There are tons of specialized programs that can use additional processing power from the CPU. Most commercial applications (and by commercial I mean things 90% of consumers wouldn't buy) do use a lot of processing power.

True, consumers don't have much, but then it's not all about the consumers, is it?

RE: whew
By Tyler 86 on 9/19/2006 6:45:18 PM , Rating: 2
Single-threaded and cannot be multithreaded? Hogwash. 2+2+2 can be (unnecessarily) multithreaded... and in a large enough application, such as 3D modeling, such multithreading can actually accumulate benefit (unlike the 2+2+2 example)...
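The point being made is that an associative reduction like a big sum has no loop-carried dependence, so it can be chopped into independent chunks and recombined -- a quick sketch (mine, purely illustrative; Python threads won't actually speed this up because of the GIL, so it's the structure, not the speedup, that matters):

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(data, workers=4):
    """Split an associative reduction into independent chunks,
    reduce each chunk in its own worker, then combine the partials."""
    chunk = (len(data) + workers - 1) // workers
    parts = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum, parts))

print(parallel_sum(list(range(1_000_001))))  # same answer as the serial sum
```

Contrast this with a chain where each step needs the previous result: there, no decomposition exists, which is exactly the disagreement in this subthread.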

Automatic parallelization in the likes of the Intel compiler may benefit you significantly.

Which application is this you speak of? I cannot think of any graphical rendering application that is single-threaded and isn't outdated...

Perhaps you're just modeling under budgeted circumstances?

'Most' commercial applications do not use much processing power.

Most, excluding the major applications, such as SQL servers, web servers, mail servers, graphical rendering applications, mathematical applications, accounting applications, and computer-aided design applications...

"True consumers" don't have much? "Not about the consumers"? What?

RE: whew
By Calin on 9/19/2006 4:38:49 AM , Rating: 2
I vote for faster components/interconnects on the mainboard. Removing all the electrical interference on a mainboard could do wonders to the speed at which you could run processor-to-memory or processor-to-northbridge or even processor-to-processor lines.

RE: whew
By rushfan2006 on 9/19/2006 11:24:44 AM , Rating: 2
what we need are apps that take advantage of the speed. As of now there are very few games/apps that truly utilize the CPUs that we have now. This looks like WAY overkill unless something drastically changes.

I hear what you are saying. However, don't wish for hardware advances to ever ease up -- I say keep the pace going, make the huge leaps in technology... increase speeds constantly. Personally I'd like to see wireless video technology, and then the ultimate: either wireless power, or somehow shrinking the thickness of a power cable to the diameter of a mouse cord. My goal/point with this -- less clutter and more "freedom" to move/place our ENTIRE system anywhere, not just the mouse and keyboard. Game makers (at least I know for games -- I'm kind of a PC gaming fan) INTENTIONALLY develop on the most widespread base of hardware that meets the minimum performance requirements of the game while also allowing the largest audience to have access to the product (said game).

It's not like the developers aren't "smart" enough or advanced enough to make use of the technology. It's all about the bottom line... and they do demographics research all the time to find out "what is the average gaming consumer's rig like?"

Despite all the folks on these boards who seem like they are into the latest hardware... the VAST majority of people do not rush out and buy the latest video cards, CPUs, or mobos.

RE: whew
By Tyler 86 on 9/19/2006 7:13:17 PM , Rating: 2
Ergonomics are nice, at times, even essential.

However, 'wireless power' is already here in several forms, from plasma arcs to Tesla's work -- the problem is that power without conduction is unpredictable, and potentially destructive.

Nanites may some day form a dynamic airborne network of insulators and conductors to channel 'wireless' power, but it'll have very little to do with optics, I believe.

Power cables can be the diameter of a mouse cord easily; just strip the plastic insulation off - but be warned, it is there for a reason.

While freedom is a definite plus for human interface components, and lasers may have a few contributions in the area, none of the goals or points you specify have much to do with this particular development.

Software developers develop upon whatever they want to develop software upon - profit driven ones tend to orient themselves towards the market, naturally.

I agree wholeheartedly that developers can make use of the technology, and that it is all about the bottom line, and that the research you mention is indeed significant in their decision upon the maximums and minimums of their products - It's only logical.

Nonetheless, a large portion, possibly even the majority, of people that play the major video game developers' games have high-end hardware, despite your claim to the contrary.

Here's an example on videocards alone;
338,317 out of 624,014, 54.21% ( 12.27% margin of error ), have significantly high-end videocards. ( from )
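Taking the quoted counts at face value (the survey source is elided above, and the figures are the poster's, not mine), the usual 95% normal-approximation margin of error comes out near 0.12%, which suggests the "12.27%" above may be a misplaced decimal:

```python
import math

# Proportion and 95% normal-approximation margin of error for the
# quoted survey split (poster's raw counts, taken at face value):
n, k = 624_014, 338_317
p = k / n
margin = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"{p:.2%} +/- {margin:.2%}")   # about 54.22% +/- 0.12%
```

With a sample this large, the sampling error is tiny; the real uncertainty is whether the sample represents gamers generally.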

People that do not play high-end videogames with their lesser video cards do not tend to purchase many high-end videogames. I wonder why... sarcastically...

RE: whew
By Tyler 86 on 9/19/2006 10:51:17 PM , Rating: 2
I take back that bit about having nothing to do with optics; I just remembered an ionizing laser experiment... it is possible to conduct electricity using such a laser... however, it's still a mite unpredictable.


Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki