
An artistic rendering of the new light-driven wireless network in action.  (Source: Boston University)
Could the networks of the future run on light?

Solid-state lighting is one of the hottest topics in the tech industry, and with good reason.  The Department of Energy is sponsoring a $20M USD "L Prize" for advances in LED lighting, a type of lighting that uses solid-state components (diodes).  The research is a big deal, as lighting currently consumes 22 percent of the electricity in the U.S.  If the DOE accomplishes its goal of reducing lighting energy use by 50 percent, it would save billions of dollars and reduce environmental impact.

New research from Boston University's College of Engineering, funded by a National Science Foundation grant, indicates that LEDs may be not only the integral lighting component of the future, but may also form the backbone of future wireless networks.

BU Engineering Professor Thomas Little describes the new research, stating, "Imagine if your computer, iPhone, TV, radio and thermostat could all communicate with you when you walked in a room just by flipping the wall light switch and without the usual cluster of wires.  This could be done with an LED-based communications network that also provides light - all over existing power lines with low power consumption, high reliability and no electromagnetic interference. Ultimately, the system is expected to be applicable from existing illumination devices, like swapping light bulbs for LEDs."

The primary goal of the research is to develop LEDs that do exactly that -- transmit information wirelessly via controlled blinking. 
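The simplest way to carry data on "controlled blinking" is on-off keying (OOK), where each bit maps directly to the light being on or off and the receiver thresholds the detected level. The sketch below is illustrative only; the function names and the ambient-light offset are assumptions, not details from the BU project.

```python
# Minimal on-off keying (OOK) sketch: bits map to light levels,
# and the receiver recovers them by thresholding.

def encode_ook(bits):
    """Map each bit to an LED intensity: 1 -> on, 0 -> off."""
    return [1.0 if b else 0.0 for b in bits]

def decode_ook(levels, threshold=0.5):
    """Recover bits by thresholding received light levels."""
    return [1 if level > threshold else 0 for level in levels]

payload = [1, 0, 1, 1, 0, 0, 1, 0]
# Simulate a constant ambient-light offset on top of the signal:
received = [level + 0.1 for level in encode_ook(payload)]
assert decode_ook(received) == payload
```

Real systems layer error coding and more efficient modulation on top of this, but the on/off principle is the same.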

Little continues, "This is a unique opportunity to create a transcendent technology that not only enables energy efficient lighting, but also creates the next generation of secure wireless communications.  As we switch from incandescent and compact fluorescent lighting to LEDs in the coming years, we can simultaneously build a faster and more secure communications infrastructure at a modest cost along with new and unexpected applications."

Professor Little and his colleagues imagine the room's LED lighting being hooked up to computer circuitry, using the existing lights to implement a wireless network that provides data to computers, personal digital assistants, television and radio reception, telephone connections and thermostat temperature control.  Prototypes of the new network design, according to Professor Little, should start at around 1 to 10 Mbps.  Better yet, bandwidth would be greater than in existing radio frequency (RF)-driven networks.

In the new network, each LED light bulb would act as an access point.  Another perk of the new design is beefed-up security.  Unlike RF networks, the new signal would not pass through walls or other opaque objects.  This would help prevent snooping and connection theft.  The new system would also use much less power than RF, as solid-state lighting is energetically cheaper than the strong radio signals needed for wireless internet.

The flickering that drives the network would be performed so fast the human eye could not see it.  The network would ideally be able to operate outdoors as well as indoors.  The first test deployment may be outdoors, with a likely candidate being automobiles.  Professor Little continues, "This technology has many implications for automobile safety.  Brake lights already use LEDs, so it's not a stretch to outfit an automobile with a sensor that detects the brake lights of the car in front of it and either alerts an inattentive driver or actively slows the car."

While the technology seems very promising, one quandary is how to make the communication bidirectional.  Professor Little and his team have not elaborated on this tricky point yet in the initial press.  In order to send data requests, you would need a means of receiving light from devices such as cell phones or laptops; however, you ideally would want to avoid having a bright blinking transmitter on your device or walls covered in sensors.

Comments

Another problem
By FITCamaro on 10/7/2008 1:33:26 PM , Rating: 4
How does natural light affect this?

While the technology seems very promising, one quandary is how to make the communication bidirectional. Professor Little and his team have not elaborated on this tricky point yet in the initial press. In order to send data requests, you would need a means of receiving light from devices such as cell phones or laptops; however, you ideally would want to avoid having a bright blinking transmitter on your device or walls covered in sensors.

But yeah this was the biggest thing I saw as a problem.

RE: Another problem
By ChronoReverse on 10/7/2008 1:41:36 PM , Rating: 1
You know those blinding ultra-bright blue LEDs they like to put on everything these days? They'll finally serve a real purpose.

RE: Another problem
By Googer on 10/7/2008 11:03:39 PM , Rating: 2
Blinking won't be a problem. Here's an example:

Your CRT TV or computer monitor "blinks" at least 30 or 60 times per second and you never notice it. The data-transmitting lights will blink millions of times per second to transmit data at 10 Mb/s. It's so fast that no living creature should ever notice it.

I'd also like to add that similar technology was demonstrated in the mid-1990s for the retail industry, except they decided to modify and use the existing incandescent lights or fluorescent tubes. If memory serves correctly, IBM and a few universities had their hands in the project.
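The arithmetic behind the "no living creature should notice it" claim checks out quickly. The 90 Hz figure below is an assumed round upper bound for human flicker perception, not a number from the thread.

```python
# Back-of-envelope: at 10 Mbit/s with simple on-off signaling,
# each bit lasts 100 ns, so the light toggles up to 10 million
# times per second -- orders of magnitude past human perception.

bit_rate = 10_000_000            # 10 Mbit/s
bit_period_s = 1 / bit_rate      # 100 ns per bit
flicker_fusion_hz = 90           # assumed perceptual upper bound

assert abs(bit_period_s - 1e-7) < 1e-12
# The blink rate exceeds the perceptual limit by a factor of ~100,000:
assert bit_rate / flicker_fusion_hz > 100_000
```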

RE: Another problem
By HavocX on 10/8/2008 7:19:21 AM , Rating: 2
Lots of people notice 60 Hz flickering, and it drives me crazy. I've read that most people stop being able to notice flickering at around 75 Hz.

RE: Another problem
By MrPoletski on 10/8/2008 1:16:56 PM , Rating: 2
That's because the flickering of a CRT is a little more complex than that. It is *drawn* 60 times per second and the phosphor has a decay time after the electron beam has stopped energising it (converting it to light).

What also flickers at 50 Hz (or 60 Hz in the US) is strip lights. They flicker at mains frequency, and you can only just see it if you look at them out of the corner of your eye (because your eye has better colour perception in central vision and better contrast perception in the periphery).

This light will not be interfered with by daylight per se, because the LED will give out a particular wavelength, so any detector will be able to filter out everything else, leaving the signal.

What is going to be an issue with it is signal interference due to reflections. With your light sensor pointing in the wrong direction, it could pick up multiple diffuse reflections from all directions. The issue is them combining with a large difference in path length and the light interfering destructively. 10 Mbit/s would mean the path difference needs to be less than 30 m, which is doable; 100 Mbit/s, less than 3 m, which is difficult.
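Those path-difference figures fall directly out of dividing the speed of light by the bit rate, a quick sketch:

```python
# Multipath budget: reflections stay tolerable while the spread in
# path lengths is small compared with the distance light travels
# during one bit period.

C = 3.0e8  # speed of light in m/s (rounded)

def max_path_spread_m(bit_rate_bps):
    """Distance light travels during one bit period."""
    return C / bit_rate_bps

assert round(max_path_spread_m(10e6)) == 30    # 10 Mbit/s -> ~30 m
assert round(max_path_spread_m(100e6)) == 3    # 100 Mbit/s -> ~3 m
```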

RE: Another problem
By Myg on 10/8/2008 11:33:46 AM , Rating: 2
"Your CRT TV or computer monitor "blinks" at least 30 or 60 times per second and you never notice it. The data transmitted from the lights, will blink at millions of times per second to transmit data at 10mb/s. It's so fast that no living creature should be ever notice it."

Unless, of course, they play FPS games.

RE: Another problem
By aguilpa1 on 10/9/2008 7:05:49 PM , Rating: 1
As I recall, a CRT flickering at 60 Hz was horrible, and only good CRTs that could hit 75 Hz or higher at high resolution were worth having. I can't imagine being bathed in a room full of lights flickering at 60 Hz..., heck NO! If they are not 75 Hz or higher, there is no way I would ever buy into that.

RE: Another problem
By BarkHumbug on 10/9/2008 3:17:49 AM , Rating: 2
Is that why my D-Link router is shining bright blue? It's transmitting!

RE: Another problem
By MrBlastman on 10/7/2008 1:41:39 PM , Rating: 4
My Track IR, an amazing piece of technology btw, utilizes infrared reflected light and works great. The only problems I have with it are background radiation - sunlight, lamplight or other environmentally created light phenomena.

I fear that this would be a huge drawback to this technology - not to mention how annoying these blinking lights would be in the room.

RE: Another problem
By ChronoReverse on 10/7/2008 1:43:41 PM , Rating: 2
The lights won't be visibly blinking. That's the point.

They'll be the lighting for the room and this potential technology is supposed to hook into those and use it for data transfer at the same time.

RE: Another problem
By Adonlude on 10/7/2008 8:37:44 PM , Rating: 2
But sometimes I like to surf the internet in my living room on the couch with the lights off and the tv on. No more connections in the dark? What about outdoors?

RE: Another problem
By BushStar on 10/8/2008 3:04:24 AM , Rating: 2
Come on, major issue here! When you turn the lights off the network dies ><

Also line of sight between LED access points, when I close a door that room gets cut off from the network...

Pie in the sky IMO

RE: Another problem
By jtemplin on 10/8/2008 12:23:37 PM , Rating: 2
Guys I'm sure if this technology is aggressively pursued they will get it working with infrared and ultraviolet LEDs that emit energy outside our narrow visible spectrum.

RE: Another problem
By jtemplin on 10/8/2008 12:25:13 PM , Rating: 2
I play Wii all the time with the lights off and I don't see the LEDs inside my sensor bar. The wiimote must be sensing their output as you can replace the bar with 2 candles as you may have heard. So...problem solved!

RE: Another problem
By feraltoad on 10/8/2008 5:32:47 AM , Rating: 2
Yeah, I understand what you're saying, but no one likes a blinking light. Especially lights that blink millions of times a second. If a light blinking 15 times a second is pretty annoying, then a light blinking millions of times per second is millions of times more annoying in my book. Whelp, time to go open a soup can with my trusty hammer. I tried using one of those electric jobs, but smash open a couple of cans with 'em and they just go to pieces.

RE: Another problem
By mindless1 on 10/7/2008 4:35:43 PM , Rating: 4
It's not only easily possible to make the blink rate so fast that a human cannot perceive it, it is absolutely necessary for such a high frequency to transmit data at a desirable rate.

RE: Another problem
By Googer on 10/7/2008 10:56:49 PM , Rating: 2
In reality, this technology is nothing new. Very old legacy computers used to have a built-in IR port for wirelessly synchronizing data between two computers or devices (Palm!), connecting peripherals, and a multitude of other uses, like replacing your TV remote with a laptop.

RE: Another problem
By Mr Perfect on 10/7/2008 1:43:22 PM , Rating: 2
The biggest problem I can think of would be line of sight. If your wireless light-beaming device does not have a line of sight to the receiver, your connection is dead.

RE: Another problem
By ChronoReverse on 10/7/2008 1:45:28 PM , Rating: 2
If they're using multiple LEDs as access points, which is likely since it's for lighting the room as well, then the line of sight issue isn't too much of a biggie.

Ultimately it's still how to return information. If it's like satellite Internet where you have a great downlink but a dialup uplink, then this potential technology will be extremely limited.

RE: Another problem
By achintya on 10/7/2008 2:36:37 PM , Rating: 3
Well, this is like a great downlink but NO uplink. At all. So currently the only purpose I can think of for this is broadcasting, or rather narrowcasting.

RE: Another problem
By Solandri on 10/7/2008 4:54:56 PM , Rating: 3
That's not as big a problem as most people think. Light bounces off walls and other stuff. Your TV remote uses an IR LED. Try pointing it at the wall behind you instead of at the TV. Usually it'll still work.

LOS is more important for radio applications, where you're dealing with noise and greater signal degradation. Due to radio's longer wavelength, it passes through many objects we consider solid. Light OTOH bounces off most of these (except transparent stuff). So there's less noise and signal degradation in the visible light spectrum.

RE: Another problem
By MrTeal on 10/7/2008 10:31:03 PM , Rating: 3
Honestly, I think you have it backwards. LOS is much less important in long-wavelength communications, because they pass through materials better and suffer much less atmospheric extinction. The higher the frequency, the more important line of sight becomes. Reflections are generally not a good thing; they might help in some trivial applications like a TV remote, but in any higher-bandwidth application, things like multipath can cause huge issues.

RE: Another problem
By BWAnaheim on 10/8/2008 9:22:04 AM , Rating: 2
Good response. Not only would reflections cause multi-path problems from your light/network source to your networked device, but if you notice the picture, the room has a line of desks. Now, the reflections off objects on your desk will cause interference to the networked device for the person at the next desk. Light scatters when it hits objects through which it cannot pass, which is why a single light source can illuminate more than one point. This scattering effect will cause an increased noise floor in a room with multiple users, to the detriment of throughput.

As for the flicker effect, not only do televisions flicker, but so do fluorescent bulbs. While this flicker is noticeable on bulbs with old ballasts or ballasts that are going bad, high-frequency fluorescent ballasts like those used in CFLs are not noticeable under normal operating conditions. I imagine that these new devices will flicker at about 20 MHz to achieve the throughput speeds quoted, so this flicker should be invisible to the human eye. As I was writing this response, though, I recalled that interference patterns between multiple light/network sources may become visible if the differences in flicker rates are within the range of the human eye and the difference in light intensity in those interference zones is also detectable by the human eye. Think of this as the fluorescent-light effect on a CRT television or monitor.

If the office has only one light source or a few light sources, these problems could possibly be alleviated, but then people will have problems receiving the light on their networked devices-- their eyes. I think that this solution is an interesting study, but the practical application is going to see some major implementation issues in my opinion.
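The interference-pattern worry can be made concrete: two fixtures at nominally the same flicker rate produce an intensity beat at the difference of their frequencies, which is visible only if that difference lands in the perceptible range. The 20 MHz nominal rate and the 40 Hz drift below are illustrative numbers, not measurements.

```python
# Beat frequency between two nearly-matched light/network sources:
# the visible effect oscillates at |f1 - f2|.

def beat_frequency_hz(f1_hz, f2_hz):
    return abs(f1_hz - f2_hz)

# Two fixtures nominally at 20 MHz, offset by 40 Hz of clock drift:
beat = beat_frequency_hz(20_000_000, 20_000_040)
assert beat == 40  # a 40 Hz beat sits near the range the eye can detect
```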

RE: Another problem
By danrien on 10/7/2008 2:47:24 PM , Rating: 2
LEDs probably radiate at a slightly different spectrum than normal lighting. Also, the sun flickers at a random rate (and never drops to zero), so that would be another way for a detector to differentiate it from normal light. Also, I have seen similar stuff to this in one of my EE classes: using an LED light to measure jump height, since LEDs also act as light detectors beyond just emitting light. I just want to know what happens when a bank robber walks into a bank with one of these systems equipped and starts turning his flashlight on and off.

RE: Another problem
By William Gaatjes on 10/7/2008 3:17:29 PM , Rating: 2
One way to solve it could be to use LEDs with different wavelengths for the transmitting and the receiving side.

Or specify a certain wavelength for a certain device.
There is enough room there for many devices. It is the same idea as with fibre-optic cable, where a certain number of fixed wavelengths are used for multiple data streams. Each stream has its own "colour", so to speak.
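The fibre-optic analogy can be sketched as a simple channel plan: slice a band of the visible spectrum into narrow wavelength channels and assign one per device. The band edges and channel count below are made-up illustrative numbers, not a proposed standard.

```python
# Hypothetical wavelength-division channel plan for visible light.

def channel_plan(start_nm, stop_nm, n_channels):
    """Split [start_nm, stop_nm] into equal-width wavelength channels."""
    width = (stop_nm - start_nm) / n_channels
    return [(start_nm + i * width, start_nm + (i + 1) * width)
            for i in range(n_channels)]

# 6 channels (3 devices x TX + RX) across an assumed 450-650 nm band:
plan = channel_plan(450.0, 650.0, 6)
assert len(plan) == 6
assert abs(plan[-1][1] - 650.0) < 1e-6  # plan covers the whole band
```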

RE: Another problem
By mindless1 on 10/7/2008 4:32:45 PM , Rating: 2
It's unlikely that it will depend on wavelengths. Even a random batch of LEDs varies in wavelength enough that some consistency has to be enforced to maintain a pleasant hue, which is necessary since they're talking about leveraging visible room lighting for the implementation.

Ultimately the solution will be high-frequency flickering, yet this requires a more complex power supply, so it drives up costs. Right now, one of the best features of LED lighting, besides the reduction in power and maintenance costs, is the versatility in power sources: basically, pick among many different designs to fit the application, so long as the current is kept where it should be.

RE: Another problem
By William Gaatjes on 10/7/2008 4:52:20 PM , Rating: 3
Well, of course the data is hidden in the modulation of the light. But you can use different wavelengths for TX and RX purposes. These wavelengths are of course not single numbers but two narrow bands of the used spectrum. This would allow for deviation in production processes. You could use more small bands for multiple devices. Three devices would equal six narrow bands: 3 * (TX + RX), for example.

RE: Another problem
By mindless1 on 10/9/2008 12:50:01 AM , Rating: 2
No, you can't use different wavelengths for TX and RX if the goal is using the same LEDs for room lighting, unless you mean including two LEDs of different specific wavelengths in the same room-lighting fixture. But even then, their wavelengths would be too similar to be effective if they were both part of the room-lighting benefit.

You can't just divide up narrow bands when LEDs themselves deviate more than this in average production. It would require hand-picking and testing each LED, which drives up costs a lot more in an era when widespread adoption of LED lighting requires reducing rather than increasing cost.

Someday it will be more viable. It just won't be within the next few years for cost-conscious average consumers.

RE: Another problem
By William Gaatjes on 10/10/2008 5:18:41 AM , Rating: 2
When I look into several datasheets, LEDs are very consistent when it comes to the wavelengths produced. I do not think that is the issue. And assembling multiple LEDs into a light has to be done anyway. Natural light consists of a very wide spectrum. Ever used an LED light on different materials? You might not see the colour that is actually there because of the narrow spectrum of LEDs; even white LEDs have this problem, although they are getting better with every generation. Besides, white LEDs are already a combination of red, green and blue LEDs. That's already three separate wide bandwidths.

One LED cannot produce the entire spectrum of natural light, since the radiation produced by the LED depends on the bandgap of the materials used. And we want natural-looking light. Trust me on that.

Every element has its own colour. By mixing elements and other advanced tricks you can make most colours, but not a wide spectrum. So in the end you can use different spectrums for TX and RX, because an LED lamp that produces a spectrum similar to, for example, the good old Edison lamp with its glowing filament needs multiple LEDs, or an LED built up from multiple emitting materials. Of course, the handheld devices themselves don't need "natural light" TX LEDs, just LEDs in their specified spectrum.

RE: Another problem
By William Gaatjes on 10/10/2008 6:58:44 AM , Rating: 2
I must mention that most white LEDs are blue LEDs with a phosphor coating. The phosphor coating is excited by the radiation of the blue LED, and the phosphor itself emits a lower part of the spectrum. Here the improvement must be found in the coating, since these LEDs are still a bit blue.

RE: Another problem
By mindless1 on 10/12/2008 2:00:59 AM , Rating: 2
They are fully able to coat to reach a less blue hue, but it incurs a loss of efficiency. Ultimately we're talking about consumers, so some studies should be done to decide the most blue light they are willing to accept, and build to that standard.

RE: Another problem
By mindless1 on 10/12/2008 1:59:22 AM , Rating: 2
No, most room-lighting LEDs are binned for their wavelength; any comprehensive datasheet should mention this binning.

Yes, they have to be assembled into one light, but that is why you have to know and pick a bin to achieve some particular wavelength.

White LEDs are not already a combination of red, green and blue; they are blue LEDs with a coating that results in somewhat nearer "pure" white. The further away from blue it is, the less efficient the LED is at producing light.

You can't use very different spectrums, because you then defeat the whole point, which was that the *light* used was visible light within a desired room-lighting spectrum; otherwise the power is wasted and they might as well have used LEDs separate from the room-light LEDs.

It is possible to have TX and RX offset enough in spectrum to accomplish this, if the RX has a long enough on-cycle time to offset the TX LEDs. But adding to the complexity in this manner seems unproductive, since the ultimate limit in range will be the client devices, which are not using their TX/RX LEDs at anywhere near the power level of the mains-powered lighting, so for practical purposes the benefit is lost versus using an equivalent LED TX/RX pair as the hosts use. IOW, separate optical networking modules instead of integration into room lighting.

RE: Another problem
By William Gaatjes on 10/12/2008 7:49:43 AM , Rating: 2
Picking and binning is no issue. I have explained how LEDs produce their radiation. This is pretty constant.

And I already explained some LEDs are blue LEDs with a phosphor coating. These LEDs give unnatural lighting that is not pleasant to be in all day. Ask your fish, for example, what good lighting does for them. Visible light is a big spectrum.

RE: Another problem
By mindless1 on 10/15/2008 4:59:20 AM , Rating: 2
Picking and binning is CRUCIAL, because we're talking about wavelengths that result in the desired ambient light hue by leveraging the light produced to also be a comm means.

You can't just ignore wavelength; the whole point was that these wavelengths are power used to light the room desirably, not just picked from the whole spectrum.

Visible light is NOT a big spectrum when the goal is a desirable hue. The spectrums have to be very close, or else you will have a pulsating effect noticeable to the human eye.

RE: Another problem
By William Gaatjes on 10/16/2008 11:37:12 AM , Rating: 2

The modulation frequency is much too high for the human eye to notice. If we were able to pick up light modulations in excess of 50 Hz, for example, television would never have worked.

RE: Another problem
By William Gaatjes on 10/7/2008 5:04:55 PM , Rating: 2
And indeed, the current through the LED is important. The best LED power supplies are current-regulating, not voltage-regulating. The voltage over the LED is fairly constant.

RE: Another problem
By mindless1 on 10/9/2008 12:56:03 AM , Rating: 2
Any LED power supply has to be current-limiting or it's defective. You could hand-match one constant-voltage PSU to one hand-picked LED with a tested voltage:current ratio, but this could never be a cost-effective manufacturing strategy.

There would necessarily have to be processing of what the data rate vs. frequency result was in total light output, so the voltage increased as the frequency changed to transmit data. It is attainable, no question about that, but the issue is doing this cost-effectively in *every* light module installed, when the present disincentive in deploying LED lighting is already the (for most uses) excessive cost per lighting module.

Actually, the voltage over the LED will not be constant within the idea of data TX/RX. It is necessarily going to rise above and drop below an emitter threshold at the frequency needed to achieve the desired data rate; at a minimum, the voltage will cross a forward-voltage threshold in the kHz range, and probably higher considering their claims of data throughput rate.

RE: Another problem
By William Gaatjes on 10/10/2008 5:36:56 AM , Rating: 2
LEDs are current-driven devices. To get the best results you need a current-controlled power supply. This means that the voltage varies when the load (the LED) varies because of temperature fluctuations. Good old Ohm's law: U = I * R.
No hand-picking there, just a good LED regulator IC.

The LEDs will just be switched on and off at a high rate, of course.
With the data speeds mentioned in the article, I would say it would be megahertz. We have optocouplers that can switch that fast. But I think the switching frequencies of the LEDs will be much lower, and multiple LEDs, each with its own part of the spectrum, will be used to send data in parallel.

When there is no data to be sent, the LEDs emit their colour at a controlled current set point.

RE: Another problem
By mindless1 on 10/12/2008 2:05:09 AM , Rating: 2
Actually no, the voltage doesn't vary; it is current-controlled, and the heatsink is a constant, so the temperature should not vary significantly. You could make it more elaborate in an outdoor installation; in some cases that might be reasonable, but for practical, economical design choices, it won't happen.

They will vary voltage to some extent, to achieve a median brightness value during data TX/RX, i.e. roughly the same average current. That much will be necessary, as consumers won't want lights that aren't at constant intensity. The practical applications will depend on budgeting the design; people won't pay a lot for built-in optical networking in every light fixture when a WiFi router that covers several rooms can be had for $10 total, and the client adapters are equally inexpensive.

RE: Another problem
By William Gaatjes on 10/12/2008 7:51:53 AM , Rating: 2
Please do some reading up on the specification of diode and Light Emitting Diodes. Also, please look at Ohm's Law.

RE: Another problem
By mindless1 on 10/15/2008 4:55:10 AM , Rating: 2
Diodes are not a resistive load; Ohm's law is not applicable. Please do some reading on diodes.

RE: Another problem
By William Gaatjes on 10/16/2008 11:22:38 AM , Rating: 2
Diodes are not resistors; I know they are non-linear components. It is for the sake of simplicity that I used Ohm's law as an example. I don't have to read; I know the characteristics of diodes from memory. I see it simply: to make an LED that has a spectrum similar to an ordinary bulb, or, even better, daylight, you can pick and bin all you like, but an LED lamp built from multiple LEDs is easy to calibrate for the desired colour temperature, because of the way LEDs produce light.

RE: Another problem
By William Gaatjes on 10/12/2008 10:48:59 AM , Rating: 2
There is something called thermal resistance. The junction temperature of these LEDs is much higher than the temperature of the heatsink. For a common example, CPU heatsinks do not reflect the actual temperature of the hotspots present in the CPU die.

And when the material gets hotter, more current starts to flow. That's why LEDs are current-driven devices. And the more power you draw, the more serious it gets.

RE: Another problem
By mindless1 on 10/15/2008 4:56:02 AM , Rating: 2
I never claimed otherwise, but it is not linear and your basic statement of it has no direct bearing on the conversation.

RE: Another problem
By William Gaatjes on 10/16/2008 11:43:05 AM , Rating: 2
Actually no, the voltage doesn't vary; it is current-controlled, and the heatsink is a constant, so the temperature should not vary significantly. You could make it more elaborate in an outdoor installation; in some cases that might be reasonable, but for practical, economical design choices, it won't happen.

Read your own post please as the quoted section comes from your post.

Please read up on current sources.

Also, when looking at the I/U graph of a standard diode, you will notice that small forward-voltage changes produce large current changes.
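That steepness follows from the ideal (Shockley) diode equation, I = I_s * (exp(V / (n * V_T)) - 1). The saturation current and ideality factor below are assumed illustrative values, not figures from any datasheet in this thread.

```python
import math

# Ideal diode equation with illustrative parameters:
I_S = 1e-12    # saturation current in A (assumed)
N = 1.0        # ideality factor (assumed)
V_T = 0.02585  # thermal voltage at ~300 K, in V

def diode_current(v_forward):
    """Forward current for a given forward voltage."""
    return I_S * (math.exp(v_forward / (N * V_T)) - 1)

# A 60 mV increase in forward voltage multiplies the current ~10x:
ratio = diode_current(0.66) / diode_current(0.60)
assert 9 < ratio < 11
```

This exponential I/U relationship is exactly why LED supplies regulate current rather than voltage.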

flys!!
By elessar1 on 10/7/2008 1:39:43 PM , Rating: 1
I don't want any insects getting near my luminous data!!!

Think of those annoying mosquitoes getting in the way of your pron while downloading!!! Eeek!!!

And... the fact that we don't see it flickering doesn't mean it won't give you a headache...


RE: flys!!
By ChronoReverse on 10/7/2008 1:42:47 PM , Rating: 2
I hope you're not using an LCD then. The flickering of that backlight should be much lower than what I suspect this system would have to use to provide the kind of bandwidth they're thinking of.

And if you're using a CRT...

RE: flys!!
By Spivonious on 10/7/2008 2:19:33 PM , Rating: 1
My CRT refreshes at 120Hz. So does my work LCD's backlight. I'm good.

RE: flys!!
By ChronoReverse on 10/7/2008 2:30:22 PM , Rating: 3
That's the point, these LEDs would have to flash at rates FAR higher than 120Hz to get 1Mbps. There's simply no way to see flicker.

RE: flys!!
By tastyratz on 10/7/2008 3:57:53 PM , Rating: 2
so we speculate and assume here...

It's also possible that this uses several wavelengths to compensate for low Hz. US military testing shows the human eye can detect over 200 Hz in some situations. If it was able to instantly flicker on and off, we very well may be capable of seeing it... or it might flicker just enough to give us a headache. I can personally see a fluorescent light blink.

Does anyone know about what frequency we are currently up to for led light? What is the current limitation for a detectable on/off flicker via led?

Also, does it diminish as it increases in brightness/power? If it was to be used for solid state lighting of a room it will need to be measurably bright.

RE: flys!!
By mindless1 on 10/7/2008 5:00:32 PM , Rating: 2
Even a garden-variety lighting LED can flicker faster than the human eye can detect; I'm not sure of the ceiling, but it's well above 200 Hz.

As for what frequency we are currently up to, I don't know if anyone has been trying to reach ultra-high frequencies yet for room lighting (except probably those this article is about and similar researchers outside of the LED industry); generally the frequency used is one of economy in designing the power supply, in the kHz range.

Yes, it diminishes as brightness/power per die increases, but for practical purposes the increase in frequency could still result in the same average current; the pulse current would simply be a threshold above what the average current would've been. That is already what often happens if you drive an LED with a switching supply for efficiency instead of a resistive current-limiting supply, but to achieve the lower threshold for digital communication it seems they'd need to minimize capacitance between the supply output and the LED.
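The same-average-current point can be sketched numerically: with on-off pulsing at duty cycle d, preserving the mean brightness requires a peak current of roughly I_avg / d. The 350 mA figure below is an assumed typical drive current, not from the article.

```python
# Keeping average brightness constant while pulsing the LED for data:

def peak_current_ma(target_avg_ma, duty_cycle):
    """Peak pulse current needed to preserve a target average current."""
    return target_avg_ma / duty_cycle

# A lamp normally run at 350 mA DC, pulsed at 50% duty during data TX,
# needs 700 mA peaks to look equally bright:
assert peak_current_ma(350, 0.5) == 700
```

The catch the poster alludes to is that the LED must actually tolerate those higher peak currents, which constrains the power-supply design.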

RE: flys!!
By MrPoletski on 10/8/2008 1:19:15 PM , Rating: 2
Megahertz dude. They are the same family as transistors.

RE: flys!!
By mindless1 on 10/9/2008 1:06:14 AM , Rating: 2
Same family isn't quite the same. Remember that, unlike transistors, visible white light starts out blue; there is a rise and fall time involved longer than that of transistors. We're not just talking about attainable frequency, we're talking about attaining it within the tolerances needed for a digital threshold value of 0/1 consistently.

RE: flys!!
By MrPoletski on 10/8/2008 1:22:31 PM , Rating: 2
Does anyone know about what frequency we are currently up to for led light? What is the current limitation for a detectable on/off flicker via led?

Dude, what do you think runs gigabit fibre ethernet?

RE: flys!!
By BarkHumbug on 10/9/2008 3:24:44 AM , Rating: 2
+1! (already posted...)

RE: flys!!
By mindless1 on 10/7/2008 4:38:13 PM , Rating: 2
It is not likely your work LCD's backlight refreshes at 120 Hz. Perhaps the data (the screen image) refreshes at that, but an LCD backlight's flicker rate depends on the driver board, typically operating in the kHz or higher range.

RE: flys!!
By Spivonious on 10/8/2008 9:38:27 AM , Rating: 1
Pretty sure it's a fluorescent backlight, which Google told me normally "flickers" at 120 Hz. It's a Dell 1708FP, if someone knows the actual flicker rate of it.

The screen data refreshes at 60Hz.

RE: flys!!
By mindless1 on 10/9/2008 1:01:09 AM , Rating: 2
Google is telling you about standard fluorescent area lighting running off 60Hz AC mains plus a ballast, not the switching driver boards running off a stepped-up boost PSU in laptops.

Data refresh rate has nothing to do with CCFL refresh rate. Normally the CCFL driver board uses a KHz-range frequency because this allows for a much smaller transformer and lower cost, and as we'd mentioned already, going too low in CCFL drive rate would cause noticeable flicker.

Perhaps I don't see the Light....
By Koder on 10/7/2008 2:16:42 PM , Rating: 4
But I'm not convinced that we'll see much in the way of cost efficiencies related to this technology.

Unless you have an LED on in every room, the network would be limited to the room you are in. Yes, RF signals require more power, but they also have the ghost-like benefit of traveling through walls. (Boo!)

Let's face it: who would really want to flick on the light switch every time they walked into a different room with their laptop, especially in broad daylight?

RE: Perhaps I don't see the Light....
By FITCamaro on 10/7/2008 2:44:26 PM , Rating: 2
Actually that's an excellent point.

RE: Perhaps I don't see the Light....
By ChronoReverse on 10/7/2008 3:40:34 PM , Rating: 2
If we're going to be this high-tech then naturally the lights in the room (which would be all LEDs lights set as access points) would turn on as you walk into the room ;)

RE: Perhaps I don't see the Light....
By Koder on 10/7/2008 3:53:54 PM , Rating: 2
If we're going to be this high-tech then naturally the lights in the room (which would be all LEDs lights set as access points) would turn on as you walk into the room ;)

Good point, however this would only add to the inefficiency of the system.

To have the LED lights automatically turn on, you would have to have motion sensors in every room that are "constantly on" - thus drawing a current ...which equals higher power consumption.

Ironically enough, if motion sensors were included as "part of the system", they would be emitting IR waves - which is the same thing your trusty ol' wireless router does.

How's that for a zero-sum equation? ;)

By ChronoReverse on 10/7/2008 7:20:42 PM , Rating: 2
Because motion sensors draw large amounts of current and put a heavy strain on infrastructure.

By omnicronx on 10/8/2008 1:14:43 PM , Rating: 2
To have the LED lights automatically turn on, you would have to have motion sensors in every room that are "constantly on" - thus drawing a current ...which equals higher power consumption.
Do the math: a motion sensor does not use more than 1W of power when idle (realistically it probably draws no more than 500 milliwatts, i.e. half a watt, and probably less than that), so even if it actually consumed an entire watt, we would be talking 5-7 cents a month on your energy bill.
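The "5-7 cents a month" figure checks out. A sketch, assuming a roughly $0.10/kWh residential electricity rate (an assumption; the comment doesn't state a rate):

```python
# Sketch verifying the monthly cost of a continuously idle motion sensor.
# Rate of $0.10/kWh is an assumed round number for a US residential bill.

def monthly_cost_usd(watts: float, rate_per_kwh: float = 0.10,
                     hours: float = 24 * 30) -> float:
    """Cost of running a constant load of `watts` for one 30-day month."""
    kwh = watts * hours / 1000.0
    return kwh * rate_per_kwh

print(round(monthly_cost_usd(1.0), 3))  # 0.072 -> about 7 cents a month
print(round(monthly_cost_usd(0.5), 3))  # 0.036 -> under 4 cents a month
```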

Ironically enough, if motion sensors were included as "part of the system", they would be emitting IR waves - which is the same thing your trusty ol' wireless router does.
Neither emits IR waves. Wireless emits radio waves, and visible light emits just that, visible light. All three are forms of light at different wavelengths and frequencies.
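The "different wavelengths and frequencies" point can be made concrete with f = c / wavelength. A sketch; the wavelengths below are representative examples, not exact band edges:

```python
# Sketch: radio, IR, and visible light are all EM radiation; only the
# frequency differs. f = c / wavelength.

C = 299_792_458  # speed of light in vacuum, m/s

bands = {
    "Wi-Fi radio (12.5 cm)":  0.125,    # ~2.4 GHz
    "IR remote (940 nm)":     940e-9,   # ~320 THz
    "visible green (550 nm)": 550e-9,   # ~545 THz
}
for name, wavelength_m in bands.items():
    print(f"{name}: {C / wavelength_m:.3g} Hz")
```

Visible light sits about five orders of magnitude higher in frequency than the 2.4 GHz band a wireless router uses.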

By BarkHumbug on 10/9/2008 3:45:19 AM , Rating: 2
they would be emitting IR waves - which is the same thing your trusty ol' wireless router does.

Actually, I have an old pair of wireless headphones that uses IR for signal transmission. And then there's that IR-port on old laptops which you could use to connect to other laptops and even some mobile phones, but they really had to be right next to each other.

Wireless IR routers though? Never heard of 'em... :/

By omnicronx on 10/8/2008 1:02:43 PM , Rating: 2
But I think that was the point: radio waves travel through walls, light does not. It is much more secure because you must physically be on site to access the network. Stealing and cracking a signal from the parking lot would be a thing of the past. Also, as the article mentions, it could be a great replacement for RFIDs, since you could not merely hide a device in your pocket or backpack to intercept the signal, which recent studies have shown is a major liability for RFIDs.

Uh
By slash196 on 10/7/2008 2:26:09 PM , Rating: 2
Wi-fi is already light. It uses radio waves, which are electromagnetic waves, just like visible light. Having your networks communicate with visible light just seems like a huge nuisance and totally unnecessary.

RE: Uh
By Treckin on 10/7/2008 3:40:42 PM , Rating: 1
You, sir, are a moron. Please at least fire ONE neuron before posting. Please.

This is not Myspace.

7th grade physics knowledge FTFML

RE: Uh
By PrinceGaz on 10/7/2008 7:02:58 PM , Rating: 3
You, sir, are the moron. All EM waves (from radio to gamma) are similar in many ways to light. Their properties vary gradually with wavelength, but there are many similarities. They can also all be treated as particles (photons) instead of waves if desired.

But knowing that visible-light and radio-waves are just variations on the same thing probably isn't taught at 7th grade Physics so you can be forgiven for your ignorance, Treckin.

RE: Uh
By overzealot on 10/8/2008 1:47:47 AM , Rating: 2
First rule of internet retort: five seconds of research saves you from many hours of embarrassment.

RE: Uh
By omnicronx on 10/8/2008 12:44:06 PM , Rating: 2
Radio waves, microwaves, infrared, visible, ultraviolet, X-ray and gamma radiation are all different forms of light.

Of course the OP is stretching it a bit to say Wireless = light. Usually when someone says light, they are talking about visible light, which does have a different wavelength and frequency than radio, but they are both EM radiation.

To me saying Wireless = Light is just as informative as saying Sound = Light.. (which technically it is, but when we say sound, and light in scientific terms, the meaning is usually interpreted as visible light and sound.) A physics prof would probably shoot you down for saying it, so I will too.. although this poster is the moron, most of this stuff IS 7th grade physics.

RE: Uh
By mindless1 on 10/7/2008 5:04:05 PM , Rating: 2
It does have its drawbacks. For example, today we talk about 2.4GHz being a problem due to phones, microwaves, other wifi signals, etc., so we'll move to a band already used by everything else that creates light? Through careful control of frequency that problem can be minimized, but it could also be minimized through equivalent control around 2.4GHz.

RE: Uh
By omnicronx on 10/8/2008 12:05:20 PM , Rating: 2
Having your networks communicate with visible light just seems like a huge nuisance and totally unnecessary.
It makes perfect sense; it's a well-known fact that going too far toward either end of the spectrum can have adverse effects on the human body, i.e. gamma on one side, sub sonic on the other. There is a reason 5.8GHz wireless A is not recommended for home use, and the same reason why basking in the sun's gamma rays for a long time is never a good idea.

RE: Uh
By BarkHumbug on 10/9/2008 3:51:07 AM , Rating: 2
its a well known fact that going high or low on either end of the spectrum can have adverse effects on the human body. i.e Gamma on one side, sub sonic on the other..

Yes, you don't wanna play around too much with gamma. Unless you wanna be big and green and very, very angry... ;)

Beefed-up security my ass
By PrinceGaz on 10/7/2008 3:24:02 PM , Rating: 2
Another perk of the new design is beefed up security. Unlike RF networks, the new signal would not pass through walls or other opaque objects. This would help prevent snooping and connection theft.

Whilst it won't pass through walls and other opaque objects, sufficient visible light will most definitely pass through windows, even with curtains closed, and could be very easily snooped on from a much greater distance than current RF technologies can be, simply by using a small tripod-mounted telescope in front of the receiver.

With a decent astronomical telescope and a good vantage point, your snooping range could easily extend to a mile or more - simply aim the telescope at whichever light-source you wish to snoop on and you're in business.

Far from being more secure, light based communications are much less secure - unless the room where it is being used has no windows (or windows with light-proof shutters which are kept closed whilst in use). The same security can already be achieved with RF networks by using special paint and window coatings, and you don't need to have all your windows shuttered-closed whenever you want a secure network.

Visible-light communications being more secure is about as daft an idea as I've heard.

RE: Beefed-up security my ass
By mindless1 on 10/7/2008 5:24:43 PM , Rating: 2
Security is relative to the threat. The threat of people sitting somewhere with a light-detecting rifle-scope sensor is far smaller than that of your neighbor or some wardriver receiving the signal now. Security is also about making changes.

Once the common hacker knows how to get a wifi signal, change the signal to light and they have to start over, relearn, and buy far more expensive equipment instead of using the network adapter that came in their laptop or costs $5 online. If a window coating can prevent RF escape, why isn't it fair to say a basic window blind can prevent sufficient light escape?

Ultimately you are still correct that extending detection range through a window is a security drawback; we should see the claims in the marketing-like context within which they were made.

RE: Beefed-up security my ass
By PrinceGaz on 10/7/2008 7:30:33 PM , Rating: 2
The reason visible-light transmissions are so much more vulnerable than wifi, is that it will be much easier to spot potential hot-spots, as you can quite literally *see* them. They will also be much easier to target as you could use even a cheap telescope to focus on a very small target from a considerable distance with an extremely high gain, unlike high-gain RF antennas which in the UHF range are very bulky.

Anyone who is already going to the trouble of snooping on connections to steal information, would have no problem adapting a cheap telescope to allow them to snoop on light communications. In many ways it's easier as you can see exactly where you need to point it, and a good telescope with a spotter-scope would allow you to selectively pick targets all around your neighbourhood one after another until you found one that was useful.

Relatively long-wavelength microwave transmissions (such as those used by wifi) can be easily blocked with coatings that do not stop visible light. An extreme example is a microwave oven, where the mesh on the oven door allows you to safely look in as it cooks food, despite the magnetron inside pumping out around 1 kW of microwave radiation. The relatively simple screen on the door, along with the metal body of the rest of the oven, almost completely blocks those powerful microwave transmissions. We can easily coat windows with unobtrusive, highly effective RF shields, and even more easily coat walls, as they don't have to be transparent to visible light.

On the other hand, using visible-light for network-traffic absolutely requires that visible-light is blocked, which makes a mockery of the desire to be in a room with a window that lets you look outside. A world with light-networking would probably require a world with windowless-houses.

RE: Beefed-up security my ass
By cscpianoman on 10/7/2008 10:25:32 PM , Rating: 2
You would have to have some incredible setup for that. While you may be able to see the window with the blinds drawn, I really doubt that with today's tech you would be able to pick up a signal. There would be a significant resolution problem in trying to pick up such a scattered signal. Think of it in terms of the stars: on a clear night you can see stars fairly easily, and if you are lucky there is minimal flickering. However, add in clouds, temperature changes, and dust in the air, and any signal would be seriously degraded by the time it reaches you. I could be wrong, but I would think it pretty much impossible.

By Koder on 10/7/2008 3:02:11 PM , Rating: 2
I may be dating myself a bit here, but I seem to recall having a Timex Ironman Watch that communicated with your PC using visible light.

To do this, you simply held the watch in front of your computer monitor and selected "update". The software would then flicker the screen and info would be transferred from your PC to your watch.

By MrPoletski on 10/8/2008 1:27:26 PM , Rating: 2
channel D...

By omnicronx on 10/8/2008 1:42:01 PM , Rating: 2
Heh, those watches only worked with CRT monitors... it wasn't just visible light. I always wondered how they worked.

Not practical
By Ammohunt on 10/7/2008 1:45:01 PM , Rating: 2
Nice thought, but very impractical. Ever mess with the IR LED remote-controlled helicopters? Any light interference (sun?) just wreaks havoc with them.

RE: Not practical
By ChronoReverse on 10/7/2008 1:46:43 PM , Rating: 2
We haven't seen anything done with visible light though. There's a possibility that with the shorter wavelengths, the sensors could better distinguish between different sources of light.

I can picture it now....
By SlipDizzy on 10/7/2008 1:47:13 PM , Rating: 2
Caller - I'm not receiving emails on my laptop.

Analyst - Sir/Ma'am, there is a power outage in your area.

Caller - I know, but I'm pointing my flashlight straight at my laptop!

RE: I can picture it now....
By DeepBlue1975 on 10/7/2008 2:00:52 PM , Rating: 2
Analyst - Maybe you're not switching it on/off at the appropriate rate. We have a nice finger digital signaling training program I could offer you...

By jamesmay21 on 10/7/2008 3:03:37 PM , Rating: 2
Call me behind the times, but I've never heard of using regular LEDs for general-purpose lighting. It makes a heck of a lot of sense though.

This networking concept takes it to another level though.

Looks like some companies are actually selling LEDs to the public already:

By mindless1 on 10/7/2008 5:10:03 PM , Rating: 2
Most people don't because of the high construction cost per lumen. If you have about $100 to spend you can build something reasonable to replace the old incandescent lamp you could buy at a yard sale for $2, then over a few years recoup the cost through not buying bulbs and paying less for energy. Most people have more important budgetary decisions to make, especially when CFL bulbs are so much cheaper and thus TCO will be quite a bit lower over the same period.

Our LED lighting future depends on lowering the cost of LED manufacture because the average person doesn't think in terms of maintenance costs of replacing a household light bulb like a city would when replacing a traffic light bulb.
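The payback argument in the comment above can be put in numbers. A sketch with hypothetical figures (a 10 W LED array replacing a 60 W incandescent, 4 hours/day of use, $0.10/kWh, one $2 bulb replaced per year - none of these are from the comment itself):

```python
# Sketch: years for a $100 DIY LED lamp to break even against a $2
# incandescent, via energy savings plus avoided bulb replacements.
# All wattages, hours, and rates below are illustrative assumptions.

def years_to_break_even(led_cost, inc_cost, led_w, inc_w, hours_per_day=4,
                        rate_per_kwh=0.10, bulb_life_years=1, bulb_cost=2):
    """Years until the LED's savings cover its price premium."""
    kwh_saved_per_year = (inc_w - led_w) * hours_per_day * 365 / 1000
    savings_per_year = kwh_saved_per_year * rate_per_kwh + bulb_cost / bulb_life_years
    return (led_cost - inc_cost) / savings_per_year

print(round(years_to_break_even(100, 2, 10, 60), 1))  # 10.5
```

With these guesses the break-even stretches toward a decade, which underlines the commenter's point: at $100 per lamp, up-front cost is the barrier.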

Retarded
By Smilin on 10/8/2008 9:25:05 AM , Rating: 2
So sure, you can make it fast enough not to be visibly noticeable. 75Hz is tough to see, but say you use 200Hz instead... that's absolutely useless for data. You would have the bandwidth of something like a 300-baud modem. There is nothing to say the LEDs would even have to use the visible spectrum, so the flickering is the least of the worries.
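The flicker-rate-to-baud comparison can be sanity-checked. A sketch assuming basic on/off keying, where each light transition carries at most one bit (real modulation schemes can do better, but this is the simplest case):

```python
# Sketch: with simple on/off keying (OOK), bit rate is bounded by the
# modulation rate, so a flicker rate chosen merely to be invisible to the
# eye is hopeless for data.

def ook_bits_per_second(modulation_hz: float) -> float:
    """Upper bound on OOK throughput: one bit per symbol period."""
    return modulation_hz

for hz in (200, 1_000_000):
    print(f"{hz:>9} Hz modulation -> up to {ook_bits_per_second(hz):,.0f} bps")
```

At 200 Hz you get at most ~200 bps, roughly the 300-baud-modem territory the comment describes; useful data rates need modulation far above anything the eye could perceive anyway.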

You not only have to have enough bandwidth for bidirectional communication, you have to have enough bidirectional bandwidth for every device in the room.

Line of sight is also a huge issue. Even with scattering and reflection you simply can't use your remote from the next room over. If lights, say in the nonvisible spectrum, were placed all over a room you could come close to solving the issue in one direction (assuming nobody ever puts a device in their pocket... /eyeroll ) but you would also need sensors everywhere to solve the issue in the other direction.

There is also the whole problem of "why?". There are so many better solutions to this.

Some ideas are just plain dumb.

RE: Retarded
By MrPoletski on 10/8/2008 1:26:12 PM , Rating: 2
Seeing as the article said they would be introducing it at a megabit, do you think 200Hz is what they had in mind?

To support data transfer you need a carrier wave at least twice the frequency of the data rate. The carrier wave is the amplitude modulated light. That's a fundamental law of physics.

Some simple logic will lead you to understand the minimum frequency required to transmit a 1 megabit signal.
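Following the comment's own rule of thumb (carrier at least twice the data rate - a modulation guideline rather than a law of physics), the minimum modulation frequency for a megabit link is easy to compute. A sketch:

```python
# Sketch: minimum light-modulation frequency under the "carrier at least
# twice the data rate" rule of thumb quoted in the comment above.

def min_carrier_hz(bit_rate_bps: float, factor: float = 2.0) -> float:
    return bit_rate_bps * factor

print(f"{min_carrier_hz(1_000_000):,.0f} Hz")  # 2,000,000 Hz, i.e. 2 MHz
```

So a 1 Mbit/s link implies MHz-range switching of the LEDs, well within what LEDs can do and far beyond any visible flicker.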

Argh! Sunlight!
By chmilz on 10/7/2008 2:01:47 PM , Rating: 2
Quick! Close the blinds so I can turn a light on to use the internet!

By Segerstein on 10/7/2008 2:56:07 PM , Rating: 2
Security as a feature??? Hell, back in the times of Netscape 4 I used encrypted emails, saved only on my HDD. Now most sensitive data travels as plain text, and email is stored on ISPs' servers.

I think that Bluetooth works just fine for short distances, WiFi for a flat and a few WiFi spots for a house.

I stream radio (or VoIP) via WiFi on my Nokia N95 - and can walk from room to room, with seamless AP handover. I also sometimes browse web on N95 in my bed before I go to sleep - with lights off.

These LEDs would be utterly useless and inferior to what I have now.

By Goty on 10/7/2008 3:05:30 PM , Rating: 2
This is just a very badly conceived idea. What if I want to game online at night with the lights off?

Sniff this
By Choppedliver on 10/7/2008 3:33:00 PM , Rating: 2
Great, now I have to close curtains to keep the wardrivers from sniffing my LED Lightbulb network.

And no more naked shows for the neighbors.

By Believer on 10/7/2008 4:02:07 PM , Rating: 2
This is a great idea in the ways it's meant to be implemented. (Yeah, not everything involves you surfing pr0n and playing games in your house).

Imagine the cost savings of having this in the corporate world, where the lights ARE supposed to be on while you work (simplex issues aside). They could transfer broadcast messages to the hallway monitors this way: Steve got fired, fried chicken for dinner, etc. etc.

Then think of the life-saving potential in the car industry, when your car can respond to a danger broadcast signal from the car ahead braking.

There are some great applications for this, truly ingenious!
Try not to be so narrow minded, this won't replace your home network, it's not really intended to.

Not impressive
By mindless1 on 10/7/2008 4:27:51 PM , Rating: 2
We slowly see implementation of LED lighting today; some would call it fast, but when I go to someone's home, odds are high they have zero LED lights besides possibly a night-light or their car's tail lights.

What's slowing the adoption? The higher cost, even though the design can be fairly simple: a mere LED array plus heatsink plus switching PSU plus housing. Now here is a suggestion that we add a significant % more cost to modulate the LEDs for both TX/RX, which also limits the ranges in which the switching supply itself can operate, as well as what kind of diffusion these lights can have. Just as with a light bulb, or actually more so, since typical consumer light bulbs are often frosted, you don't want LED light cast directly within your field of vision.

In the past and now, I can get a wifi router for about $10-15 after a rebate. It typically consumes about 4W. If the increased cost and development time of wireless links via LEDs keeps people from adopting them only a small while longer, then more power is wasted before the conversion. And contrary to the overly simple idea that light transmission uses less power than radio, with radio you can have one access point for a larger area instead of a much larger number of LED access points, which cumulatively may use as much or more power.
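The one-router-versus-many-access-points comparison is simple arithmetic. A sketch; the per-room count and per-AP wattage are illustrative guesses, not figures from the article:

```python
# Sketch: cumulative power of many per-room LED access points vs. one
# RF router covering the whole home. All wattages are assumptions.

def total_power_w(units: int, watts_each: float) -> float:
    return units * watts_each

router = total_power_w(1, 4.0)    # single 4 W Wi-Fi router (comment's figure)
led_aps = total_power_w(6, 1.0)   # hypothetical: six rooms, ~1 W per LED AP

print(f"router: {router} W, LED access points: {led_aps} W")
```

Under these guesses the LED access points already exceed the single router, which is the commenter's point: per-link efficiency doesn't guarantee whole-home efficiency.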

I'm not against the use of LED access points as separate modules independent of area lighting; that seems the best solution, as the redundancy of having an extra LED could be a penalty of only a few dozen mA. But remember that this can still create visible light if that alternative is chosen, so it still can be part of the room lighting power budget, in a way that does not slow development of mainstream room area lighting nor drive its cost up. And remember that the LED access points will always be checking for communication even when there's nothing around to communicate with. They could operate in a low-power sleep mode, but you still have some active electronics using power where they otherwise wouldn't.

Someday this tech will work well and be a good idea; there are the positive aspects the article mentioned, and more that I left out above. But today we need to forget about getting fancy with LED lighting and focus on just getting its cost down to the point where the average Joe starts using it, let alone replaces enough fixtures that the hypothetical power savings are actually realized.

By winterspan on 10/7/2008 6:52:54 PM , Rating: 2
Putting aside the obvious practical concerns like interference from other lighting, having to have receivers on the lights/walls, and needing all new devices with LED transmitters and photodiode receivers, what are the possible benefits of this versus RF networks like WiFi/Bluetooth/Zigbee/NFC?

- They mention power efficiency, but that would depend on how it works, the range of the lights, how many you would need, etc.
- Throughput? Doesn't sound like it.
- Cost? Not sure.

Don't get me wrong, I think "free space optics" using LEDs instead of lasers is interesting technology, but I'm unsure of the feasibility of this application.

Rehashing old tech?
By piroroadkill on 10/8/2008 10:21:04 AM , Rating: 2
This sounds exactly like IR data transmission we already have. Sounds pointless.

By omnicronx on 10/8/2008 11:47:50 AM , Rating: 2
Kids, how many times must I tell you: leave the lights ON when you leave the room! I am trying to do some work here!!!

why not use ethernet cable?
By ashtonmartin on 10/9/2008 4:20:03 AM , Rating: 2
If security was a concern, why not use Ethernet cable like back in the old days? It's faster and more secure.
