


The system is modeled on the human visual system and will be applied to fully autonomous vehicles

Researchers at Yale University and New York University have developed a new supercomputer capable of navigating a car more quickly and efficiently by using a vision system modeled on human sight. 

The supercomputer is called NeuFlow, and it was created by Eugenio Culurciello of Yale's School of Engineering & Applied Science along with Yann LeCun of New York University. Culurciello developed the human-inspired hardware, while LeCun supplied the complex vision algorithms, which run the neural networks for synthetic vision applications. NeuFlow mimics the human visual system, aiming to act as quickly and efficiently as a human driver when obeying traffic laws, distinguishing objects such as trees and buildings from one another, and reacting to other drivers on the road. 
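LeCun's synthetic-vision work centers on convolutional neural networks, which sweep small learned filters across an image and build up feature maps that later layers combine to recognize objects such as roads, trees and cars. As a rough illustration only (this is not NeuFlow's code, and the filter sizes and counts here are arbitrary), a single convolutional layer might look like this in Python:

```python
# Illustrative sketch only -- not NeuFlow's actual code. A convolutional layer
# of the kind used in synthetic-vision neural networks: slide small filters
# over an image and apply a nonlinearity, producing feature maps.
import numpy as np

def conv_layer(image, filters):
    """Apply a bank of k x k filters to a grayscale image (valid convolution)."""
    k = filters.shape[1]
    h, w = image.shape
    out = np.zeros((filters.shape[0], h - k + 1, w - k + 1))
    for f, filt in enumerate(filters):
        for i in range(h - k + 1):
            for j in range(w - k + 1):
                out[f, i, j] = np.sum(image[i:i+k, j:j+k] * filt)
    return np.tanh(out)  # nonlinearity, as in classic convolutional nets

# Toy usage: 8 random 5x5 filters over a 64x64 "image"
image = np.random.rand(64, 64)
filters = np.random.randn(8, 5, 5)
feature_maps = conv_layer(image, filters)
print(feature_maps.shape)  # (8, 60, 60)
```

Dedicated hardware like NeuFlow exists precisely because these nested loops, run over megapixel images and many filter banks, are far too slow on a general-purpose processor.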

Culurciello and LeCun are looking to use this supercomputer as a way for cars to drive themselves. To do this, NeuFlow runs more than 100 billion operations per second while using only a few watts of power, less than what's required to power a cell phone. NeuFlow fits on a single chip no larger than a wallet, yet it is more efficient and powerful than full-scale computers. The system also "processes tens of megapixel images in real time." 
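To put those figures in perspective, a quick back-of-the-envelope calculation gives the chip's efficiency in operations per watt. The throughput number comes from the article; the 3-watt figure is only an assumed stand-in for "a few watts":

```python
# Back-of-the-envelope efficiency estimate. The 100 billion ops/sec figure is
# reported above; the 3 W figure is an assumption standing in for "a few watts".
ops_per_second = 100e9      # reported NeuFlow throughput
assumed_power_watts = 3.0   # assumption: "a few watts"

gops_per_watt = ops_per_second / 1e9 / assumed_power_watts
print(f"~{gops_per_watt:.0f} GOPS per watt")  # ~33 GOPS per watt
```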

"One of our first prototypes of this system is already capable of outperforming graphic processors on vision tasks," said Culurciello.

The development of fully autonomous vehicles will be a significant advancement for human convenience and safety, and that's why NeuFlow isn't the only computer-driven system out there right now. 

In 2008, DailyTech went for a spin in the Chevrolet Tahoe DARPA Challenge vehicle, a fully autonomous vehicle that won the 2007 DARPA Urban Challenge and is equipped with GPS, radar, video, laser and LIDAR sensors and inputs to recognize objects on the road. Its key sensor, a Velodyne unit with 64 sensors in a wide array, is able to collect one million bits of data per second at 10 Hz. Its logic consists of over 350,000 lines of code, and like NeuFlow it is able to obtain a 3-D view of the surrounding terrain.
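Those sensor figures break down into a per-scan budget with simple arithmetic. The numbers below are the ones quoted above; the per-sensor split is just an illustration:

```python
# Rough per-scan breakdown of the sensor figures quoted for the DARPA vehicle.
bits_per_second = 1_000_000   # reported data rate
scan_rate_hz = 10             # reported scan frequency
num_lasers = 64               # sensors in the Velodyne array

bits_per_scan = bits_per_second / scan_rate_hz
bits_per_laser_per_scan = bits_per_scan / num_lasers
print(f"{bits_per_scan:.0f} bits per 3-D scan, "
      f"{bits_per_laser_per_scan:.0f} bits per sensor per scan")
```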

But unlike the Chevrolet Tahoe DARPA Challenge vehicle, NeuFlow is not quite ready for vehicle use yet. Culurciello and LeCun are also looking to use NeuFlow in other applications, such as providing 360-degree synthetic vision for soldiers in combat and improving robot navigation in dangerous locations. 

NeuFlow was presented by Culurciello at the High Performance Embedded Computing (HPEC) workshop in Boston, Mass., on September 15. 


