Neuron Chip "Learns" to Recognize Distinct Gestures Via Retina Sensors
July 24, 2013 8:35 PM
The neuromorphic community continues to advance toward hardware that behaves analogously to mammalian brains
Researchers have long hoped to emulate the way living organisms process and store information, either in code or via simulations of true neurons on supercomputers. But researchers at the University of Zürich and the Eidgenössische Technische Hochschule Zürich (ETH Zürich) are dreaming much bigger, imagining special circuits that mimic neurons in hardware, not software, offering the speed necessary to perform incredibly complex tasks, such as allowing an intelligent robot to recognize the objects it "sees" via a silicon retina or the sounds it "hears" via similar auditory sensors.
The researchers begin by creating masses of untrained "neuron" circuits using standard very-large-scale integration (VLSI) design techniques. They then map sensor inputs to the neurons' bias voltages, creating soft state-machine-style neural networks. The resulting networks that best recall the inputs -- in terms of gain, signal restoration, and multistability -- are then preserved.
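The selection principle described here, amplifying the strongest input while suppressing the rest, resembles a soft winner-take-all network. Below is a minimal rate-based sketch of that idea in plain Python; the topology and all parameters (self-excitation, shared inhibition, time constant) are illustrative assumptions, not values from the INI chip.

```python
# Minimal rate-based soft winner-take-all (WTA) network in plain Python.
# Each unit excites itself and inhibits a shared pool; the strongest
# input is amplified (gain) and the rest are squelched (restoration).
# All weights and time constants are hypothetical illustration values.

def relu(x):
    return x if x > 0.0 else 0.0

def simulate_wta(inputs, steps=2000, dt=0.01, w_self=1.2, w_inh=0.6, tau=1.0):
    """Euler-integrate tau * dr/dt = -r + relu(I + w_self*r - w_inh*sum(r))."""
    rates = [0.0] * len(inputs)
    for _ in range(steps):
        total = sum(rates)  # shared inhibitory pool
        rates = [r + dt / tau * (-r + relu(i + w_self * r - w_inh * total))
                 for r, i in zip(rates, inputs)]
    return rates

rates = simulate_wta([1.0, 0.9, 0.2])
winner = max(range(len(rates)), key=lambda k: rates[k])  # index 0 wins
```

With self-excitation above unity, the winning unit's steady-state rate ends up larger than its raw input (a gain above one), while the losing units decay toward zero.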
The brain-like circuit, which researchers dubbed a "neuromorphic" chip, is used in a demo to perform "real-time context-dependent classification of motion patterns observed by a silicon retina."
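A silicon retina does not output frames; it emits a stream of address-events, typically carrying pixel coordinates, a timestamp, and a polarity. The toy classifier below, a hypothetical illustration rather than the researchers' method, labels coarse horizontal motion from the drift of event x-coordinates over time.

```python
# Toy classifier over a silicon-retina-style address-event stream.
# Events are assumed to be (x, y, timestamp, polarity) tuples; the
# drift threshold is an arbitrary illustration value.

def classify_motion(events, drift_thresh=0.5):
    """Label coarse horizontal motion from the mean x-drift of events."""
    events = sorted(events, key=lambda e: e[2])          # order by time
    half = len(events) // 2
    early = sum(e[0] for e in events[:half]) / half
    late = sum(e[0] for e in events[half:]) / (len(events) - half)
    drift = late - early
    if drift > drift_thresh:
        return "rightward"
    if drift < -drift_thresh:
        return "leftward"
    return "stationary"

# Synthetic stimulus sweeping left-to-right across one pixel row.
moving_right = [(x, 10, t, 1) for t, x in enumerate(range(20))]
```

A real event-driven classifier would of course work spike-by-spike rather than batching events, but the event format above conveys what "motion patterns observed by a silicon retina" look like as data.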
Neuromorphic chips learn and process information faster than software models run on traditional hardware. [Image Source: INI]
A professor at the Swiss universities'
Institute of Neuroinformatics
(INI) comments [press release], "Our goal is to emulate the properties of biological neurons and synapses directly on microchips. The network connectivity patterns [in our latest work] closely resemble structures that are also found in mammalian brains. Thanks to our method, neuromorphic chips can be configured for a large class of behavior modes. Our results are pivotal for the development of new brain-inspired technologies."
The INI's new neuromorphic chip uses a retina-like sensor as a visual input for learning.
[Image Source: GeekInfo]
The INI's work builds on University of Sydney Professor André van Schaik's digital neuron model [abstract], which consists of transistors and capacitors attached to various voltage and current sources.
In a neuromorphic chip, neurons are modeled as digital circuits, such as the one pictured.
[Image Source: Neural Networks/Elsevier]
This approach (also known as "spiking neural network" hardware) differs from the analog circuit model first demonstrated back in 1991 by Prof. Rodney Douglas (who at the time was a professor at the University of Oxford, UK) and a California Institute of Technology (CalTech) PhD student.
The advantage of the digital approach is that while it forgoes a detailed reproduction of every facet of the neuron's electrical behavior, it "coarse grains" the basic operation down to a much smaller circuit, allowing large networks of neurons to be built.
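For intuition, the kind of coarse-grained dynamics such a circuit implements can be sketched in software as a leaky integrate-and-fire (LIF) neuron; the parameters below are illustrative and not taken from the van Schaik model.

```python
# Coarse-grained leaky integrate-and-fire (LIF) neuron: integrate a
# leaky membrane voltage, emit a spike and reset at threshold.
# Parameters are illustrative, not drawn from any published circuit.

def lif_spike_times(current, t_max=0.1, dt=1e-4,
                    tau=0.02, v_thresh=1.0, v_reset=0.0):
    """Euler-integrate tau * dv/dt = -v + I; record spike times."""
    v, t, spikes = 0.0, 0.0, []
    while t < t_max:
        v += dt * (current - v) / tau
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset
        t += dt
    return spikes

weak, strong = lif_spike_times(1.5), lif_spike_times(3.0)
# A stronger input current yields more spikes in the same time window,
# while a subthreshold current produces none.
```

This handful of state variables is what lets digital spiking hardware pack many thousands of neurons onto one die, where a faithful ion-channel model would not fit.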
International Business Machines Corp. (IBM) is but one of the large companies looking to productize neuromorphic chips.
A study on the work was published [abstract] in July's early edition of the Proceedings of the National Academy of Sciences (PNAS). Co-authors of the work include Elisabetta Chicca, a postdoctoral researcher at the INI who has since moved to a university in Germany; INI director Prof. Rodney Douglas; a postdoctoral researcher at Frankfurt, Germany's Max Planck Institute for Brain Research; another postdoc at the INI; and a PhD student at the INI.
Source: ETH Zürich [press release]
Copyright 2016 DailyTech LLC.