



Neuromorphic community continues to advance toward chips that behave analogously to mammalian brains

Researchers have long hoped to emulate the way living organisms process and store information -- neural networks -- either in code or via simulations of true neurons on supercomputers.  But the Universität Zürich (Univ. of Zürich) and Eidgenössische Technische Hochschule Zürich (ETH Zürich) are dreaming much bigger, imagining special circuits that mimic neurons in hardware, not software, offering the speed necessary to perform incredibly complex tasks -- such as allowing an intelligent robot to recognize the objects it "sees" via retinal sensors or "hears" via cochlea-mimicking devices.

The researchers begin by using a "neuron" circuit to create masses of untrained neurons with standard very-large-scale integration (VLSI) design techniques.  They then map sensor inputs to the neurons' bias voltages, creating soft state-machine-style neural networks.
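The chip's actual bias-voltage mapping is analog and hardware-specific, but the kind of network behavior described here can be illustrated in software with a soft winner-take-all (WTA) population: recurrent self-excitation plus shared inhibition lets the unit receiving the strongest input suppress the others.  The rate-based model and every parameter below are illustrative assumptions for this sketch, not values from the paper:

```python
# Minimal rate-based soft winner-take-all (WTA) sketch.
# All constants are illustrative; the real chip sets analog bias voltages.

def step(rates, inputs, dt=0.1, self_exc=1.2, inhib=1.0):
    """One Euler step of soft-WTA dynamics: each unit excites itself
    and is suppressed by the summed activity of the whole population."""
    total = sum(rates)
    new = []
    for r, x in zip(rates, inputs):
        drive = x + self_exc * r - inhib * total
        new.append(max(0.0, r + dt * (-r + drive)))  # rectified rate
    return new

rates = [0.0, 0.0, 0.0]
# Unit 1 gets slightly stronger input; recurrence amplifies the gap.
for _ in range(200):
    rates = step(rates, inputs=[1.0, 1.1, 1.0])

winner = max(range(3), key=lambda i: rates[i])
print(winner, [round(r, 3) for r in rates])
```

After the loop, the unit with the slightly stronger input dominates while the others are driven to zero -- the "soft" selection the bias voltages implement in silicon.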

The networks that best recall the inputs -- in terms of gain, signal restoration, and multistability -- are then preserved.

The brain-like circuit, which researchers dubbed a "neuromorphic" chip, is used in a demo to perform "real-time context-dependent classification of motion patterns observed by a silicon retina."

Neuromorphic chips learn and process information faster than software models run on traditional hardware. [Image Source: INI]

Giacomo Indiveri, a professor at the Swiss universities' Institute of Neuroinformatics (INI) comments [press release], "Our goal is to emulate the properties of biological neurons and synapses directly on microchips.  The network connectivity patterns [in our latest work] closely resemble structures that are also found in mammalian brains.  Thanks to our method, neuromorphic chips can be configured for a large class of behavior modes. Our results are pivotal for the development of new brain-inspired technologies."

Retinal implant
The INI's new neuromorphic chip uses a retina-like sensor as a visual input for learning.
[Image Source: GeekInfo]

The INI's work builds on University of Sydney Electrical Engineering Professor André van Schaik's 1996 digital neuron model [abstract], which consists of transistors and capacitors attached to various voltage and current sources.

Neuron circuit
In a neuromorphic chip, neurons are modeled as digital circuits, such as the one pictured.
[Image Source: Neural Networks/Elsevier]
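In software terms, such a transistor-and-capacitor neuron behaves roughly like a leaky integrate-and-fire (LIF) unit: a "membrane" voltage integrates input current, leaks back toward rest, and emits a spike (then resets) when it crosses a threshold.  A rough software analogue, with constants chosen for illustration rather than taken from van Schaik's circuit:

```python
# Leaky integrate-and-fire analogue of a capacitor-based silicon neuron.
# The membrane "capacitor" integrates input current and leaks away;
# crossing threshold emits a spike and resets the voltage.

def simulate_lif(current, steps, dt=0.001, tau=0.02, v_thresh=1.0, v_reset=0.0):
    v = 0.0
    spikes = []
    for t in range(steps):
        v += dt * (-v + current) / tau  # dv/dt = (-v + I) / tau, Euler step
        if v >= v_thresh:
            spikes.append(t * dt)       # record spike time in seconds
            v = v_reset
    return spikes

# A constant supra-threshold current makes the neuron fire periodically;
# a sub-threshold current never produces a spike.
spikes = simulate_lif(current=1.5, steps=1000)
print(len(spikes))
```

The firing rate rises with the input current, which is how such circuits encode analog signal strength in spike trains.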

This approach (also known as "spiking neural network" hardware) is different from the analog circuit model first demonstrated by Prof. Rodney Douglas (who at the time was a professor at the University of Oxford, UK) and Misha Mahowald, a California Institute of Technology (Caltech) PhD student, back in 1991.

The advantage of the digital approach is that while it lacks the detailed reproduction of every facet of the neuron's electrical behavior, it "coarse grains" the basic operation down to a much smaller circuit, allowing large networks of neurons to be built.

International Business Machines Corp. (IBM) is but one of the large companies looking to productize neuromorphic chips.

A study on the work was published [abstract] in July's early edition of the Proceedings of the National Academy of Sciences (PNAS).  Co-authors of the work include Elisabetta Chicca, a postdoctoral researcher at the INI who has since moved to the Universität Bielefeld in Germany; INI director Prof. Rodney Douglas; Ueli Rutishauser, a postdoctoral researcher at the Max Planck Institute for Brain Research in Frankfurt, Germany; Emre Neftci, another postdoc at the INI; and Jonathan Binas, a PhD student at the INI.

Sources: ETH Zürich [press release], PNAS [abstract]



Comments



RE: Digital?
By DennisB on 7/25/2013 7:46:09 AM , Rating: 2
I'm not sure about the "new" thing here or the difference to the efforts in neuro processors for the past 3+ decades. Any idea?
Some papers from the past:
http://www.ieeeexplore.net/xpl/articleDetails.jsp?...
http://www.ieeeexplore.net/xpl/articleDetails.jsp?...


RE: Digital?
By LRonaldHubbs on 7/25/2013 8:22:59 AM , Rating: 2
This snippet from the article summarizes the novelty:
quote:
The advantage of the digital approach is that while it lacks the detailed reproduction of every facet of the neuron's electrical behavior, it "coarse grains" the basic operation down to a much smaller circuit, allowing large networks of neurons to be built.


Basically the designers of this circuit traded off functionality for density, enabling them to put many more 'neurons' onto a chip.

For reference, here is what IBM has done:
http://www.modha.org/papers/013.CICC2.pdf (see Figure 3)

and University of Manchester:
http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumb... (see Figure 1)


"I f***ing cannot play Halo 2 multiplayer. I cannot do it." -- Bungie Technical Lead Chris Butcher














Copyright 2014 DailyTech LLC.