IBM Unveils Latest Version of Spiking Neuron-Based "Brain Chips"
August 9, 2013 9:46 AM
New advanced processors from IBM mimic the mammalian brain -- let's hope they're on our side
International Business Machines, Inc. (IBM) has been working on a project co-funded by the U.S. military and various academic partners to develop a chip that "thinks" and "processes" like a fleshy life form. IBM calls the project SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics), after the structure neurons use to signal the neurons they're connected to.
I. Meet the First Full-Fledged Neural Networks Dev Kit
IBM research fellow Dr. Dharmendra S. Modha has partnered with a colleague in academia, Professor Rajit Manohar, and with a spinoff of the University of Zürich's Institute of Neuroinformatics to develop the "thinking" chip.
Fifty-three million dollars in grant money later, they have produced a complete neural network development package that aims to offer some abstraction and ease the process of developing neuronal networks that can process and respond to sensor input.
The basic offering consists of:
- A fundamental processing unit, commonly known as a spiking (digital) neuron-type circuit
- A programming model used to programmatically network the fundamental units, mapping unit interactions and mapping I/O to the network
- A library of 150+ premade corelet examples, including real-time actuator and sensor examples
- A simulator -- sort of like the "simulated hardware" used in smartphone development -- which uses a software model of the hardware unit to predict what your neural network code will do, lets you get a network basically working before loading it onto actual hardware, and saves the cost of having to buy the development hardware up front
- A bundled package containing the SDK, simulator, hardware support, and the examples library
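The "spiking (digital) neuron-type circuit" at the heart of the kit is commonly modeled in software as a leaky integrate-and-fire neuron: charge accumulates, leaks away over time, and a spike fires when a threshold is crossed. The following is a minimal sketch of that general model; the function, constants, and parameter names are illustrative assumptions and are not IBM's actual corelet language.

```python
# Minimal leaky integrate-and-fire (LIF) spiking neuron -- an illustrative
# software model of a "spiking neuron" circuit. All names and constants are
# invented for this example; this is NOT IBM's corelet API.

def lif_step(v, input_current, leak=0.9, threshold=1.0, reset=0.0):
    """Advance the neuron one time step; return (new_voltage, spiked)."""
    v = v * leak + input_current  # integrate the input; some charge leaks away
    if v >= threshold:            # membrane potential crossed the threshold
        return reset, True        # emit a spike and reset the potential
    return v, False

# Drive the neuron with a constant input and record its spike train.
v, spikes = 0.0, []
for _ in range(20):
    v, fired = lif_step(v, input_current=0.3)
    spikes.append(fired)

print(sum(spikes), "spikes in 20 steps")  # 5 spikes in 20 steps
```

Networking many such units, so that one neuron's spikes become another's input current, is what the kit's programming model abstracts over.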
One example showcased by IBM shows a neural network using a retinal sensor, which mimics the human eye. A real human brain translates over 1 terabyte of raw visual data a day into recognized shapes, tracked motions, depth perception, and self-tuning feedback to the sensors based on light conditions. By comparison, IBM's network is far simpler, accomplishing basic shape detection.
(left: retinal sensor "vision"; right: significant shape output)
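A retinal sensor of the kind described above reports events only where brightness changes, rather than streaming full frames, which is part of why it pairs naturally with spiking networks. Below is a toy sketch of that event-driven idea in plain Python; the function name, threshold, and event format are assumptions made for illustration, not the real sensor's interface.

```python
# Toy "silicon retina": emit an event per pixel whose brightness changed
# beyond a threshold between two frames. Illustrative only -- the real
# sensor's interface and event encoding differ.

def retina_events(prev_frame, frame, threshold=0.1):
    """Compare two frames; return (x, y, polarity) events where brightness moved."""
    events = []
    for y, (row_a, row_b) in enumerate(zip(prev_frame, frame)):
        for x, (a, b) in enumerate(zip(row_a, row_b)):
            if b - a > threshold:
                events.append((x, y, +1))   # brightness increased
            elif a - b > threshold:
                events.append((x, y, -1))   # brightness decreased
    return events

# A bright spot moves one pixel to the right between frames.
f0 = [[0.0, 1.0, 0.0],
      [0.0, 0.0, 0.0]]
f1 = [[0.0, 0.0, 1.0],
      [0.0, 0.0, 0.0]]
print(retina_events(f0, f1))  # [(1, 0, -1), (2, 0, 1)]
```

A static scene produces no events at all, which is exactly the kind of sparse, change-driven data a spiking shape detector consumes.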
Dr. Modha comments, "Architectures and programs are closely intertwined and a new architecture necessitates a new programming paradigm. We are working to create a FORTRAN for synaptic computing chips. While complementing today’s computers, this will bring forth a fundamentally new technological capability in terms of programming and applying emerging learning systems."
II. Project Motivations -- Killer Drones, Self-Driving Cars, and Stock Picking
The project is funded by the Defense Advanced Research Projects Agency (DARPA). DARPA typically does not directly participate in the non-military projects it funds, but more often than not it funds projects that it considers of interest to the future technological progress of the U.S. military and intelligence community.
Neural networks could in the future be used to create fully autonomous attack drones and ground-based war robots. Facial recognition neural networks could also sort through feeds from small ground-based camera drones, high-flying high-resolution aerial camera drones, and hacked cloud-connected cameras in the target state to recognize targets slated for termination. With the target's location in hand, the killer robots could then be dispatched, using lightweight neural networks to perform target identification and aiming.
With both domestic armed drone use and federal camera surveillance of citizens on the rise, similar tactics could be applied domestically against individuals the ruling administration deems "terrorists."
Drones could use multi-sensor networks to locate targets, then employ lightweight neural networks to identify them and aim the killing shot. [Image Source: Drone Wars UK]
Currently the project is in its third phase, which received $12M USD in additional funding. IBM hosts details of the past rounds, for those interested.
IBM and its academic partners are more interested in the scientific and financial implications of the project than the warfare side. Among the applications they're eyeing are self-driving vehicles and predictive stock/currency trading.
A graphical representation of IBM's neural network abstraction scheme
IBM describes the justification for this new computing paradigm, commenting:
Although they are fast and precise “number crunchers,” computers of traditional design become constrained by power and size while operating at reduced effectiveness when applied to real-time processing of the noisy, analog, voluminous, Big Data produced by the world around us. In contrast, the brain—which operates comparatively slowly and at low precision—excels at tasks such as recognizing, interpreting, and acting upon patterns, while consuming the same amount of power as a 20 watt light bulb and occupying the volume of a two-liter bottle.
IBM’s long-term goal is to build a chip system with ten billion neurons and hundred trillion synapses, while consuming merely one kilowatt of power and occupying less than two liters of volume.
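IBM's stated target invites some back-of-envelope arithmetic on the per-unit power budget, using only the figures quoted above (purely illustrative):

```python
# Back-of-envelope arithmetic on the quoted target: ten billion neurons,
# one hundred trillion synapses, a one-kilowatt power budget.
neurons  = 10_000_000_000        # ten billion
synapses = 100_000_000_000_000   # one hundred trillion
power_w  = 1_000.0               # one kilowatt

watts_per_neuron  = power_w / neurons    # 1e-07 W: ~100 nanowatts per neuron
watts_per_synapse = power_w / synapses   # 1e-11 W: ~10 picowatts per synapse
brain_ratio       = power_w / 20.0       # ~50x the brain's ~20 W draw
print(watts_per_neuron, watts_per_synapse, brain_ratio)
```

Even hitting that goal would leave the system drawing roughly fifty times the power of the biological brain it imitates, which underscores how far ahead nature's design remains.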
Neural networks are the most effective design nature has ever produced for ensuring survival through dramatic transformation of the environment in which an organism lives. They could one day help machines serve mankind, performing menial tasks and helping to cure disease. Alternatively, they could prove the most lethal combination of destructive power, obedience, and precision ever witnessed by mankind, potentially killing millions at the press of a single button.
In that regard, neural networks coupled with ubiquitous, massively parallel sensor networks are perhaps the most promising yet deadly tool produced by mankind since the fateful splitting of the atom.
RE: Run like hell
8/9/2013 10:21:17 PM
"The ruling administration" is the real problem in your scenario, not any particular technology.
Or would you feel better if they "silenced" someone using old-fashioned methods?
RE: Run like hell
8/10/2013 7:43:16 AM
It's not the technology that's the problem here.
It's the humans who program and use it.
I find it very disheartening that people project "sci-fi doomsday film" scenarios onto real life, when those movies were made by artists with a limited understanding of technology and science in the first place.
On the bright side, this could easily have numerous beneficial applications for far better and highly efficient system management and problem solving.
Computers have already been put to various tasks, such as inventing a "new internet" that's faster and more efficient.
Tons of useful applications.
RE: Run like hell
8/10/2013 2:20:00 PM
Yes and no.
For instance, the desire of the Nazis to wipe out Jewish populations in central and eastern Europe was not unique; there is a well-documented history of terrible periodic pogroms over the last two thousand years. What made the Nazis different from every other anti-Semitic group over the last few millennia is that, with mechanized warfare and industrialization, for the first time they had the technology to realize their goal.
That doesn't mean to say that I'm against cars and factories!
I'm just saying that new enabling technologies can have a downside; thinking about them before widespread implementation can probably mitigate some of the harm without, hopefully, impeding the benefits.
Copyright 2013 DailyTech LLC.