University of Florida researchers Jack DiGiovanna, left, and Justin Sanchez stand next to their test bed. Rats were hooked up to the learning computer, which worked with them to control the prosthetic. Over time they performed increasingly difficult tasks better and better, receiving rewards.  (Source: University of Florida)

An artistic rendering of the system from the researchers. The system introduces a computer to also communicate with the mechanical device and assimilate and adapt to patterns.  (Source: University of Florida)
New device could be used to control artificial limbs and far more

DailyTech previously reported on a series of scientific studies in which a new kind of brain electrode implant device, using microelectromechanical systems (MEMS) for positioning, was able to create a highly in-tune interface between a mechanical arm and motor regions of the cerebral cortex.  It was discovered that monkeys acquired natural movements when using the arm, developing complex maneuvers never before seen in prosthetics.

Now the University of Florida has devised a similar device that is either amazing or amazingly creepy -- a learning brain interface, to aid in prosthetics control.  The project, a joint effort by the UF College of Medicine and School of Engineering, set out to make a system that added a learning computer between the brain's output and the mechanical inputs.  This computer would monitor the actions and over time learn to help the user perform basic actions easily, as the human brain does with muscle memory.

In the new system, the brain does not have exclusive control over the limb; the computer shares control as well.  The researchers used an extremely sophisticated computer to handle the complex processing required.  The complete system was detailed in the journal IEEE Transactions on Biomedical Engineering.

Justin C. Sanchez, Ph.D., a UF assistant professor of pediatric neurology and the study’s senior author, says, "In the grand scheme of brain-machine interfaces, this is a complete paradigm change.  This idea opens up all kinds of possibilities for how we interact with devices. It’s not just about giving instructions but about those devices assisting us in a common goal. You know the goal, the computer knows the goal and you work together to solve the task."

In the past, the goal was to develop an implantable computer chip to control prosthetic limbs.  The algorithms to accomplish this might be complex, but they were static -- they did not adapt.  Sanchez says the new machine can assimilate information over time to produce superior actions.

He states, "The status quo of brain-machine interfaces that are out there have static and fixed decoding algorithms, which assume a person thinks one way for all time.  We learn throughout our lives and come into different scenarios, so you need to develop a paradigm that allows interaction and growth."

The device was actually constructed and tested.  Lab rats were implanted with electrodes hooked up to the learning computer.  Their goal was to move a robotic arm to hit a target, a task normally too complex for the rat brain.  When they hit the target, they were rewarded with a drop of water.

Over time, the rats learned which brain signals accomplished the task best.  The computer, whose goal was also to score the most points and win the rats their water, likewise learned better ways to process the signals into the correct action.  Even as the difficulty was increased, the rats' proficiency continued to rise.
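The paper's actual decoding algorithm is not described in the article, but the co-adaptation it reports can be illustrated with a minimal reward-driven learner: a decoder that maps simulated neural signals to discrete arm actions and strengthens or weakens its mapping based only on whether the trial earned a reward. Everything here (feature counts, the reward-modulated update, the simulated "rat") is a hypothetical sketch, not the researchers' method.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 8        # simulated neural features recorded per trial
N_ACTIONS = 4         # discrete robotic-arm movements
LEARNING_RATE = 0.5
EPSILON = 0.1         # exploration rate

# Hypothetical "rat": each intended target evokes a characteristic
# noisy neural firing pattern.
prototypes = rng.normal(size=(N_ACTIONS, N_FEATURES))

def neural_signal(target):
    """Noisy pattern the electrodes would record for this intent."""
    return prototypes[target] + 0.3 * rng.normal(size=N_FEATURES)

# Decoder weights start at zero: initially the computer knows nothing
# about how this brain encodes movement.
W = np.zeros((N_ACTIONS, N_FEATURES))

def run_trial(W):
    target = rng.integers(N_ACTIONS)
    x = neural_signal(target)
    # Epsilon-greedy: usually pick the decoder's best guess,
    # occasionally explore a random action.
    if rng.random() < EPSILON:
        action = rng.integers(N_ACTIONS)
    else:
        action = int(np.argmax(W @ x))
    reward = 1.0 if action == target else 0.0   # the "drop of water"
    # Reward-modulated update: pull the chosen action's weights toward
    # the observed signal on success, push them away on failure.
    W[action] += LEARNING_RATE * (reward - 0.5) * x
    return reward

rewards = [run_trial(W) for _ in range(2000)]
early = float(np.mean(rewards[:200]))
late = float(np.mean(rewards[-200:]))
print(f"early success rate: {early:.2f}, late success rate: {late:.2f}")
```

Running the simulation shows the same qualitative effect the article describes: the decoder starts near chance and its success rate climbs as the reward signal shapes its weights, with neither side of the interface given a fixed, static mapping.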

"We think this dialogue with a goal is how we can make these systems evolve over time," explains Sanchez. "We want these devices to grow with the user. (Also) we want users to be able to experience new scenarios and be able to control the device."

Dawn Taylor, Ph.D., an assistant professor of biomedical engineering at Case Western Reserve University, was impressed with the study; she believes that if rats can accomplish such tasks with their simple brains, a human or primate could do much more.  She states, "It’s a clear demonstration of a methodology that will work in situations when other implementations would fall apart."

The device could be applied not only to limbs but also to a broad array of developing brain-interfaced mechanical systems and electronics.  For example, the Army is currently developing brain-interfaced binoculars that help soldiers scan for enemy targets.

A learning machine brain interface certainly seems like a wise idea, but it remains to be seen how the public will react to such a device.  There have been recent concerns about increasingly complex artificial intelligences crossing the line into aggression.  At a recent military summit, the provocative question of whether a war robot could be found guilty of committing a war crime was raised.  Still, despite the possible long-term risks, the benefits of this kind of research to the disabled seem to clearly take precedence.

Sanchez worked on the project with engineering Professors Jose Principe, Ph.D., and Jose Fortes, Ph.D., and engineering doctoral students Jack DiGiovanna and Babak Mahmoudi.  The research was funded by the National Science Foundation, the Children’s Miracle Network and the UF Alumni Association.
