Robots could soon possess hand gesture recognition in the emergency room  (Source: Purdue University photo/Mark Simons)
Approach was inspired by the Tom Cruise film, "Minority Report"

Purdue University researchers are developing a system that controls robots working in the hospital through hand gesture recognition.

Juan Pablo Wachs, an assistant professor of industrial engineering at Purdue University, along with researchers from the Naval Postgraduate School in Monterey, California, and Ben-Gurion University of the Negev, Israel, is developing a system of robotic scrub nurses and computers that obey hand-gesture commands in a hospital setting.

"It's a concept Tom Cruise demonstrated vividly in the film 'Minority Report,'" said Wachs. 

Research on the vision-based hand gesture recognition system began years ago at Washington Hospital Center and Ben-Gurion University, where Wachs contributed as a research fellow and doctoral student. Now, Wachs is continuing to develop and improve the system at Purdue University.

The hand gesture recognition system consists of a camera and specialized algorithms that apply anthropometry to predict the position of the surgeon's hands based on the location of the surgeon's head. The camera, a Microsoft Kinect that maps images in 3-D, is mounted over a screen that shows the surgeon medical images of a patient during an operation. The robotic scrub nurse assists the surgeon during surgery, guided by these algorithms.
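The article does not describe the algorithms in detail, but a vision-based recognizer of this kind typically matches an observed 3-D hand trajectory against stored gesture templates. The following is a minimal illustrative sketch under that assumption; the Kinect capture calls, the real classifier, and all gesture names here (`next_image`, `prev_image`, `zoom_in`) are hypothetical:

```python
import numpy as np

# Hypothetical gesture templates: each is a 3-D hand trajectory
# (a sequence of x, y, z positions) resampled to a fixed length.
TEMPLATES = {
    "next_image": np.array([[x, 0.0, 0.0] for x in np.linspace(0, 1, 10)]),
    "prev_image": np.array([[x, 0.0, 0.0] for x in np.linspace(1, 0, 10)]),
    "zoom_in":    np.array([[0.0, y, 0.0] for y in np.linspace(0, 1, 10)]),
}

def classify_gesture(trajectory):
    """Match an observed hand trajectory to the nearest template
    by mean Euclidean distance between corresponding points."""
    best_label, best_dist = None, float("inf")
    for label, template in TEMPLATES.items():
        dist = np.linalg.norm(trajectory - template, axis=1).mean()
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label, best_dist

# A noisy left-to-right swipe should match the "next_image" template.
rng = np.random.default_rng(0)
observed = TEMPLATES["next_image"] + rng.normal(0, 0.02, (10, 3))
label, dist = classify_gesture(observed)
```

A production system would use a learned classifier on depth data rather than raw template matching, but the structure — map a tracked trajectory to the closest known gesture — is the same.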

"While it will be very difficult using a robot to achieve the same level of performance as an experienced nurse who has been working with the same surgeon for years, often scrub nurses have had very limited experience with a particular surgeon, maximizing the chances for misunderstandings, delays and sometimes mistakes in the operating room," said Wachs. "In that case, a robotic scrub nurse could be better."

While this technology could have huge benefits in the operating room, such as shortening surgeries, coordinating emergency response efforts and assisting surgeons efficiently during an operation, several challenges remain before the system can be deployed in an actual hospital.

"One challenge will be to develop the proper shapes of hand poses and the proper hand trajectory movements to reflect and express certain medical functions," said Wachs. "You want to use intuitive and natural gestures for the surgeon, to express medical image navigation activities, but you also need to consider cultural and physical differences between surgeons. They may have different preferences regarding what gestures they may want to use."

Other hurdles include teaching computers to understand the meaning of specific hand gestures and to "discriminate" between intended and unintended gestures.
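One simple way to discriminate intended commands from incidental motion, offered here as an illustrative assumption rather than the researchers' actual method, is to accept a movement only if it lies close enough to a known gesture template and ignore everything else:

```python
import numpy as np

REJECT_THRESHOLD = 0.25  # hypothetical cutoff; would be tuned on real data

def is_intended(trajectory, template):
    """Treat a movement as an intended command only if its mean
    point-wise distance to a known gesture template is small;
    reaching for an instrument or idle motion is ignored."""
    dist = np.linalg.norm(trajectory - template, axis=1).mean()
    return dist < REJECT_THRESHOLD

swipe = np.array([[x, 0.0, 0.0] for x in np.linspace(0, 1, 10)])

# A deliberate swipe with small sensor noise...
deliberate = swipe + 0.01
# ...versus random incidental hand motion.
incidental = np.random.default_rng(1).uniform(-1, 1, (10, 3))
```

The threshold trades false accepts (the robot reacting to stray motion) against false rejects (the surgeon repeating a command), which is exactly the tension the researchers describe.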

Wachs and his fellow researchers plan to have the system recognize a small vocabulary of simple gestures, work without requiring surgeons to wear virtual reality gloves, and respond with an "OK" when it understands a command. The system must also be accurate and as low-cost as possible. In addition, they'd like it to configure itself quickly when placed in different environments.

"Eventually we also want to integrate voice recognition, but the biggest challenges are in gesture recognition," said Wachs. "Much is already known about voice recognition." 

This study was published in Communications of the ACM.
