Purdue University researchers are developing a system
that uses hand gesture recognition to control robots working in hospitals.
Juan Pablo Wachs, an assistant professor of industrial engineering at Purdue
University, along with researchers from the Naval Postgraduate School in
Monterey, California, and Ben-Gurion University of the Negev, Israel, is developing
a system of robotic scrub nurses and computers that obey commands given through hand
gestures in a hospital setting.
Controlling machines with hand gestures is a concept Tom Cruise demonstrated vividly in the film "Minority Report."
The project builds on the vision-based hand gesture recognition system started years ago by
researchers at Washington Hospital Center and Ben-Gurion University, which
included Wachs when he was a research fellow and doctoral student. Now, Wachs
is working to develop and continuously improve the system at Purdue University.
The gesture recognition system consists of a camera and specific algorithms that
apply anthropometry, using the location of the surgeon's head to predict
where the surgeon's hands are likely to be. The camera, a Kinect developed by
Microsoft that maps images in 3-D, is mounted over a screen that shows the
surgeon medical images of a patient during an operation. The robotic scrub
nurse assists the surgeon during surgery, relying on these algorithms to
interpret the surgeon's commands.
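As a rough illustration of the anthropometry idea described above, the Python sketch below predicts a search region for the hands from a tracked head position and rejects hand candidates outside it. The constants and function names are hypothetical assumptions for this sketch, not details of the published system:

```python
from dataclasses import dataclass

@dataclass
class Point3D:
    x: float  # meters, in the camera's coordinate frame
    y: float
    z: float

# Illustrative anthropometric constants (assumed values, not from the
# actual system): shoulders sit roughly 0.25 m below the head, and a
# relaxed reach keeps the hands within about 0.8 m of the shoulder line.
SHOULDER_DROP_M = 0.25
MAX_REACH_M = 0.8

def hand_search_region(head: Point3D) -> tuple[Point3D, float]:
    """Predict a spherical region where the hands are likely to appear,
    centered on the estimated shoulder line below the tracked head."""
    center = Point3D(head.x, head.y - SHOULDER_DROP_M, head.z)
    return center, MAX_REACH_M

def hand_candidate_plausible(head: Point3D, candidate: Point3D) -> bool:
    """Reject hand detections (e.g. another person's hand in frame)
    that fall outside the predicted region."""
    center, radius = hand_search_region(head)
    dist = ((candidate.x - center.x) ** 2 +
            (candidate.y - center.y) ** 2 +
            (candidate.z - center.z) ** 2) ** 0.5
    return dist <= radius
```

A real implementation would calibrate such proportions per person and work from the Kinect's skeletal tracking output rather than fixed constants.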
"While it will be very difficult for a robot to
achieve the same level of performance as an experienced nurse
who has been working with the same surgeon for years, often scrub nurses have
had very limited experience with a particular surgeon, maximizing the chances
for misunderstandings, delays and sometimes mistakes in the operating
room," said Wachs. "In that case, a robotic scrub nurse could be better."
Although this technology could have huge benefits in the operating room, such as
reducing the length of surgeries, coordinating emergency response efforts and
assisting surgeons efficiently during an operation, a few
challenges still need to be overcome before the system can be used in
an actual hospital.
"One challenge will be to develop the proper shapes of hand poses and the proper hand
trajectory movements to reflect and express certain medical
functions," said Wachs. "You want to use intuitive and natural
gestures for the surgeon, to express medical image navigation activities, but
you also need to consider cultural and physical differences between surgeons.
They may have different preferences regarding what gestures they may want to use."
Other roadblocks that may temporarily hinder this technology include developing
computers that understand the meaning of certain hand gestures and can
"discriminate" between intended and unintended gestures.
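One simple way to "discriminate" intended from unintended gestures, sketched here as an assumption rather than a description of the actual system, is to accept a command only when the same gesture label is recognized with high confidence over several consecutive frames:

```python
from collections import deque
from typing import Optional

# Hypothetical thresholds for this sketch, not values from the
# published system.
CONFIDENCE_MIN = 0.9   # minimum per-frame classifier confidence
STABLE_FRAMES = 5      # frames the gesture must be held steadily

class IntentFilter:
    """Accept a gesture as an intended command only after it has been
    recognized consistently for several consecutive frames."""

    def __init__(self) -> None:
        self.recent: deque = deque(maxlen=STABLE_FRAMES)

    def update(self, label: str, confidence: float) -> Optional[str]:
        """Feed one per-frame classification; return the label once it
        has been stable long enough, otherwise None."""
        # Low-confidence frames are recorded as None, breaking the run.
        self.recent.append(label if confidence >= CONFIDENCE_MIN else None)
        if (len(self.recent) == STABLE_FRAMES
                and len(set(self.recent)) == 1
                and self.recent[0] is not None):
            return self.recent[0]
        return None
```

Under this scheme, a stray hand movement produces either low confidence or a changing label and is silently dropped, while a deliberately held gesture passes through after a fraction of a second.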
Wachs and his fellow researchers plan to make the system recognize an easy vocabulary of
simple gestures. They also want surgeons to be able to work without wearing
virtual reality gloves, and for the system to respond with an "OK"
when it understands a command. The system should also be accurate and as
low-cost as possible, and able to configure itself
quickly when placed in different environments.
"We also want to integrate voice recognition, but the biggest challenges are in
gesture recognition," said Wachs. "Much is already known about voice recognition."
This study was
published in Communications of the ACM.