Purdue engineers see role for gesture recognition in robotic surgery
West Lafayette, IN--Surgeons of the future might use a system that recognizes hand gestures as commands to control a robotic scrub nurse or tell a computer to display medical images of the patient during an operation. Both the gesture recognition and robotic nurse innovations might help to reduce the length of surgeries and the potential for infection, said Juan Pablo Wachs, an assistant professor of industrial engineering at Purdue University.
The hand-gesture recognition system uses a new type of camera developed by Microsoft, called Kinect, that senses three-dimensional space. The camera, built for consumer videogame consoles, can track a player's body and hands without a handheld controller.
The "vision-based hand gesture recognition" technology could have other applications, including the coordination of emergency response activities during disasters. And in a medical setting, surgeons routinely need to review medical images and records during surgery, but stepping away from the operating table and touching a keyboard and mouse can delay the surgery and increase the risk of spreading infection-causing bacteria.
Findings from the research will be detailed in a paper appearing in the February issue of Communications of the ACM, the publication of the Association for Computing Machinery. The paper, featured on the journal's cover, was written by researchers at Purdue, the Naval Postgraduate School in Monterey, CA, and Ben-Gurion University of the Negev, Israel.
"One challenge will be to develop the proper shapes of hand poses and the proper hand trajectory movements to reflect and express certain medical functions," Wachs said. "You want to use intuitive and natural gestures for the surgeon, to express medical image navigation activities, but you also need to consider cultural and physical differences between surgeons. They may have different preferences regarding what gestures they may want to use." Other challenges include providing computers with the ability to understand the context in which gestures are made and to discriminate between intended gestures versus unintended gestures.
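One common way to separate intended from unintended gestures is to require that a recognized gesture persist across several consecutive frames before acting on it. The sketch below illustrates that idea; the class name, frame count, and confidence threshold are illustrative assumptions, not details from the Purdue system.

```python
from collections import deque

class GestureGate:
    """Accept a recognized gesture only after it has been seen
    consistently for several consecutive frames, rejecting brief
    (likely unintended) motions. Thresholds are illustrative."""

    def __init__(self, min_frames=5, min_confidence=0.8):
        self.min_frames = min_frames
        self.min_confidence = min_confidence
        self.history = deque(maxlen=min_frames)

    def update(self, label, confidence):
        """Feed one per-frame classifier output (label, confidence);
        return the confirmed gesture label, or None if no gesture
        has been held long enough to count as intentional."""
        # Low-confidence frames break the streak.
        self.history.append(label if confidence >= self.min_confidence else None)
        if len(self.history) == self.min_frames and len(set(self.history)) == 1:
            return self.history[0]
        return None
```

A surgeon briefly waving a hand past the camera would produce only one or two matching frames and be ignored, while a deliberately held pose would be confirmed.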
Wachs is developing advanced algorithms that isolate the hands and apply "anthropometry," or predicting the position of the hands based on knowledge of where the surgeon's head is. The tracking is achieved through a camera mounted over the screen used for visualization of images. Accuracy and gesture-recognition speed depend on advanced software algorithms.
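The anthropometric idea described above can be sketched in a few lines: use the detected head position to bound a region where the hands are plausibly located, then pick the depth pixel nearest the camera inside that region as a crude hand candidate. This is a minimal illustration under assumed conventions (a depth image as a 2-D array, zero meaning "no reading", and an uncalibrated pixel reach constant), not the actual Purdue algorithm.

```python
import numpy as np

def hand_search_region(head_xy, frame_shape, arm_reach_px=120):
    """Given a detected head position (x, y) in image coordinates,
    return a box (x0, y0, x1, y1) where hands are likely to appear,
    based on a simple anthropometric prior: hands operate within
    roughly one arm's reach below and to the sides of the head.
    arm_reach_px is an illustrative, uncalibrated constant."""
    h, w = frame_shape
    x, y = head_xy
    x0 = max(0, x - arm_reach_px)
    x1 = min(w, x + arm_reach_px)
    y0 = y                       # assume hands at or below head level
    y1 = min(h, y + 2 * arm_reach_px)
    return x0, y0, x1, y1

def nearest_point_in_region(depth, region):
    """Within the search region, return the (x, y) of the pixel
    closest to the camera (smallest depth); a gesturing hand is
    typically the nearest surface. Zero depth means no reading."""
    x0, y0, x1, y1 = region
    roi = depth[y0:y1, x0:x1].astype(float)
    roi[roi == 0] = np.inf       # ignore invalid depth readings
    iy, ix = np.unravel_index(np.argmin(roi), roi.shape)
    return x0 + ix, y0 + iy
```

In practice the candidate point would seed a full hand-segmentation and tracking stage, but the region prior alone already cuts the search space and helps reject motion elsewhere in the operating room.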
The work is funded by the U.S. Agency for Healthcare Research and Quality.
SOURCE: Purdue University; www.purdue.edu/newsroom/research/2011/110203WachsGestures.html
Gail Overton | Senior Editor (2004-2020)
Gail has more than 30 years of engineering, marketing, product management, and editorial experience in the photonics and optical communications industry. Before joining the staff at Laser Focus World in 2004, she held many product management and product marketing roles in the fiber-optics industry, most notably at Hughes (El Segundo, CA), GTE Labs (Waltham, MA), Corning (Corning, NY), Photon Kinetics (Beaverton, OR), and Newport Corporation (Irvine, CA). During her marketing career, Gail published articles in WDM Solutions and Sensors magazine and traveled internationally to conduct product and sales training. Gail received her BS degree in physics, with an emphasis in optics, from San Diego State University in San Diego, CA in May 1986.