Article
Gesture interaction with medical software for a neuronavigation device
Published: September 16, 2010
Objective: Neurosurgical procedures in the operating theater increasingly depend on the availability and use of information from several imaging modalities. Surgeon-machine interaction is still performed by direct contact with the navigation system, e.g. via a remote control or a touch-screen interface, which disrupts the surgeon's workflow. We propose a system that transforms real-time camera frames of hand gestures into interface commands controlling the OsiriX display, eliminating the need to physically touch any input device. This enables the neurosurgeon to control the software directly without compromising sterility requirements.
Methods: A stereo-camera setup consisting of two Unibrain Fire-I cameras is used to triangulate the position of the hand in 3D. An additional light source, preferably IR LEDs, may be used to brighten the scene. An existing prototype constructed from item system parts houses a 24" iMac; aluminum surfaces facilitate hygienic cleaning of the workstation. The camera frames are processed with a color-segmentation algorithm, and the centroid of the hand is computed. Shape classification uses the Fourier transform of the hand's contour and a nearest-neighbor method, allowing recognition of at least five different pre-defined gestures. The image-processing step is based on open-source software (OpenCV).
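The shape-classification step described above (Fourier transform of the hand's contour followed by nearest-neighbor matching) can be sketched as follows. This is an illustrative pure-NumPy version with hypothetical function names, not the actual OpenCV-based implementation; in the real pipeline the contour would come from the color-segmentation step.

```python
import numpy as np

def fourier_descriptor(contour, n_coeffs=8):
    """Fourier descriptor of a closed contour given as an (N, 2) array
    of (x, y) points. Invariant to translation, scale, rotation, and
    choice of contour start point."""
    z = contour[:, 0] + 1j * contour[:, 1]  # complex representation of the contour
    coeffs = np.fft.fft(z)
    coeffs[0] = 0.0                         # drop DC term -> translation invariance
    mags = np.abs(coeffs)                   # drop phase -> rotation/start-point invariance
    mags = mags / mags[1]                   # normalize by first harmonic -> scale invariance
    return mags[1:1 + n_coeffs]

def classify(descriptor, templates):
    """Nearest-neighbor match against pre-computed gesture templates
    (a dict mapping gesture name -> template descriptor)."""
    names = list(templates)
    dists = [np.linalg.norm(descriptor - templates[name]) for name in names]
    return names[int(np.argmin(dists))]
```

In this scheme each pre-defined gesture is represented by one (or a few) template descriptors computed offline, and recognition at runtime reduces to a distance comparison, which keeps the classification step fast enough for real-time use.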
Results: Our system is capable of real-time processing and does not impose any special requirements on the environment or the user. The detected hand position differs from the intended position by less than 2 cm in 96% of all cases and by less than 1 cm in over 50%. The correct gesture is recognized in 95% (±3%) of all cases, and handedness is not an issue. Intuitive gestures for commands like "zoom", "rotate", "slide through image series", "transpose" and "reset" are implemented.
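To make the command mapping concrete, here is a minimal sketch of how recognized gesture labels such as those listed above could be dispatched to viewer commands. The state model, step sizes, and function names are illustrative assumptions, not the actual OsiriX interface:

```python
from dataclasses import dataclass

@dataclass
class ViewerState:
    """Minimal illustrative model of the image-viewer state; the real
    system drives the OsiriX display, whose API is not shown here."""
    zoom: float = 1.0
    angle: float = 0.0      # rotation in degrees
    slice_index: int = 0    # position in the image series
    transposed: bool = False

def apply_gesture(state: ViewerState, gesture: str, amount: float = 1.0) -> ViewerState:
    """Dispatch a recognized gesture label to a viewer-state change.
    Gesture names follow the abstract; step sizes are arbitrary."""
    if gesture == "zoom":
        state.zoom *= 1.0 + 0.1 * amount           # relative zoom step
    elif gesture == "rotate":
        state.angle = (state.angle + 5.0 * amount) % 360.0
    elif gesture == "slide":
        state.slice_index += int(amount)           # slide through image series
    elif gesture == "transpose":
        state.transposed = not state.transposed
    elif gesture == "reset":
        state.zoom, state.angle = 1.0, 0.0
        state.slice_index, state.transposed = 0, False
    else:
        raise ValueError(f"unknown gesture: {gesture!r}")
    return state
```

Decoupling recognition (which yields only a label) from command dispatch is what makes such a system application-independent: the same recognizer output can drive different front ends.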
Conclusions: We present a system that enables surgeons to control the OsiriX medical visualization software by gestures. However, it is designed to be application-independent, and working versions exist for different platforms. The gesture recognition works reliably under most lighting conditions and is invariant to surgical glove color. The device can easily be extended to recognize new gestures. Currently, the system is being tested under surgical conditions in the operating theater. In addition, the system serves as the input device for a neuronavigation device under development.