Designing a wearable navigation system for image-guided cancer resection surgery

Ann Biomed Eng. 2014 Nov;42(11):2228-37. doi: 10.1007/s10439-014-1062-0. Epub 2014 Jul 1.

Abstract

A wearable surgical navigation system is developed for intraoperative imaging of surgical margins in cancer resection surgery. The system consists of an excitation light source, a monochrome CCD camera, a host computer, and a wearable headset unit operating in one of two modes: head-mounted display (HMD) and Google Glass. In the HMD mode, a CMOS camera is installed on a personal cinema system to capture the surgical scene in real time and transmit the image to the host computer through a USB port. In the Google Glass mode, a wireless connection is established between the glass and the host computer for image acquisition and data transport tasks. A software program is written in Python to call OpenCV functions for image calibration, co-registration, fusion, and display with augmented reality. The imaging performance of the surgical navigation system is characterized in a tumor-simulating phantom. Image-guided surgical resection is demonstrated in an ex vivo tissue model. Surgical margins identified by the wearable navigation system are coincident with those acquired by a standard small animal imaging system, indicating the technical feasibility of intraoperative surgical margin detection. The proposed surgical navigation system combines the sensitivity and specificity of a fluorescence imaging system with the mobility of wearable goggles. It can potentially be used by a surgeon to identify residual tumor foci and reduce the risk of recurrent disease without interfering with the regular resection procedure.
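To illustrate the Python/OpenCV pipeline described above (co-registration, fusion, and augmented-reality display), the sketch below shows one possible form of the fusion step. It is not the authors' implementation: the homography file, camera indices, colormap, and blending weights are assumptions introduced here for illustration only.

    # Minimal sketch of the co-registration and fusion step, assuming a
    # pre-calibrated 3x3 homography (H) that maps the fluorescence camera
    # frame onto the scene-camera frame. Device indices, file names, the
    # colormap, and blending weights are illustrative assumptions.
    import cv2
    import numpy as np

    H = np.load("homography.npy")      # assumed output of an offline calibration step
    scene_cam = cv2.VideoCapture(0)    # CMOS scene camera (assumed device index)
    fluo_cam = cv2.VideoCapture(1)     # monochrome fluorescence camera (assumed index)

    while True:
        ok_s, scene = scene_cam.read()
        ok_f, fluo = fluo_cam.read()
        if not (ok_s and ok_f):
            break

        # Co-register: warp the fluorescence frame into scene-camera coordinates.
        fluo_gray = cv2.cvtColor(fluo, cv2.COLOR_BGR2GRAY)
        h, w = scene.shape[:2]
        fluo_reg = cv2.warpPerspective(fluo_gray, H, (w, h))

        # Pseudo-color the fluorescence signal and fuse it with the surgical scene.
        fluo_color = cv2.applyColorMap(fluo_reg, cv2.COLORMAP_JET)
        fused = cv2.addWeighted(scene, 0.7, fluo_color, 0.3, 0)

        # Fused frame would be routed to the HMD or Google Glass display.
        cv2.imshow("augmented view", fused)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break

    scene_cam.release()
    fluo_cam.release()
    cv2.destroyAllWindows()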

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Equipment Design
  • Image Processing, Computer-Assisted / instrumentation*
  • Neoplasms / surgery*
  • Software
  • Surgery, Computer-Assisted / instrumentation*
  • Wireless Technology