Augmented Reality (AR) has the potential to assist surgeons by visualizing useful medical information, such as the patient's scanned images. The key challenges for surgical AR applications are choosing the appropriate information to visualize, the visualization technique, the AR user interface, and validation of the system. In neurosurgery especially, AR visualization can add minimal invasiveness and maximal safety to existing neuronavigation systems. The neurosurgical field is often small, and avoiding inadvertent injury to vascular or nervous structures is challenging because the surgeon must take the smallest possible path to a given intracranial pathology.
Our current setup includes six OptiTrack cameras that use optical markers for real-time tracking, and a HoloLens 2 headset as the AR device. OptiTrack tracks four rigid-body models: a surgical tool, the HoloLens 2 headset, a phantom head model, and a localization marker. An optimal number of optical markers is attached to each model for accurate and stable tracking by OptiTrack. The localization marker model is used for calibration between the two different coordinate systems of the OptiTrack system and HoloLens 2.
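The cross-coordinate-system calibration step can be sketched as a rigid-transform estimation from corresponding 3D points of the localization marker observed in both frames. The sketch below uses the standard Kabsch/SVD method; the function name and the use of point correspondences (rather than full marker poses) are illustrative assumptions, not the system's actual implementation.

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ≈ src @ R.T + t.

    src, dst: (N, 3) arrays of corresponding marker points, e.g. the
    localization marker's points in the OptiTrack frame (src) and the
    same points expressed in the HoloLens 2 frame (dst).
    """
    # Center both point sets about their centroids.
    src_centered = src - src.mean(axis=0)
    dst_centered = dst - dst.mean(axis=0)

    # Cross-covariance matrix and its SVD.
    H = src_centered.T @ dst_centered
    U, _, Vt = np.linalg.svd(H)

    # Correct for a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

    # Translation maps the source centroid onto the destination centroid.
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Once R and t are known, any point tracked by OptiTrack can be re-expressed in HoloLens 2 coordinates for hologram placement.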
The 3D model of the brain ventricles was created in 3D Slicer from a patient's CT scan. The skull frame was also extracted into the 3D model to provide a reference for the ventricles' location within the head and to evaluate the alignment. One of our preliminary works (by Emily, REU 2021) implemented the system as an Android app using ARCore. Mobile AR is limited in its computation, robustness, and interoperability, and is thus not an optimal host AR device for medical applications that demand a high level of precision and robustness from the system.
While registering the ventricle hologram inside the skull compensates for the surgeon's limited field of view, the surgeon is still given little guidance about the surgical task itself. The catheter trajectory must be determined by identifying anatomical landmarks and carefully calculated by checking the angle and depth of catheter insertion. We therefore integrate marker-based tracking of the EVD catheter, which enables contextual guidance such as the distance to the target and the angle of the catheter trajectory, calculated in real time.
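The two guidance quantities mentioned above can be computed directly from the tracked tool pose. The following is a minimal sketch, assuming the tracker reports the catheter tip position and axis direction, and that a target point and planned insertion direction are available from the registered ventricle model; all names here are hypothetical.

```python
import numpy as np

def guidance_metrics(tip, target, tool_axis, planned_axis):
    """Contextual guidance from a tracked catheter pose.

    tip, target: (3,) positions of the catheter tip and the anatomical target.
    tool_axis, planned_axis: (3,) direction vectors of the current catheter
    axis and the planned insertion trajectory (need not be unit length).

    Returns (distance to target, angular deviation in degrees).
    """
    # Euclidean distance remaining to the target.
    distance = float(np.linalg.norm(np.asarray(target) - np.asarray(tip)))

    # Angle between the normalized tool axis and the planned trajectory.
    u = np.asarray(tool_axis) / np.linalg.norm(tool_axis)
    v = np.asarray(planned_axis) / np.linalg.norm(planned_axis)
    angle = float(np.degrees(np.arccos(np.clip(u @ v, -1.0, 1.0))))
    return distance, angle
```

In an AR overlay, these two numbers would be recomputed every tracking frame and rendered next to the catheter hologram.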
This work is supported by Dr. Maria Gorlatova's NSF grants CNS-1908051 and CAREER-2046072.
NeuroLens: Augmented Reality-based Contextual Guidance through Surgical Tool Tracking in Neurosurgery [Link]
S. Eom, D. Sykes, S. Rahimpour, M. Gorlatova.
In IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Oct. 2022. (Acceptance Rate: 21%)
AR-Assisted Surgical Guidance System for Ventriculostomy [Link]
S. Eom, S. Kim, S. Rahimpour, M. Gorlatova.
In IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Mar. 2022.
The Dawning of the Age of the Metaverse, 2022 Duke ECE Magazine, Oct. 2022 [Link]
Current Team Members
Sarah Sangjun Eom, ECE, Duke University (Primary)
Dr. Maria Gorlatova, ECE, Duke University
Dr. Shervin Rahimpour, Department of Neurosurgery, University of Utah
Dr. Joshua Jackson, Department of Neurosurgery, Duke University
David Sykes, School of Medicine, Duke University
Tiffany Ma, Undergraduate in CS, Duke University
Vanessa Tang, High School Student, NC School of Science and Math
Former Team Members
Neha Vutakuri, Undergraduate in Neuroscience, Duke University (Fall 2022 – Spring 2023) [Graduation with Distinction]
Seijung Kim, Undergraduate in BME & CS, Duke University (Fall 2021 – Spring 2023) [Howard G. Clark Award, Graduation with Distinction]
Emily Eisele, Undergraduate in BME, Widener University (REU Program, Summer 2021)