
NeuroLens: AR-assisted Neurosurgery

Augmented Reality (AR) has the potential to assist surgeons by visualizing useful medical information, such as a patient's scanned images, directly in the surgical field. The key challenges for surgical AR applications are choosing the appropriate information to visualize and how to visualize it, designing the AR user interface, and validating the system. In neurosurgery especially, AR visualization can add minimal invasiveness and maximal safety to existing neuronavigation systems. The neurosurgical field of view is often small, and because the surgeon must take the smallest possible path to a given intracranial pathology, avoiding inadvertent injury to vascular or nervous structures is a challenge.

Our current setup includes six Optitrack cameras that use optical markers for real-time tracking, and a HoloLens 2 headset as the AR device. Optitrack tracks four rigid-body models: a surgical tool, the HoloLens 2 headset, a phantom head model, and a localization marker. An optimal number of optical markers is attached to each model for accurate and stable tracking by Optitrack. The localization marker model is used to calibrate between the two coordinate systems of the Optitrack system and the HoloLens 2.

Setup of Optitrack cameras facing the user (surgeon), who performs an external ventricular drain (EVD) procedure using a catheter as the surgical tool
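The calibration between the Optitrack and HoloLens 2 coordinate systems can be estimated from the localization marker's position as observed in both frames. Below is a minimal sketch of one standard way to do this, a rigid point-set alignment (Kabsch algorithm) over corresponding marker positions; the function name and interface are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

def rigid_transform(p_src, p_dst):
    """Estimate rotation R and translation t mapping p_src -> p_dst
    (Kabsch algorithm, no scaling). p_src and p_dst are (N, 3) arrays of
    corresponding marker positions observed in the two coordinate frames."""
    c_src = p_src.mean(axis=0)
    c_dst = p_dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (p_src - c_src).T @ (p_dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

With the localization marker's points expressed in the Optitrack frame as `p_src` and in the HoloLens 2 frame as `p_dst`, the returned `(R, t)` maps any Optitrack-tracked pose into the headset's frame for hologram placement.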

The 3D model of the brain ventricles was created in the 3D Slicer software from a patient's CT scan. A skull frame is extracted into the 3D model as well, providing a reference for the ventricles' location within the head and a way to evaluate alignment. One of our preliminary works (by Emily, REU 2021) implemented the system as an Android app using ARCore. Mobile AR is limited in computation, robustness, and interoperability, and is thus not an optimal host AR platform for medical applications that demand high precision and robustness from the system.

Extraction of a 3D model of the brain ventricles from a CT scan, and alignment of the extracted model with a phantom human head for a mobile AR application
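Conceptually, the first step of extracting ventricles from a CT volume is intensity thresholding, since cerebrospinal fluid occupies an approximately known Hounsfield-unit range (roughly 0–15 HU). The sketch below shows only that thresholding step in plain NumPy; the function name and default bounds are illustrative assumptions, and the actual pipeline in 3D Slicer involves interactive refinement and surface-mesh generation on top of this.

```python
import numpy as np

def segment_csf(ct_hu, low=0.0, high=15.0):
    """Return a binary mask of voxels whose intensity falls in the
    (approximate) cerebrospinal-fluid Hounsfield-unit range.
    ct_hu is a NumPy array of CT intensities in Hounsfield units."""
    return (ct_hu >= low) & (ct_hu <= high)
```

The resulting mask would then be cleaned up (e.g., by keeping the largest connected components near the midline) and converted to a surface mesh for the AR hologram.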

While registering the ventricle hologram inside the skull compensates for the surgeon's limited field of view, the surgeon still receives little guidance about the surgical task itself. The catheter trajectory must be determined by identifying anatomical landmarks and carefully verified by checking the angle and depth of catheter insertion. We therefore integrate marker-based tracking of the EVD catheter, which enables contextual guidance such as the distance to the target and the angle of the catheter trajectory, calculated in real time.

Ventricle hologram overlaid on the phantom model, with contextual guidance alerting the user to the real-time distance to the target point and the angle of the catheter trajectory
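The two guidance quantities described above reduce to simple geometry on the tracked tool pose: a Euclidean distance from the catheter tip to the target, and the angle between the catheter's axis and the planned trajectory. A minimal sketch, with illustrative (assumed) names rather than the project's actual code:

```python
import numpy as np

def guidance_metrics(tip, tool_dir, target, planned_dir):
    """Compute contextual-guidance values from a tracked catheter pose.

    tip, target     -- (3,) positions in a common coordinate frame
    tool_dir        -- (3,) direction vector along the catheter axis
    planned_dir     -- (3,) direction vector of the planned trajectory
    Returns (distance to target, angular deviation in degrees)."""
    dist = float(np.linalg.norm(target - tip))
    u = tool_dir / np.linalg.norm(tool_dir)
    v = planned_dir / np.linalg.norm(planned_dir)
    # Clip to avoid arccos domain errors from floating-point round-off
    angle_deg = float(np.degrees(np.arccos(np.clip(u @ v, -1.0, 1.0))))
    return dist, angle_deg
```

Both values are recomputed every tracking frame, so the AR display can continuously update the distance readout and flag trajectories that deviate beyond an angular tolerance.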

This work is supported by Dr. Maria Gorlatova's NSF grants CNS-1908051 and CAREER-2046072.


Related Publications

Did You Do Well? Real-time Personalized Feedback on Catheter Placement in Augmented Reality-assisted Neurosurgical Training [Video]
S. Eom, T. Ma, N. Vutakuri, A. Du, Z. Qu, J. Jackson, M. Gorlatova.
To appear in IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Mar. 2024.

Did I Do Well? Personalized Assessment of Trainees’ Performance in Augmented Reality-assisted Neurosurgical Training [Link][Github Repos: Segmentation, PhantomSensing, HandGesture]
S. Eom, T. Ma, N. Vutakuri, T. Hu, J. Jackson, M. Gorlatova.
To appear in IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Mar. 2024.

Accuracy of routine external ventricular drain placement following a mixed reality-guided twist-drill craniostomy [Link]
S. Eom, T. Ma, N. Vutakuri, T. Hu, A. P. Haskell-Mendoza, D. W. Sykes, M. Gorlatova, J. Jackson.
In Neurosurgical Focus 56 (1), E11 (special issue for Mixed Reality in Neurosurgery), Jan. 2024.

NeuroLens: Augmented Reality-based Contextual Guidance through Surgical Tool Tracking in Neurosurgery [Link]
S. Eom, D. Sykes, S. Rahimpour, M. Gorlatova.
In IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Oct. 2022. (Acceptance Rate: 21%)

AR-Assisted Surgical Guidance System for Ventriculostomy [Link]
S. Eom, S. Kim, S. Rahimpour, M. Gorlatova.
In IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Mar. 2022.


Media Coverage

The Dawning of the Age of the Metaverse, 2022 Duke ECE Magazine, Oct. 2022 [Link]


Current Team Members
Sarah Sangjun Eom, ECE, Duke University (Primary)
Dr. Maria Gorlatova, ECE, Duke University
Dr. Shervin Rahimpour, Department of Neurosurgery, University of Utah
Dr. Joshua Jackson, Department of Neurosurgery, Duke University
David Sykes, School of Medicine, Duke University
Tiffany Ma, Undergraduate in CS, Duke University
Vanessa Tang, High School Student, NC School of Science and Math


Former Team Members
Neha Vutakuri, Undergraduate in Neuroscience, Duke University (Fall 2022 – Spring 2023) [Graduation with Distinction]
Seijung Kim, Undergraduate in BME & CS, Duke University (Fall 2021 – Spring 2023) [Howard G. Clark Award, Graduation with Distinction]
Emily Eisele, Undergraduate in BME, Widener University (REU Program, Summer 2021)
