
NeuroLens: AR-assisted Neurosurgery

Augmented Reality (AR) can guide surgeons by visualizing the patient's anatomy and other medical information as holograms. This AR visualization enhances the surgeon's perception and field of view and increases the surgeon's confidence in performing surgical tasks. Neurosurgery is a natural fit for AR guidance: holograms of brain anatomy can be overlaid directly inside the patient's head, adding minimal invasiveness and greater safety to existing neuronavigation systems. The surgical field of view in neurosurgery is often small, and avoiding inadvertent injury to vascular or nervous structures is challenging because the surgeon must follow the smallest possible path to a given intracranial pathology. We introduce NeuroLens, the first AR system that provides both anatomical visualization of the patient's ventricles as a hologram and contextual guidance on catheter placement, aiding novice surgeons in placing an external ventricular drain (EVD) in both training and clinical settings.


Our current setup includes six OptiTrack cameras that track passive optical markers in real time and a HoloLens 2 headset as the AR device. OptiTrack tracks four rigid-body models: 1) the surgical tool, 2) the HoloLens 2 headset, 3) the phantom head model, and 4) a localization marker. An optimized number of optical markers is attached to each model to ensure accurate and stable tracking. The localization marker is used to compute the transformation between the world coordinates of the OptiTrack system and those of the HoloLens 2 [VRW'22].
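As a minimal sketch of this coordinate alignment, the snippet below composes rigid-body transforms with NumPy: given the localization marker's pose observed in both the OptiTrack frame and the HoloLens world frame, it derives the OptiTrack-to-HoloLens transform and maps a tracked point (e.g., the tip of the surgical tool) into HoloLens coordinates. The 4x4 pose matrices and function names here are illustrative assumptions, not our actual implementation.

```python
import numpy as np

def pose_matrix(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def optitrack_to_hololens(T_opti_marker, T_holo_marker):
    """Transform mapping OptiTrack world coordinates into HoloLens world coordinates.

    Both arguments are 4x4 poses of the same localization marker,
    observed in the OptiTrack frame and the HoloLens frame respectively.
    """
    return T_holo_marker @ np.linalg.inv(T_opti_marker)

# Hypothetical example poses (identity rotations, offsets in meters).
T_opti_marker = pose_matrix(np.eye(3), np.array([1.0, 0.2, 0.5]))
T_holo_marker = pose_matrix(np.eye(3), np.array([0.0, 0.0, 1.0]))

T_holo_opti = optitrack_to_hololens(T_opti_marker, T_holo_marker)

# Map a tool-tip position tracked by OptiTrack into HoloLens coordinates.
tip_opti = np.array([1.1, 0.25, 0.55, 1.0])  # homogeneous point
tip_holo = T_holo_opti @ tip_opti
print(tip_holo[:3])
```

Because both systems observe the same physical marker, this one transform suffices to place any OptiTrack-tracked rigid body in the HoloLens wearer's world.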

Overall setup of NeuroLens (left); extraction of a patient-specific anatomical model from a CT scan (right)

To achieve a more realistic visualization of the ventricular hologram in AR, we extracted a sample model of the brain ventricles from an anonymized patient's computed tomography (CT) scan using the 3D Slicer software. We applied intensity thresholding to extract the ventricles, smoothed the resulting 3D ventricular model, and labeled each ventricular part on the model. Additionally, we created a patient-specific phantom model by extracting the skull from the scan and 3D-printing it so that it aligns with the ventricular hologram.
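The thresholding step can also be reproduced outside of 3D Slicer. The sketch below, using SimpleITK, is a minimal illustration: it thresholds a CT volume to an assumed cerebrospinal-fluid intensity range, closes small gaps, and keeps the largest connected component as a ventricle candidate. The Hounsfield-unit range and file names are assumptions for illustration; our released segmentation code (see the GitHub repositories below) is the reference implementation.

```python
import SimpleITK as sitk

# Load a CT volume (the NIfTI path is a placeholder).
ct = sitk.ReadImage("patient_ct.nii.gz")

# Threshold to an assumed CSF intensity range in Hounsfield units;
# ventricles are CSF-filled, so roughly 0-15 HU is a common starting point.
mask = sitk.BinaryThreshold(ct, lowerThreshold=0, upperThreshold=15,
                            insideValue=1, outsideValue=0)

# Morphological closing smooths the binary mask and fills small holes.
mask = sitk.BinaryMorphologicalClosing(mask, [2, 2, 2])

# Keep the largest connected component as the ventricle candidate.
labels = sitk.ConnectedComponent(mask)
labels = sitk.RelabelComponent(labels, sortByObjectSize=True)
ventricles = sitk.BinaryThreshold(labels, 1, 1, 1, 0)

sitk.WriteImage(ventricles, "ventricles_mask.nii.gz")
```

In practice the largest CSF-range component may still include extraventricular fluid, which is why interactive refinement and labeling in 3D Slicer follow the automatic step.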

AR-based contextual guidance in determining the optimal catheter trajectory and targeting the foramen of Monro during the EVD procedure

While registering the ventricular hologram inside the skull enhances the surgeon's limited field of view, it alone provides no guidance on the surgical task itself. The catheter trajectory must be determined by identifying anatomical landmarks and carefully verified by checking the angle and depth of catheter insertion. We integrated AR-based contextual guidance that tracks the EVD catheter, projects its path, and aids targeting of the foramen of Monro. We evaluated the impact of NeuroLens's contextual guidance on surgical performance in an IRB-approved study with 33 medical students (recruited from Duke University and the UNC Chapel Hill School of Medicine) and 9 neurosurgeons (from the Department of Neurosurgery) [ISMAR'22][TVCG'24].
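As a minimal sketch of the underlying geometry (written here with NumPy; the coordinates and variable names are illustrative assumptions, not details of our implementation), the required insertion depth is the distance from the entry point to the foramen of Monro, and the guidance can report how far the tracked catheter's axis deviates from the planned trajectory:

```python
import numpy as np

def unit(v):
    """Normalize a vector to unit length."""
    return v / np.linalg.norm(v)

# Planned trajectory: entry point on the skull and target (foramen of Monro),
# both expressed in hologram coordinates (millimeters); values are illustrative.
entry = np.array([32.0, 80.0, 55.0])
target = np.array([40.0, 45.0, 10.0])

planned_dir = unit(target - entry)
insertion_depth_mm = np.linalg.norm(target - entry)

# Tracked catheter axis from two points (hub and tip) on the rigid body.
catheter_hub = np.array([30.0, 84.0, 60.0])
catheter_tip = np.array([33.5, 76.0, 50.0])
catheter_dir = unit(catheter_tip - catheter_hub)

# Angular deviation between the catheter and the planned trajectory.
cos_angle = np.clip(np.dot(catheter_dir, planned_dir), -1.0, 1.0)
deviation_deg = np.degrees(np.arccos(cos_angle))

# Remaining distance from the catheter tip to the target.
remaining_mm = np.linalg.norm(target - catheter_tip)

print(f"depth: {insertion_depth_mm:.1f} mm, "
      f"deviation: {deviation_deg:.1f} deg, "
      f"to target: {remaining_mm:.1f} mm")
```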

AR-assisted craniostomy guiding the surgeon to an optimal entry point for EVD placement, and our sensor-embedded patient-specific phantom model computing the catheter's distance to target in real time

Using our robust optical-marker-based AR system, we expanded our work to guide the surgeon in finding an optimal entry point for the external ventricular drain during the craniostomy procedure. This work includes a camera-sensor-embedded phantom model that computes the distance from the catheter tip to the target point in real time, eliminating the need for CT scans in post-hoc accuracy analysis. We conducted a study with 49 medical students and demonstrated that AR guidance improved distance-to-target accuracy [NeurosurgicalFocus'24]. Furthermore, we integrated machine-learning (ML)-based hand gesture recognition and eye tracking to evaluate surgeons' overall performance and provide constructive feedback for improving specific technical skills during training. This work was presented at IEEE VR 2024 as part of the workshop and demo programs [VRW'24][VRDemo'24].
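A minimal sketch of how an embedded camera could report distance to target in real time is shown below, using OpenCV. It assumes a color-marked catheter tip, a pre-computed pixels-to-millimeters scale from camera calibration, and a known target location in the image; the HSV range, scale factor, and camera index are all illustrative assumptions rather than details of our phantom (see the camera-embedded distance-calculation repository below for the actual code).

```python
import cv2
import numpy as np

# Illustrative assumptions: calibration scale, target pixel location,
# and an HSV range for a color-marked catheter tip.
MM_PER_PIXEL = 0.42
TARGET_PX = np.array([318.0, 242.0])
HSV_LO, HSV_HI = (35, 80, 80), (85, 255, 255)  # greenish marker

cap = cv2.VideoCapture(0)  # camera embedded in the phantom
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Segment the marked catheter tip by color.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, HSV_LO, HSV_HI)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        # Take the largest blob as the tip and locate its center.
        blob = max(contours, key=cv2.contourArea)
        (x, y), _ = cv2.minEnclosingCircle(blob)

        # Convert the pixel offset from the target into millimeters.
        dist_mm = np.linalg.norm(np.array([x, y]) - TARGET_PX) * MM_PER_PIXEL
        cv2.putText(frame, f"{dist_mm:.1f} mm", (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)

    cv2.imshow("distance to target", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```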


This work was supported in part by NSF grants CNS-2112562 and CNS-1908051 and NSF CAREER Award IIS-2046072, by a Thomas Lord Educational Innovation Grant, and by an AANS Neurosurgery Technology Development Grant.

GitHub Repositories


Thresholding-based Automatic Segmentation of Brain Ventricle using CT Scans [Link]
Real-time EVD Distance Calculation using Camera-embedded Phantom Model [Link]
Deep-learning-based Surgical Task Recognition using HoloLens Hand Tracking [Link]


Media Coverage


NSF Athena AI Institute's Demo on Capitol Hill, Sep. 2023 [Highlighted in NSF Director's Weekly Newsletter][Related LinkedIn Post]
The Dawning of the Age of the Metaverse, 2022 Duke ECE Magazine, Oct. 2022 [Link]

NSF Director trying out our demo during the AI Institute Showcase on Capitol Hill, Washington, DC, 2024 (left); our demo presented at the IEEE VR '24 conference in Orlando, FL (right).


Related Publications


Augmented Reality-based Contextual Guidance through Surgical Tool Tracking in Neurosurgery [Link]
S. Eom, S. Kim, J. Jackson, D. Sykes, S. Rahimpour, M. Gorlatova.
To appear in IEEE Transactions on Visualization and Computer Graphics (TVCG), 2024.

Did You Do Well? Real-time Personalized Feedback on Catheter Placement in Augmented Reality-assisted Neurosurgical Training [Link][Video]
S. Eom, T. Ma, N. Vutakuri, A. Du, Z. Qu, J. Jackson, M. Gorlatova.
In IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Mar. 2024.

Did I Do Well? Personalized Assessment of Trainees' Performance in Augmented Reality-assisted Neurosurgical Training [Link][GitHub Repos: Segmentation, PhantomSensing, HandGesture]
S. Eom, T. Ma, N. Vutakuri, T. Hu, J. Jackson, M. Gorlatova.
In IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Mar. 2024.

Accuracy of Routine External Ventricular Drain Placement Following a Mixed Reality-guided Twist-drill Craniostomy [Link]
S. Eom, T. Ma, N. Vutakuri, T. Hu, A. P. Haskell-Mendoza, D. W. Sykes, M. Gorlatova, J. Jackson.
In Neurosurgical Focus 56 (1), E11 (special issue on Mixed Reality in Neurosurgery), Jan. 2024.

NeuroLens: Augmented Reality-based Contextual Guidance through Surgical Tool Tracking in Neurosurgery [Link]
S. Eom, D. Sykes, S. Rahimpour, M. Gorlatova.
In IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Oct. 2022. (Acceptance Rate: 21%)

AR-Assisted Surgical Guidance System for Ventriculostomy [Link]
S. Eom, S. Kim, S. Rahimpour, M. Gorlatova.
In IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Mar. 2022.


Current Team Members


Sarah Sangjun Eom, ECE, Duke University (Primary)
Dr. Maria Gorlatova, ECE, Duke University
Dr. Shervin Rahimpour, Department of Neurosurgery, University of Utah
Dr. Joshua Jackson, Department of Neurosurgery, Duke University


Former Team Members


Tiffany Ma, Undergraduate in CS, Duke University (Fall 2022 – Summer 2024) [Graduation with High Distinction]
David Sykes, School of Medicine, Duke University
Vanessa Tang, High School Student, NC School of Science and Math (Fall 2023 – Spring 2024)
Neha Vutakuri, Undergraduate in Neuroscience, Duke University (Fall 2022 – Spring 2023) [Graduation with Distinction]
Seijung Kim, Undergraduate in BME & CS, Duke University (Fall 2021 – Spring 2023) [Howard G. Clark Award, Graduation with Distinction]
Emily Eisele, Undergraduate in BME, Widener University (REU Program, Summer 2021)
