Brigham Research Institute Poster Session

Maud Boreel



Job Title

Research Trainee




Ryan A. Bartholomew, MD; Haoyin Zhou, PhD; Alejandro Garcia, MD; Maud Boreel; Jeffrey P. Guenette, MD; Nir Ben-Shlomo, MD; Krish Suresh, MD; Daniel J. Lee, MD; Jayender Jagadeesan, PhD; C. Eduardo Corrales, MD

Principal Investigator

Jayender Jagadeesan, PhD; C. Eduardo Corrales, MD

Research Category: Digital Health, Imaging, and Informatics


Lateral skull base surgical navigation using stereoscopic surface reconstruction

Scientific Abstract

Purpose: We seek to validate a stereoscopic surface reconstruction approach to surgical navigation in the lateral skull base. To relate the exposed tissue surface to the underlying anatomy, a 3D surface reconstruction is created in real time with 3D endoscopy and fused with preoperative CT and MRI images.

Methods: During multiple steps of a cadaveric translabyrinthine dissection, two 3D models are generated. One model is created by stitching video frames captured with 3D endoscopy using stereo matching. This 3D surface model is then aligned to a 3D segmented CT model with a novel feature-based simultaneous localization and mapping method. Models are created in 3D Slicer and fused using artificial fiducials (1.2 mm screws). Registration accuracy is assessed by calculating fiducial target registration errors (TREs).
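The accuracy metric described above — the target registration error at each fiducial — is simply the Euclidean distance between corresponding fiducial locations in the two registered models. The sketch below is a minimal illustration of that computation, not the authors' implementation; the point coordinates, function name, and the three-screw layout are hypothetical.

```python
import numpy as np

def target_registration_errors(ct_fiducials, surface_fiducials):
    """Per-fiducial Euclidean distances (mm) after registration.

    ct_fiducials, surface_fiducials: (N, 3) arrays holding the same
    screw-head locations as measured in the segmented CT model and in
    the registered endoscopic surface model, respectively.
    """
    ct = np.asarray(ct_fiducials, dtype=float)
    surface = np.asarray(surface_fiducials, dtype=float)
    return np.linalg.norm(ct - surface, axis=1)

# Hypothetical coordinates (mm) for three 1.2 mm screw fiducials.
ct_points = np.array([[10.0, 4.0, 2.0],
                      [12.5, 6.0, 3.0],
                      [ 9.0, 8.0, 1.5]])
# Simulated registered surface-model positions: CT points plus a
# small residual misalignment at each fiducial.
surface_points = ct_points + np.array([[ 0.3, -0.2,  0.1],
                                       [-0.4,  0.1,  0.2],
                                       [ 0.2,  0.3, -0.1]])

tre = target_registration_errors(ct_points, surface_points)
print("mean TRE: %.2f mm, SD: %.2f mm" % (tre.mean(), tre.std()))
```

Summarizing these per-fiducial distances as a mean and standard deviation yields the accuracy figures reported in the Results below.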

Results: At five timepoints during translabyrinthine surgery in two cadaveric specimens, we generated surface models using a few seconds of 3D endoscopic video and 3D segmented CT models. These models were fused with a mean TRE of 0.76 mm (standard deviation 0.44 mm).

Conclusions: Our preliminary findings suggest that these stereoscopic surface reconstructions may provide surgeons with a navigation technique by which they can “see through” opaque bone in real time while drilling within millimeters of critical structures.

Lay Abstract

Operating in the lateral skull base requires drilling within millimeters of critical anatomical structures. To assist surgeons during these delicate surgeries, we are developing a novel surgical navigation system using 3D endoscopy and a novel algorithm that stitches together 3D video frames to create 3D models of the surgical field. This surface model can be fused to 3D models generated from patient radiology scans (i.e., CT and MRI scans) taken before the operation, allowing surgeons to better understand the relationship of the exposed tissue surface to the underlying anatomy.
To validate this surgical navigation technique, we are evaluating it while performing lateral skull base surgery on cadaveric specimens. Preliminary experiments were done in two cadaveric specimens, with surface models and CT scans taken at five timepoints during the surgery. These early results demonstrated high levels of fusion accuracy between the 3D surface and CT models, with the models offset from each other by less than 1 mm. Further refinement of this technique in cadaveric experiments is planned prior to evaluation in the hospital operating room.

Clinical Implications

A novel approach to surgical navigation using 3D endoscopy may one day allow surgeons to “see through” opaque bone, thereby providing a visual buffer zone while drilling within millimeters of critical structures.