Adding AR derived from images aids surgeons in endonasal procedures

Adding computer-generated augmented reality to an endoscope can improve navigation in endonasal skull base surgery.


Such an approach can improve surgeons' performance during these procedures, researchers contend.

Surgical navigation is a well-established tool in skull base surgery conducted through the nose. This minimally invasive way to treat various conditions provides shorter hospitalizations, reduced post-operative pain and fewer complications than open surgery requiring external incisions. The navigational and endoscopic views are usually displayed on separate monitors, forcing the surgeon to focus on one or the other.

However, a new navigation technique integrates the two views in real time by using augmented reality, in which real-world objects are enhanced with overlaid computer-generated information.

In endoscopy, augmented reality can be used to add detail to the live video stream from the endoscope with overlaid image data, such as from MRIs or CT scans. Thus, a computer-generated image of a pre-planned surgical target, path or risk area can be integrated into the endoscope’s view.
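As a rough illustration only, and not the study's actual software, overlaying a point from a preoperative scan onto endoscopic video typically amounts to applying a registration transform followed by a camera projection. The minimal Python sketch below assumes a known rigid registration matrix and a calibrated pinhole camera; all names and values are hypothetical.

```python
import numpy as np

def project_ct_point_to_endoscope(p_ct, T_cam_from_ct, K):
    """Map a 3D point from CT/MRI space into endoscope pixel coordinates.

    p_ct          : (3,) point in scan coordinates (mm)
    T_cam_from_ct : (4, 4) rigid registration from scan space to camera space
    K             : (3, 3) intrinsic matrix of the calibrated endoscope camera
    """
    p_h = np.append(p_ct, 1.0)        # homogeneous coordinates
    p_cam = T_cam_from_ct @ p_h       # transform into camera space
    uvw = K @ p_cam[:3]               # pinhole projection
    return uvw[:2] / uvw[2]           # pixel position (u, v)

# Example: a planned target 80 mm in front of the camera projects near the
# centre of a hypothetical 1280x720 endoscope image.
K = np.array([[900.0, 0.0, 640.0],
              [0.0, 900.0, 360.0],
              [0.0,   0.0,   1.0]])
T = np.eye(4)  # identity registration, for the demo only
print(project_ct_point_to_endoscope(np.array([0.0, 0.0, 80.0]), T, K))
```

In practice the registration transform would come from the navigation system's tracking and the intrinsics from endoscope calibration; the sketch only shows the geometric principle behind the overlay.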

“Using augmented reality for surgical navigation has several potential benefits compared to conventional navigation with display of 2D medical imaging on a separate screen. Overlaying segmented anatomical structures from CT or MRI on the endoscopic video stream enables navigation without the use of dedicated instruments and thereby improves workflow, while visualizing sub-surface anatomy,” the study authors stated.

The researchers, from the Eindhoven University of Technology in Eindhoven, the Netherlands, and the Karolinska Institutet in Stockholm, Sweden, took a new approach and integrated an endoscope into an augmented reality surgical navigation system, so that intraoperative cone beam computed tomography (CBCT) imaging data could be overlaid on the endoscopic view during surgery.

The aim of the study was to determine the accuracy of this technique, which the researchers tested in a laboratory using a 3D-printed skull.

The general consensus, according to the study authors, is that the navigation error must be less than two millimeters for surgical navigation to be considered accurate.

The researchers found that their technique achieved "submillimeter" accuracy, with mean errors ranging from 0.39 to 0.68 millimeters. The maximum outlier error was 1.43 millimeters, still well below the two-millimeter standard.
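As a simple illustration of how such errors can be quantified, and not the study's actual protocol, the Python sketch below computes the Euclidean distance between hypothetical planned target positions and their overlaid positions on a phantom, then checks the worst case against the two-millimeter consensus threshold. All coordinates are made up for the example.

```python
import numpy as np

def target_registration_error(planned_mm, measured_mm):
    """Euclidean distance (mm) between planned targets and their overlaid positions."""
    return np.linalg.norm(planned_mm - measured_mm, axis=1)

# Hypothetical fiducial coordinates on a 3D-printed skull phantom (mm).
planned = np.array([[10.0, 22.0, 5.0],
                    [35.0, 18.0, 7.5],
                    [28.0, 40.0, 3.0]])
# Hypothetical positions where the overlay actually landed.
measured = planned + np.array([[ 0.3, -0.2, 0.1],
                               [ 0.1,  0.4, -0.3],
                               [-0.2,  0.1, 0.5]])

errors = target_registration_error(planned, measured)
print("errors (mm):", errors.round(2))
print("max error (mm):", errors.max().round(2))
print("within 2 mm threshold:", bool(errors.max() < 2.0))
```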

The study was published January 16 in PLOS One.

The researchers noted that additional testing was needed to confirm the results. “The system shows great potential for clinical use in endoscopic skull base surgery, and further development is warranted,” the researchers concluded.
