DOI: 10.1055/s-0041-1735509
Augmented Reality for Retrosigmoid Craniotomy Planning
Abstract
While medical imaging data have traditionally been viewed on two-dimensional (2D) displays, augmented reality (AR) allows physicians to project medical imaging data onto the patient's body to locate important anatomy. We present a surgical AR application to plan the retrosigmoid craniotomy, a standard approach to access the posterior fossa and the internal auditory canal. As a simple and accurate alternative to surface landmarks and conventional surgical navigation systems, our AR application augments the surgeon's vision to guide the optimal location of cortical bone removal. In this work, two surgeons performed a retrosigmoid approach 14 times on eight cadaver heads. In each case, the surgeon manually aligned a computed tomography (CT)-derived virtual rendering of the sigmoid sinus with the real cadaveric head using a see-through AR display, allowing the surgeon to plan and perform the craniotomy accordingly. Postprocedure CT scans were acquired to assess the accuracy of the retrosigmoid craniotomies with respect to their intended location relative to the dural sinuses. Across their respective cases, the two surgeons achieved mean margins of d_avg = 0.6 ± 4.7 mm and d_avg = 3.7 ± 2.3 mm between the osteotomy border and the dural sinuses, with positive margins in 12 of the 14 cases. The intended surgical approach to the internal auditory canal was successfully achieved in all cases using the proposed method, and the relatively small and consistent margins suggest that our system has the potential to be a valuable tool to facilitate planning a variety of similar skull-base procedures.
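The accuracy figures above reduce to the mean and standard deviation of signed margins measured on the post-procedure CT, where a negative margin indicates that the osteotomy border crossed onto the dural sinuses. The following is a minimal Python sketch of that summary computation; the per-case margin values are hypothetical placeholders for illustration, not the measurements reported in this study.

```python
import statistics

# Hypothetical signed margins (mm) between the osteotomy border and the
# dural sinuses, one value per case. A negative value would mean the
# craniotomy encroached on the sinus. These numbers are placeholders,
# not data from the cadaver experiments.
margins_surgeon_1 = [5.2, -3.1, 4.0, 0.8, -4.6, 1.3, 0.6]
margins_surgeon_2 = [3.9, 6.1, 1.2, 5.0, 2.4, 4.7, 2.6]

def summarize(margins):
    """Return the mean margin, its standard deviation, and the number of
    cases with a positive (safe) margin."""
    mean = statistics.mean(margins)
    sd = statistics.stdev(margins)
    positive = sum(1 for m in margins if m > 0)
    return mean, sd, positive

for label, margins in (("surgeon 1", margins_surgeon_1),
                       ("surgeon 2", margins_surgeon_2)):
    mean, sd, positive = summarize(margins)
    print(f"{label}: d_avg = {mean:.1f} ± {sd:.1f} mm, "
          f"{positive}/{len(margins)} cases with positive margins")
```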
Keywords
augmented reality - craniotomy - retrosigmoid approach - acoustic neuroma - schwannoma - magicleap
* These authors contributed equally.
Publication History
Received: 04 December 2020
Accepted: 28 July 2021
Article published online: 10 September 2021
© 2021. Thieme. All rights reserved.
Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany