1
Grunert R, Snyderman CH, Gardner P, Busse M, Ahner L, Kropla F, Möbius R, Jung S, Scholz S, Güresir E, Winkler D. NextLens-The Next Generation of Surgical Navigation: Proof of Concept of an Augmented Reality System for Surgical Navigation. J Neurol Surg B Skull Base 2024; 85:363-369. [PMID: 38966300 PMCID: PMC11221910 DOI: 10.1055/a-2083-7766]
Abstract
Objective The aim of this work was the development of an augmented reality system providing the functionality of conventional surgical navigation systems. Methods Application software was developed for Microsoft's augmented reality headset HoloLens 2. It detects the positions of the patient and of surgical instruments in real time and displays them within two-dimensional (2D) magnetic resonance imaging (MRI) or computed tomography (CT) images. The surgical pointer instrument, carrying a pattern recognized by the HoloLens 2 sensors, was created with three-dimensional (3D) printing. The technical concept was demonstrated on a cadaver skull by identifying anatomical landmarks. Results With the help of the HoloLens 2 and its sensors, the real-time position of the surgical pointer instrument could be shown. The position of the 3D-printed pointer with its colored pattern could be recognized within 2D CT images on a cadaver skull, both when stationary and in motion. Feasibility was demonstrated for the clinical application of transsphenoidal pituitary surgery. Conclusion The HoloLens 2 has high potential for use as a surgical navigation system. Subsequent studies will evaluate its accuracy further to obtain valid data for comparison with conventional surgical navigation systems. Beyond transsphenoidal pituitary surgery, the system could also be applied in other surgical disciplines.
Affiliation(s)
- Ronny Grunert
- Department of Neurosurgery, University Leipzig, Leipzig, Germany
- Fraunhofer Plastics Technology Center Oberlausitz, Fraunhofer Institute for Machine Tools and Forming Technology, Zittau, Germany
- Carl-Henry Snyderman
- Center for Skull Base Surgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania, United States
- Paul Gardner
- Center for Skull Base Surgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania, United States
- Michel Busse
- Department of Neurosurgery, University Leipzig, Leipzig, Germany
- Lukas Ahner
- Department of Neurosurgery, University Leipzig, Leipzig, Germany
- Fabian Kropla
- Department of Neurosurgery, University Leipzig, Leipzig, Germany
- Robert Möbius
- Department of Neurosurgery, University Leipzig, Leipzig, Germany
- Svenja Jung
- Department of Neurosurgery, University Leipzig, Leipzig, Germany
- Sebastian Scholz
- Fraunhofer Plastics Technology Center Oberlausitz, Fraunhofer Institute for Machine Tools and Forming Technology, Zittau, Germany
- Erdem Güresir
- Department of Neurosurgery, University Leipzig, Leipzig, Germany
- Dirk Winkler
- Department of Neurosurgery, University Leipzig, Leipzig, Germany
2
Qi Z, Jin H, Xu X, Wang Q, Gan Z, Xiong R, Zhang S, Liu M, Wang J, Ding X, Chen X, Zhang J, Nimsky C, Bopp MHA. Head model dataset for mixed reality navigation in neurosurgical interventions for intracranial lesions. Sci Data 2024; 11:538. [PMID: 38796526 PMCID: PMC11127921 DOI: 10.1038/s41597-024-03385-y]
Abstract
Mixed reality navigation (MRN) technology is emerging as an increasingly significant and interesting topic in neurosurgery. MRN enables neurosurgeons to "see through" the head with an interactive, hybrid visualization environment that merges virtual- and physical-world elements. Offering immersive, intuitive, and reliable guidance for preoperative and intraoperative intervention of intracranial lesions, MRN showcases its potential as an economically efficient and user-friendly alternative to standard neuronavigation systems. However, the clinical research and development of MRN systems present challenges: recruiting a sufficient number of patients within a limited timeframe is difficult, and acquiring low-cost, commercially available, medically significant head phantoms is equally challenging. To accelerate the development of novel MRN systems and surmount these obstacles, the study presents a dataset designed for MRN system development and testing in neurosurgery. It includes CT and MRI data from 19 patients with intracranial lesions and derived 3D models of anatomical structures and validation references. The models are available in Wavefront object (OBJ) and Stereolithography (STL) formats, supporting the creation and assessment of neurosurgical MRN applications.
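The OBJ and STL model formats mentioned in this abstract are simple enough to load without specialized tooling. As a hedged illustration (the snippet is not part of the published dataset and assumes an ASCII-encoded STL file; the dataset's files may instead be binary STL), a minimal reader might look like:

```python
def parse_ascii_stl(text):
    """Minimal ASCII STL reader: returns a list of triangles, each a list of
    three (x, y, z) vertex tuples. Binary STL is not handled in this sketch."""
    triangles, current = [], []
    for line in text.splitlines():
        parts = line.split()
        if parts[:1] == ["vertex"]:
            current.append(tuple(float(v) for v in parts[1:4]))
            if len(current) == 3:       # a facet is complete after 3 vertices
                triangles.append(current)
                current = []
    return triangles

# Hypothetical one-facet model, not taken from the published dataset.
stl = """solid demo
facet normal 0 0 1
  outer loop
    vertex 0 0 0
    vertex 1 0 0
    vertex 0 1 0
  endloop
endfacet
endsolid demo"""
tris = parse_ascii_stl(stl)
print(len(tris), tris[0][1])  # → 1 (1.0, 0.0, 0.0)
```

A production pipeline would typically use a mesh library instead, but the format's facet/vertex grammar is exactly this simple.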
Affiliation(s)
- Ziyu Qi
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043, Marburg, Germany.
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China.
- Haitao Jin
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- NCO School, Army Medical University, 050081, Shijiazhuang, China
- Xinghua Xu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Qun Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Zhichao Gan
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Ruochu Xiong
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Department of Neurosurgery, Division of Medicine, Graduate School of Medical Sciences, Kanazawa University, Takara-machi 13-1, 920-8641, Kanazawa, Ishikawa, Japan
- Shiyu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Minghang Liu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Jingyue Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Xinyu Ding
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Xiaolei Chen
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Jiashu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043, Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043, Marburg, Germany
- Miriam H A Bopp
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043, Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043, Marburg, Germany
3
Morley CT, Arreola DM, Qian L, Lynn AL, Veigulis ZP, Osborne TF. Mixed Reality Surgical Navigation System: Positional Accuracy Based on Food and Drug Administration Standard. Surg Innov 2024; 31:48-57. [PMID: 38019844 PMCID: PMC10773158 DOI: 10.1177/15533506231217620]
Abstract
BACKGROUND Computer-assisted surgical navigation systems are designed to improve outcomes by providing clinicians with procedural guidance information. New technologies such as mixed reality offer the potential for more intuitive, efficient, and accurate procedural guidance. The goal of this study is to assess the positional accuracy and consistency of a clinical mixed reality system that utilizes commercially available wireless head-mounted displays (HMDs), custom software, and localization instruments. METHODS Independent teams using second-generation Microsoft HoloLens hardware, Medivis SurgicalAR software, and localization instruments tested the accuracy of the combined system at different institutions, times, and locations. The ASTM F2554-18 consensus standard for computer-assisted surgical systems, as recognized by the U.S. FDA, was used to measure performance. A total of 288 tests were performed. RESULTS The system demonstrated consistent results, with an average accuracy better than one millimeter (0.75 ± 0.37 mm SD). CONCLUSION The independently acquired positional tracking accuracies exceed those of conventional in-market surgical navigation tracking systems and FDA standards. Importantly, this performance was achieved at two different institutions, using an international testing standard, and with a system built on a commercially available off-the-shelf wireless head-mounted display and software.
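The accuracy protocol summarized above reduces to comparing repeatedly tracked positions against known reference positions and reporting the error distribution as mean ± SD. A minimal sketch of that computation (the coordinate values are hypothetical, not the study's data):

```python
import math

def positional_errors(measured, reference):
    """Euclidean distance (mm) between each measured and reference 3-D point."""
    return [math.dist(m, r) for m, r in zip(measured, reference)]

def mean_sd(errors):
    """Mean and sample standard deviation of the error distribution."""
    n = len(errors)
    mean = sum(errors) / n
    var = sum((e - mean) ** 2 for e in errors) / (n - 1)
    return mean, math.sqrt(var)

# Hypothetical repeated probe measurements against known reference positions (mm).
reference = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0), (0.0, 0.0, 10.0)]
measured  = [(0.3, 0.0, 0.0), (10.0, 0.4, 0.0), (0.0, 10.0, 0.5), (0.6, 0.0, 10.0)]

errors = positional_errors(measured, reference)
mean, sd = mean_sd(errors)
print(f"accuracy: {mean:.2f} +/- {sd:.2f} mm")  # → accuracy: 0.45 +/- 0.13 mm
```

The ASTM standard additionally prescribes fixture geometry and sampling procedure; only the summary statistic is sketched here.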
Affiliation(s)
- David M. Arreola
- US Department of Veterans Affairs, Palo Alto Healthcare System, Palo Alto, CA, USA
- Zachary P. Veigulis
- US Department of Veterans Affairs, Palo Alto Healthcare System, Palo Alto, CA, USA
- Department of Business Analytics, Tippie College of Business, University of Iowa, Iowa City, IA, USA
- Thomas F. Osborne
- US Department of Veterans Affairs, Palo Alto Healthcare System, Palo Alto, CA, USA
- Department of Radiology, Stanford University School of Medicine, Stanford, CA, USA
4
Qi Z, Jin H, Wang Q, Gan Z, Xiong R, Zhang S, Liu M, Wang J, Ding X, Chen X, Zhang J, Nimsky C, Bopp MHA. The Feasibility and Accuracy of Holographic Navigation with Laser Crosshair Simulator Registration on a Mixed-Reality Display. Sensors (Basel) 2024; 24:896. [PMID: 38339612 PMCID: PMC10857152 DOI: 10.3390/s24030896]
Abstract
Addressing conventional neurosurgical navigation systems' high costs and complexity, this study explores the feasibility and accuracy of a simplified, cost-effective mixed reality navigation (MRN) system based on a laser crosshair simulator (LCS). A new automatic registration method was developed, featuring coplanar laser emitters and a recognizable target pattern. The workflow was integrated into Microsoft's HoloLens-2 for practical application. The study assessed the system's precision by utilizing life-sized 3D-printed head phantoms based on computed tomography (CT) or magnetic resonance imaging (MRI) data from 19 patients (female/male: 7/12, average age: 54.4 ± 18.5 years) with intracranial lesions. Six to seven CT/MRI-visible scalp markers were used as reference points per case. The LCS-MRN's accuracy was evaluated through landmark-based and lesion-based analyses, using metrics such as target registration error (TRE) and Dice similarity coefficient (DSC). The system demonstrated immersive capabilities for observing intracranial structures across all cases. Analysis of 124 landmarks showed a TRE of 3.0 ± 0.5 mm, consistent across various surgical positions. The DSC of 0.83 ± 0.12 correlated significantly with lesion volume (Spearman rho = 0.813, p < 0.001). Therefore, the LCS-MRN system is a viable tool for neurosurgical planning, highlighting its low user dependency, cost-efficiency, and accuracy, with prospects for future clinical application enhancements.
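The two metrics in this abstract are standard quantities and easy to state in code. The sketch below (hypothetical coordinates and toy voxel sets, not the study's data) computes a landmark TRE as the Euclidean distance between corresponding points and a DSC as the normalized overlap of two segmentations:

```python
import math

def target_registration_error(p_image, p_tracked):
    """TRE: Euclidean distance (mm) between a landmark's position in the
    reference image space and its registered position from tracking space."""
    return math.dist(p_image, p_tracked)

def dice_similarity(voxels_a, voxels_b):
    """DSC between two segmentations given as sets of voxel indices:
    2|A ∩ B| / (|A| + |B|); 1.0 is perfect overlap, 0.0 is none."""
    inter = len(voxels_a & voxels_b)
    return 2.0 * inter / (len(voxels_a) + len(voxels_b))

# Hypothetical values: one scalp marker and two small cubic lesion masks.
tre = target_registration_error((10.0, 20.0, 30.0), (11.0, 22.0, 32.0))
lesion_planned = {(x, y, z) for x in range(4) for y in range(4) for z in range(4)}
lesion_hologram = {(x, y, z) for x in range(1, 5) for y in range(4) for z in range(4)}
dsc = dice_similarity(lesion_planned, lesion_hologram)
print(f"TRE = {tre:.2f} mm, DSC = {dsc:.3f}")  # → TRE = 3.00 mm, DSC = 0.750
```

In the study these would be aggregated over all landmarks and cases; the per-landmark and per-lesion definitions are as shown.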
Affiliation(s)
- Ziyu Qi
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Haitao Jin
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- NCO School, Army Medical University, Shijiazhuang 050081, China
- Qun Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Zhichao Gan
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Ruochu Xiong
- Department of Neurosurgery, Division of Medicine, Graduate School of Medical Sciences, Kanazawa University, Takara-machi 13-1, Kanazawa 920-8641, Japan
- Shiyu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Minghang Liu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Jingyue Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Xinyu Ding
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Xiaolei Chen
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Jiashu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Miriam H. A. Bopp
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
5
Zhao X, Zhao H, Zheng W, Gohritz A, Shen Y, Xu W. Clinical evaluation of augmented reality-based 3D navigation system for brachial plexus tumor surgery. World J Surg Oncol 2024; 22:20. [PMID: 38233922 PMCID: PMC10792838 DOI: 10.1186/s12957-023-03288-z]
Abstract
BACKGROUND Augmented reality (AR), a form of 3D imaging technology, has been preliminarily applied in tumor surgery of the head and spine, both of which are rigid bodies. However, research evaluating the clinical value of AR in tumor surgery of the brachial plexus, a non-rigid body whose anatomical position varies with patient posture, is lacking. METHODS Prior to surgery in 8 patients diagnosed with brachial plexus tumors, conventional MRI scans were performed to obtain conventional 2D MRI images. The MRI data were then automatically differentiated and converted into AR-based 3D models. After point-to-point relocation and registration, the 3D models were projected onto the patient's body using a head-mounted display for navigation. To evaluate the clinical value of the AR-based 3D models against the conventional 2D MRI images, 2 senior hand surgeons completed questionnaires on the evaluation of anatomical structures (tumor, arteries, veins, nerves, bones, and muscles), rated from 1 (strongly disagree) to 5 (strongly agree). RESULTS Surgeons rated the AR-based 3D models as superior to conventional MRI images for all anatomical structures, including tumors. Furthermore, the AR-based 3D models were preferred for preoperative planning and intraoperative navigation, demonstrating their added value. The mean positional error between the 3D models and intraoperative findings was approximately 1 cm. CONCLUSIONS This study evaluated, for the first time, the clinical value of an AR-based 3D navigation system in preoperative planning and intraoperative navigation for brachial plexus tumor surgery. By providing more direct spatial visualization than conventional 2D MRI images, this 3D navigation system improved the clinical accuracy and safety of tumor surgery in non-rigid bodies.
Affiliation(s)
- Xuanyu Zhao
- Department of Hand and Upper Extremity Surgery, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Department of Hand Surgery, Huashan Hospital, Fudan University, Shanghai, China
- Huali Zhao
- Department of Radiology, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Wanling Zheng
- Department of Hand and Upper Extremity Surgery, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Department of Hand Surgery, Huashan Hospital, Fudan University, Shanghai, China
- Andreas Gohritz
- Department of Plastic, Reconstructive, Aesthetic and Hand Surgery, University Hospital Basel, University of Basel, Basel, Switzerland
- Yundong Shen
- Department of Hand and Upper Extremity Surgery, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Department of Hand Surgery, Huashan Hospital, Fudan University, Shanghai, China
- The National Clinical Research Center for Aging and Medicine, Fudan University, Shanghai, China
- Wendong Xu
- Department of Hand and Upper Extremity Surgery, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Department of Hand Surgery, Huashan Hospital, Fudan University, Shanghai, China
- The National Clinical Research Center for Aging and Medicine, Fudan University, Shanghai, China
- Institute of Brain Science, State Key Laboratory of Medical Neurobiology and Collaborative Innovation Center for Brain Science, Fudan University, Shanghai, China
- Research Unit of Synergistic Reconstruction of Upper and Lower Limbs after Brain Injury, Chinese Academy of Medical Sciences, Beijing, China
6
Kos TM, Colombo E, Bartels LW, Robe PA, van Doormaal TPC. Evaluation Metrics for Augmented Reality in Neurosurgical Preoperative Planning, Surgical Navigation, and Surgical Treatment Guidance: A Systematic Review. Oper Neurosurg (Hagerstown) 2023; 26:01787389-990000000-01007. [PMID: 38146941 PMCID: PMC11008635 DOI: 10.1227/ons.0000000000001009]
Abstract
BACKGROUND AND OBJECTIVE Recent years have seen advances in the development of augmented reality (AR) technologies for preoperative visualization, surgical navigation, and intraoperative guidance in neurosurgery. However, proving the added value of AR in clinical practice is challenging, partly because of a lack of standardized evaluation metrics. We performed a systematic review to provide an overview of the reported evaluation metrics for AR technologies in neurosurgical practice and to establish a foundation for the assessment and comparison of such technologies. METHODS PubMed, Embase, and Cochrane were systematically searched on September 22, 2022, for publications on the assessment of AR for cranial neurosurgery. The findings were reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. RESULTS The systematic search yielded 830 publications; 114 were screened full text, and 80 were included for analysis. Among the included studies, 5% dealt with preoperative visualization using AR, with user perception as the most frequently reported metric. The majority (75%) researched AR technology for surgical navigation, with registration accuracy, clinical outcome, and time measurements as the most frequently reported metrics. The remaining 20% studied the use of AR for intraoperative guidance, with registration accuracy, task outcome, and user perception as the most frequently reported metrics. CONCLUSION For quality benchmarking of AR technologies in neurosurgery, evaluation metrics should be specific to the risk profile and clinical objectives of the technology. Key priorities are using validated questionnaires to assess user perception; ensuring clear and unambiguous reporting of registration accuracy, precision, robustness, and system stability; and accurately measuring task performance in clinical studies. We provide an overview suggesting which evaluation metrics to use per AR application and innovation phase, aiming to improve the assessment of the added value of AR for neurosurgical practice and to facilitate its integration into the clinical workflow.
Affiliation(s)
- Tessa M. Kos
- Image Sciences Institute, University Medical Center Utrecht, Utrecht, The Netherlands
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Elisa Colombo
- Department of Neurosurgery, Clinical Neuroscience Center, Universitätsspital Zürich, Zurich, Switzerland
- L. Wilbert Bartels
- Image Sciences Institute, University Medical Center Utrecht, Utrecht, The Netherlands
- Pierre A. Robe
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Tristan P. C. van Doormaal
- Department of Neurosurgery, Clinical Neuroscience Center, Universitätsspital Zürich, Zurich, Switzerland
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
7
Qi Z, Bopp MHA, Nimsky C, Chen X, Xu X, Wang Q, Gan Z, Zhang S, Wang J, Jin H, Zhang J. A Novel Registration Method for a Mixed Reality Navigation System Based on a Laser Crosshair Simulator: A Technical Note. Bioengineering (Basel) 2023; 10:1290. [PMID: 38002414 PMCID: PMC10669875 DOI: 10.3390/bioengineering10111290]
Abstract
Mixed Reality Navigation (MRN) is pivotal in augmented reality-assisted intelligent neurosurgical interventions. However, existing MRN registration methods face challenges in concurrently achieving low user dependency, high accuracy, and clinical applicability. This study proposes a novel registration method based on a laser crosshair simulator, designed to replicate the scanner frame's position on the patient, and evaluates its feasibility and accuracy. The system autonomously calculates the transformation mapping coordinates from the tracking space to the reference image space. A mathematical model and workflow for registration were designed, and a Universal Windows Platform (UWP) application was developed on HoloLens-2. Finally, a head phantom was used to measure the system's target registration error (TRE). The proposed method was successfully implemented, obviating the need for user interactions with virtual objects during the registration process. Regarding accuracy, the average deviation was 3.7 ± 1.7 mm. The method shows encouraging results in efficiency and intuitiveness and marks a valuable advancement toward low-cost, easy-to-use MRN systems. Its potential for enhancing accuracy and adaptability in interventional procedures makes it promising for improving surgical outcomes.
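The paper's registration pipeline itself is not reproduced here, but its core idea, mapping a point measured in the headset's tracking space into the reference image space through a rigid homogeneous transform, can be sketched. The pose values below are hypothetical stand-ins for the transform the simulator would compute:

```python
import math

def mat_vec(T, p):
    """Apply a 4x4 homogeneous transform T to a 3-D point p."""
    x, y, z = p
    v = (x, y, z, 1.0)
    return tuple(sum(T[i][j] * v[j] for j in range(4)) for i in range(3))

def rot_z(deg, t=(0.0, 0.0, 0.0)):
    """Homogeneous transform: rotation about z by `deg` degrees, then translation t."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c,  -s,  0.0, t[0]],
            [s,   c,  0.0, t[1]],
            [0.0, 0.0, 1.0, t[2]],
            [0.0, 0.0, 0.0, 1.0]]

# Hypothetical pose: the transform from tracking space to image space would in
# practice be derived from the laser crosshair simulator's detected position.
T_track_to_image = rot_z(90.0, t=(5.0, -2.0, 10.0))
p_tracking = (1.0, 0.0, 0.0)              # a point seen by the headset tracker
p_image = mat_vec(T_track_to_image, p_tracking)
print(p_image)                            # ≈ (5.0, -1.0, 10.0), in image space
```

The TRE reported in the abstract is then the distance between such mapped points and their known positions in the reference image.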
Affiliation(s)
- Ziyu Qi
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Miriam H. A. Bopp
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Xiaolei Chen
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Xinghua Xu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Qun Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Zhichao Gan
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Shiyu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Jingyue Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Haitao Jin
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- NCO School, Army Medical University, Shijiazhuang 050081, China
- Jiashu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
8
Chiou SY, Liu LS, Lee CW, Kim DH, Al-Masni MA, Liu HL, Wei KC, Yan JL, Chen PY. Augmented Reality Surgical Navigation System Integrated with Deep Learning. Bioengineering (Basel) 2023; 10:617. [PMID: 37237687 DOI: 10.3390/bioengineering10050617]
Abstract
Most current surgical navigation methods rely on optical navigators with images displayed on an external screen. However, minimizing distractions during surgery is critical, and the spatial information displayed in this arrangement is non-intuitive. Previous studies have proposed combining optical navigation systems with augmented reality (AR) to provide surgeons with intuitive imaging during surgery through planar and three-dimensional imagery. However, these studies have mainly focused on visual aids and have paid relatively little attention to real surgical guidance aids. Moreover, the use of augmented reality reduces system stability and accuracy, and optical navigation systems are costly. Therefore, this paper proposes an augmented reality surgical navigation system based on image positioning that achieves the desired system advantages with low cost, high stability, and high accuracy. The system also provides intuitive guidance for the surgical target point, entry point, and trajectory. Once the surgeon uses the navigation stick to indicate the position of the surgical entry point, the connection between the surgical target and the surgical entry point is immediately displayed on the AR device (tablet or HoloLens glasses), and a dynamic auxiliary line is shown to assist with incision angle and depth. Clinical trials were conducted for extraventricular drainage (EVD) surgery, and surgeons confirmed the system's overall benefit. A "virtual object automatic scanning" method is proposed to achieve a high accuracy of 1 ± 0.1 mm for the AR-based system. Furthermore, a deep learning-based U-Net segmentation network is incorporated to enable automatic identification of the hydrocephalus location by the system. The system achieves recognition accuracy, sensitivity, and specificity of 99.93%, 93.85%, and 95.73%, respectively, a significant improvement over previous studies.
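The recognition accuracy, sensitivity, and specificity reported for the U-Net segmentation are standard confusion-matrix quantities. A minimal sketch on a toy binary mask (hypothetical data, not the paper's):

```python
def segmentation_metrics(pred, truth):
    """Accuracy, sensitivity, specificity from flat binary masks (0/1 lists)."""
    tp = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 1)
    tn = sum(1 for p, t in zip(pred, truth) if p == 0 and t == 0)
    fp = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 0)
    fn = sum(1 for p, t in zip(pred, truth) if p == 0 and t == 1)
    accuracy = (tp + tn) / len(truth)
    sensitivity = tp / (tp + fn)      # true-positive rate (lesion found)
    specificity = tn / (tn + fp)      # true-negative rate (background kept clean)
    return accuracy, sensitivity, specificity

# Hypothetical 10-voxel example: 4 lesion voxels in truth, one missed, one false alarm.
truth = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
pred  = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]
acc, sens, spec = segmentation_metrics(pred, truth)
print(acc, sens, spec)  # → 0.8 0.75 0.8333...
```

In practice the same counts would be accumulated over full image volumes rather than a flat list.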
Affiliation(s)
- Shin-Yan Chiou
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Department of Nuclear Medicine, Linkou Chang Gung Memorial Hospital, Taoyuan 333, Taiwan
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
- Li-Sheng Liu
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Department of Electrical and Electronic Engineering, College of Engineering, Yonsei University, Seodaemun-gu, Seoul 03722, Republic of Korea
- Chia-Wei Lee
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Dong-Hyun Kim
- Department of Electrical and Electronic Engineering, College of Engineering, Yonsei University, Seodaemun-gu, Seoul 03722, Republic of Korea
- Mohammed A Al-Masni
- Department of Artificial Intelligence, College of Software & Convergence Technology, Daeyang AI Center, Sejong University, Seoul 05006, Republic of Korea
- Hao-Li Liu
- Department of Electrical Engineering, National Taiwan University, Taipei 106, Taiwan
- Kuo-Chen Wei
- New Taipei City Tucheng Hospital, Tao-Yuan, Tucheng, New Taipei City 236, Taiwan
- Jiun-Lin Yan
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
- Pin-Yuan Chen
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
9
Ruggiero F, Cercenelli L, Emiliani N, Badiali G, Bevini M, Zucchelli M, Marcelli E, Tarsitano A. Preclinical Application of Augmented Reality in Pediatric Craniofacial Surgery: An Accuracy Study. J Clin Med 2023; 12:2693. [PMID: 37048777 PMCID: PMC10095377 DOI: 10.3390/jcm12072693]
Abstract
Background: Augmented reality (AR) allows the overlapping and integration of virtual information with the real environment. The camera of the AR device reads the object and integrates the virtual data. It has been widely applied to medical and surgical sciences in recent years and has the potential to enhance intraoperative navigation. Materials and methods: In this study, the authors aim to assess the accuracy of AR guidance when using the commercial HoloLens 2 head-mounted display (HMD) in pediatric craniofacial surgery. The authors selected fronto-orbital remodeling (FOR) as the procedure to test (specifically, frontal osteotomy and nasal osteotomy were considered). Six people (three surgeons and three engineers) were recruited to perform the osteotomies on a 3D-printed stereolithographic model under the guidance of AR. By means of calibrated CAD/CAM cutting guides with different grooves, the authors measured the accuracy of the osteotomies that were performed. We tested accuracy levels of ±1.5 mm, ±1 mm, and ±0.5 mm. Results: With the HoloLens 2, the majority of the individuals involved were able to successfully trace the trajectories of the frontal and nasal osteotomies with an accuracy level of ±1.5 mm. Additionally, 80% were able to achieve an accuracy level of ±1 mm when performing a nasal osteotomy, and 52% were able to achieve an accuracy level of ±1 mm when performing a frontal osteotomy, while 61% were able to achieve an accuracy level of ±0.5 mm when performing a nasal osteotomy, and 33% were able to achieve an accuracy level of ±0.5 mm when performing a frontal osteotomy. Conclusions: Despite this being an in vitro study, the authors reported encouraging results for the prospective use of AR on actual patients.
Affiliation(s)
- Federica Ruggiero
- Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Maxillo-Facial Surgery Unit, AUSL Bologna, 40124 Bologna, Italy
| | - Laura Cercenelli
- Laboratory of Bioengineering—eDIMES Lab, Department of Medical and Surgical Sciences (DIMEC), University of Bologna, 40138 Bologna, Italy
| | - Nicolas Emiliani
- Laboratory of Bioengineering—eDIMES Lab, Department of Medical and Surgical Sciences (DIMEC), University of Bologna, 40138 Bologna, Italy
| | - Giovanni Badiali
- Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
| | - Mirko Bevini
- Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
| | - Mino Zucchelli
- Pediatric Neurosurgery, IRCCS Istituto delle Scienze Neurologiche di Bologna, Via Altura 3, 40138 Bologna, Italy
| | - Emanuela Marcelli
- Laboratory of Bioengineering—eDIMES Lab, Department of Medical and Surgical Sciences (DIMEC), University of Bologna, 40138 Bologna, Italy
| | - Achille Tarsitano
- Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
| |
|
10
|
Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023; 85:102757. [PMID: 36706637 DOI: 10.1016/j.media.2023.102757] [Citation(s) in RCA: 20] [Impact Index Per Article: 20.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/22/2022] [Revised: 01/05/2023] [Accepted: 01/18/2023] [Indexed: 01/22/2023]
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main player in the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016, until the year of 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy including use case, technical methodology for registration and tracking, data sources, visualization as well as validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, where AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interactions, usability and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, nevertheless only few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, to pave the way for novel, innovative directions and translation into the medical routine.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria.
| | - Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
| | - Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
| | - Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
| | - Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
| | - Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
| | - Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
| |
|
11
|
Baashar Y, Alkawsi G, Wan Ahmad WN, Alomari MA, Alhussian H, Tiong SK. Towards Wearable Augmented Reality in Healthcare: A Comparative Survey and Analysis of Head-Mounted Displays. INTERNATIONAL JOURNAL OF ENVIRONMENTAL RESEARCH AND PUBLIC HEALTH 2023; 20:3940. [PMID: 36900951 PMCID: PMC10002206 DOI: 10.3390/ijerph20053940] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 01/30/2023] [Revised: 02/16/2023] [Accepted: 02/19/2023] [Indexed: 06/18/2023]
Abstract
Head-mounted displays (HMDs) have the potential to greatly impact the surgical field by maintaining sterile conditions in healthcare environments. Google Glass (GG) and Microsoft HoloLens (MH) are examples of optical HMDs. In this comparative survey related to wearable augmented reality (AR) technology in the medical field, we examine the current developments in wearable AR technology, as well as the medical aspects, with a specific emphasis on smart glasses and HoloLens. The authors searched recent articles (between 2017 and 2022) in the PubMed, Web of Science, Scopus, and ScienceDirect databases and a total of 37 relevant studies were considered for this analysis. The selected studies were divided into two main groups; 15 of the studies (around 41%) focused on smart glasses (e.g., Google Glass) and 22 (59%) focused on Microsoft HoloLens. Google Glass was used in various surgical specialities and preoperative settings, namely dermatology visits and nursing skill training. Moreover, Microsoft HoloLens was used in telepresence applications and holographic navigation of shoulder and gait impairment rehabilitation, among others. However, some limitations were associated with their use, such as low battery life, limited memory size, and possible ocular pain. Promising results were obtained by different studies regarding the feasibility, usability, and acceptability of using both Google Glass and Microsoft HoloLens in patient-centric settings as well as medical education and training. Further work and development of rigorous research designs are required to evaluate the efficacy and cost-effectiveness of wearable AR devices in the future.
Affiliation(s)
- Yahia Baashar
- Faculty of Computing and Informatics, Universiti Malaysia Sabah (UMS), Labuan 87000, Malaysia
| | - Gamal Alkawsi
- Institute of Sustainable Energy (ISE), Universiti Tenaga Nasional, Kajang 43000, Malaysia
- Faculty of Computer Science and Information Systems, Thamar University, Thamar 87246, Yemen
| | | | - Mohammad Ahmed Alomari
- Institute of Informatics and Computing in Energy, Universiti Tenaga Nasional (UNITEN), Kajang 43000, Malaysia
| | - Hitham Alhussian
- Department of Computer and Information Sciences, Universiti Teknologi PETRONAS, Seri Iskandar 32610, Malaysia
| | - Sieh Kiong Tiong
- Institute of Sustainable Energy (ISE), Universiti Tenaga Nasional, Kajang 43000, Malaysia
| |
|
12
|
Satoh M, Nakajima T, Watanabe E, Kawai K. Augmented Reality in Stereotactic Neurosurgery: Current Status and Issues. Neurol Med Chir (Tokyo) 2023; 63:137-140. [PMID: 36682793 PMCID: PMC10166603 DOI: 10.2176/jns-nmc.2022-0278] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/20/2023] Open
Abstract
Stereotactic neurosurgery is an established technique, but it has several limitations. In frame-based stereotaxy using a stereotactic frame, frame setting errors may decrease the accuracy of the procedure. Frameless stereotaxy using neuronavigation requires surgeons to shift their view from the surgical field to the navigation display and to advance the needle while assuming a physically uncomfortable position. To overcome these limitations, several researchers have applied augmented reality in stereotactic neurosurgery. Augmented reality enables surgeons to visualize the information regarding the target and preplanned trajectory superimposed over the actual surgical field. In frame-based stereotaxy, a researcher applies tablet computer-based augmented reality to check for the setting errors of the stereotactic frame, thereby improving the safety of the procedure. Several researchers have reported performing frameless stereotaxy guided by head-mounted-display-based augmented reality that enables surgeons to advance the needle at a more natural posture. These studies have shown that augmented reality can address the limitations of stereotactic neurosurgery. Conversely, they have also revealed the limited accuracy of current augmented reality systems for small targets, which indicates that further development of augmented reality systems is needed.
Affiliation(s)
- Makoto Satoh
- Department of Neurosurgery, Jichi Medical University
| | | | - Eiju Watanabe
- Department of Neurosurgery, Jichi Medical University
| | - Kensuke Kawai
- Department of Neurosurgery, Jichi Medical University
| |
|
13
|
Zary N, Eysenbach G, Van Doormaal TPC, Ruurda JP, Van der Kaaij NP, De Heer LM. Mixed Reality in Modern Surgical and Interventional Practice: Narrative Review of the Literature. JMIR Serious Games 2023; 11:e41297. [PMID: 36607711 PMCID: PMC9947976 DOI: 10.2196/41297] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/21/2022] [Revised: 10/17/2022] [Accepted: 10/31/2022] [Indexed: 11/07/2022] Open
Abstract
BACKGROUND Mixed reality (MR) and its potential applications have gained increasing interest within the medical community in recent years. The ability to integrate virtual objects into a real-world environment within a single video-see-through display is a topic that sparks imagination. Given these characteristics, MR could facilitate preoperative and preinterventional planning, provide intraoperative and intrainterventional guidance, and aid in education and training, thereby improving the skills and merits of surgeons and residents alike. OBJECTIVE In this narrative review, we provide a broad overview of the different applications of MR within the entire spectrum of surgical and interventional practice and elucidate potential future directions. METHODS A targeted literature search within the PubMed, Embase, and Cochrane databases was performed regarding the application of MR within surgical and interventional practice. Studies were included if they met the criteria for technological readiness level 5, and as such, had to be validated in a relevant environment. RESULTS A total of 57 studies were included and divided into studies regarding preoperative and interventional planning, intraoperative and interventional guidance, as well as training and education. CONCLUSIONS The overall experience with MR is positive. The main benefits of MR seem to be related to improved efficiency. Limitations primarily seem to be related to constraints associated with the head-mounted display. Future directions should be aimed at improving head-mounted display technology, incorporating MR within surgical microscopes and robots, and designing trials to prove superiority.
Affiliation(s)
| | | | - Tristan P C Van Doormaal
- University Medical Center Utrecht, Utrecht, Netherlands; University Hospital Zurich, Zurich, Switzerland
| | | | | | | |
|
14
|
Chegini S, Edwards E, McGurk M, Clarkson M, Schilling C. Systematic review of techniques used to validate the registration of augmented-reality images using a head-mounted device to navigate surgery. Br J Oral Maxillofac Surg 2023; 61:19-27. [PMID: 36513525 DOI: 10.1016/j.bjoms.2022.08.007] [Citation(s) in RCA: 4] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/13/2021] [Revised: 07/31/2022] [Accepted: 08/17/2022] [Indexed: 12/14/2022]
Abstract
Augmented-reality (AR) head-mounted devices (HMD) allow the wearer to have digital images superposed on to their field of vision. They are being used to superpose annotations on to the surgical field akin to a navigation system. This review examines published validation studies on HMD-AR systems, their reported protocols, and outcomes. The aim was to establish commonalities and an acceptable registration outcome. Multiple databases were systematically searched for relevant articles between January 2015 and January 2021. Studies that examined the registration of AR content using a HMD to guide surgery were eligible for inclusion. The country of origin, year of publication, medical specialty, HMD device, software, and method of registration, were recorded. A meta-analysis of the mean registration error was conducted. A total of 4784 papers were identified, of which 23 met the inclusion criteria. They included studies using HoloLens (Microsoft) (n = 22) and nVisor ST60 (NVIS Inc) (n = 1). Sixty-six per cent of studies were in hard tissue specialties. Eleven studies reported registration errors using pattern markers (mean (SD) 2.6 (1.8) mm), and four reported registration errors using surface markers (mean (SD) 3.8 (3.7) mm). Three studies reported registration errors using manual alignment (mean (SD) 2.2 (1.3) mm). The majority of studies in this review used in-house software with a variety of registration methods and reported errors. The mean registration error calculated in this study can be considered as a minimum acceptable standard. It should be taken into consideration when procedural applications are selected.
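Across the studies pooled here, a registration error is typically computed as the mean Euclidean distance between corresponding landmarks in the real scene and their AR overlay. A minimal sketch of that computation (the millimetre coordinates below are invented for illustration):

```python
import math

# Hypothetical paired landmarks, in mm: where each fiducial actually is
# versus where the head-mounted display renders its virtual counterpart.
real_points = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0)]
overlay_points = [(3.0, 4.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 5.0)]

def mean_registration_error_mm(real, overlay):
    """Mean Euclidean distance over corresponding landmark pairs."""
    dists = [math.dist(r, v) for r, v in zip(real, overlay)]
    return sum(dists) / len(dists)

print(round(mean_registration_error_mm(real_points, overlay_points), 2))  # 3.33
```

Such a per-landmark residual is what a validation protocol would report and what the pooled means above (roughly 2.2 to 3.8 mm depending on registration method) aggregate.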
Affiliation(s)
- Soudeh Chegini
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom.
| | - Eddie Edwards
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
| | - Mark McGurk
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
| | - Matthew Clarkson
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
| | - Clare Schilling
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
| |
|
15
|
Zhou Z, Yang Z, Jiang S, Zhuo J, Zhu T, Ma S. Surgical Navigation System for Hypertensive Intracerebral Hemorrhage Based on Mixed Reality. J Digit Imaging 2022; 35:1530-1543. [PMID: 35819536 PMCID: PMC9712880 DOI: 10.1007/s10278-022-00676-x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/15/2021] [Revised: 06/24/2022] [Accepted: 06/28/2022] [Indexed: 10/17/2022] Open
Abstract
Hypertensive intracerebral hemorrhage (HICH) is an intracerebral bleeding disease that affects 2.5 per 10,000 people worldwide each year. An effective way to treat this disease is puncture through the dura with a brain puncture drill and tube; the accuracy of the insertion determines the quality of the surgery. In recent decades, surgical navigation systems have been widely used to improve the accuracy of surgery and minimize risks. Augmented reality- and mixed reality-based surgical navigation is a promising new technology for surgical navigation in the clinic, aiming to improve the safety and accuracy of the operation. In this study, we present a novel multimodal mixed reality navigation system for HICH surgery in which medical images and virtual anatomical structures can be aligned intraoperatively with the actual structures of the patient in a head-mounted device and adjusted when the patient moves in real time while under local anesthesia; this approach can help the surgeon intuitively perform intraoperative navigation. A novel registration method is used to register the holographic space and serves as an intraoperative optical tracker, and a method for calibrating the HICH surgical tools is used to track the tools in real time. The results of phantom experiments revealed a mean registration error of 1.03 mm and an average time consumption of 12.9 min. In clinical usage, the registration error was 1.94 mm, and the time consumption was 14.2 min, showing that this system is sufficiently accurate and effective for clinical application.
Affiliation(s)
- Zeyang Zhou
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
| | - Zhiyong Yang
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
| | - Shan Jiang
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China.
| | - Jie Zhuo
- Department of Neurosurgery, Huanhu Hospital, Tianjin, 300350, China.
| | - Tao Zhu
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
| | - Shixing Ma
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
| |
|
16
|
Meulstee J, Bussink T, Delye H, Xi T, Borstlap W, Maal T. Surgical guides versus augmented reality to transfer a virtual surgical plan for open cranial vault reconstruction: A pilot study. ADVANCES IN ORAL AND MAXILLOFACIAL SURGERY 2022. [DOI: 10.1016/j.adoms.2022.100334] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022] Open
|
17
|
Multicenter assessment of augmented reality registration methods for image-guided interventions. Radiol Med 2022; 127:857-865. [DOI: 10.1007/s11547-022-01515-3] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/06/2021] [Accepted: 06/13/2022] [Indexed: 10/17/2022]
|
18
|
Zhang R, Xu Z, Zhang L, Cao L, Hu Y, Lu B, Shi L, Yao D, Zhao X. The effect of stimulus number on the recognition accuracy and information transfer rate of SSVEP-BCI in augmented reality. J Neural Eng 2022; 19. [PMID: 35477130 DOI: 10.1088/1741-2552/ac6ae5] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/25/2021] [Accepted: 04/26/2022] [Indexed: 11/12/2022]
Abstract
OBJECTIVE The biggest advantage of steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI) lies in its large command set and high information transfer rate (ITR). Almost all current SSVEP-BCIs use a computer screen (CS) to present flickering visual stimuli, which limits their flexible use in actual scenes. Augmented reality (AR) technology provides the ability to superimpose visual stimuli on the real world, and it considerably expands the application scenarios of SSVEP-BCI. However, whether the advantages of SSVEP-BCI can be maintained when moving the visual stimuli to AR glasses is not known. This study investigated the effects of the stimulus number for SSVEP-BCI in an AR context. APPROACH We designed SSVEP flickering stimulation interfaces with four different numbers of stimulus targets and put them in AR glasses and a CS to display. Three common recognition algorithms were used to analyze the influence of the stimulus number and stimulation time on the recognition accuracy and ITR of AR-SSVEP and CS-SSVEP. MAIN RESULTS The amplitude spectrum and signal-to-noise ratio of AR-SSVEP were not significantly different from CS-SSVEP at the fundamental frequency but were significantly lower than CS-SSVEP at the second harmonic. SSVEP recognition accuracy decreased as the stimulus number increased in AR-SSVEP but not in CS-SSVEP. When the stimulus number increased, the maximum ITR of CS-SSVEP also increased, but not for AR-SSVEP. When the stimulus number was 25, the maximum ITR (142.05 bits/min) was reached at 400 ms. The importance of stimulation time in SSVEP was confirmed. When the stimulation time became longer, the recognition accuracy of both AR-SSVEP and CS-SSVEP increased. The peak value was reached at 3 s. The ITR increased first and then slowly decreased after reaching the peak value. SIGNIFICANCE Our study indicates that the conclusions based on CS-SSVEP cannot be simply applied to AR-SSVEP, and it is not advisable to set too many stimulus targets in the AR display device.
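ITR figures like those quoted above are conventionally computed with the Wolpaw formula, which trades off the number of targets N, the classification accuracy P, and the time per selection T. A minimal sketch under that assumption (function and parameter names are mine):

```python
import math

def itr_bits_per_min(n_targets: int, accuracy: float, t_select_s: float) -> float:
    """Wolpaw information transfer rate for an n-target BCI.

    accuracy is the classification accuracy P; t_select_s is the time per
    selection in seconds (stimulation time plus any gaze-shift interval).
    """
    n, p = n_targets, accuracy
    if p <= 1.0 / n:
        return 0.0  # at or below chance level: no information transferred
    bits = math.log2(n)
    if p < 1.0:  # with perfect accuracy each selection carries log2(n) bits
        bits += p * math.log2(p) + (1.0 - p) * math.log2((1.0 - p) / (n - 1))
    return bits * (60.0 / t_select_s)

# A perfect 2-target speller at one selection per second yields 60 bits/min.
print(itr_bits_per_min(2, 1.0, 1.0))  # 60.0
```

This makes the study's finding concrete: raising the stimulus number increases the log2(N) term, but if recognition accuracy falls with it, as observed for AR-SSVEP, the accuracy penalty can cancel the gain.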
Affiliation(s)
- Rui Zhang
- School of Electrical Engineering, Zhengzhou University, Zhengzhou 450001, China
| | - Zongxin Xu
- School of Electrical Engineering, Zhengzhou University, Zhengzhou 450001, China
| | - Lipeng Zhang
- Zhengzhou University, Zhengzhou 450001, China
| | - Lijun Cao
- Zhengzhou University, Zhengzhou 450000, China
| | - Yuxia Hu
- Zhengzhou University, Zhengzhou 450001, China
| | - Beihan Lu
- Zhengzhou University, Zhengzhou 450001, China
| | - Li Shi
- Department of Automation, Tsinghua University, Beijing 100084, China
| | - Dezhong Yao
- School of Life Science and Technology, University of Electronic Science and Technology of China, Chengdu 610054, Sichuan Province, China
| | - Xincan Zhao
- Zhengzhou University, Zhengzhou 450001, China
| |
|
19
|
Zhou Z, Yang Z, Jiang S, Zhuo J, Zhu T, Ma S. Augmented reality surgical navigation system based on the spatial drift compensation method for glioma resection surgery. Med Phys 2022; 49:3963-3979. [PMID: 35383964 DOI: 10.1002/mp.15650] [Citation(s) in RCA: 7] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/25/2021] [Revised: 03/11/2022] [Accepted: 03/28/2022] [Indexed: 11/12/2022] Open
Abstract
BACKGROUND The number of patients who suffer from glioma has been increasing, and this malignancy is a serious threat to human health. The mainstream treatment for glioma is surgical resection; therefore, accurate resection can improve postoperative patient recovery. PURPOSE Many studies have investigated surgical navigation guided by mixed reality, with good outcomes. However, the limitations of mixed reality, such as spatial drift caused by environmental changes, limit its clinical application. Therefore, we present a mixed reality surgical navigation system for glioma resection. Preoperative information can be fused precisely with the real patient with the spatial compensation method to achieve clinically suitable accuracy. METHODS A head-mounted device was used to display virtual information, and a markerless spatial registration method was applied to precisely align the virtual anatomy with the real patient preoperatively. High-accuracy preoperative and intraoperative movement and spatial drift compensation methods were used to increase the positional accuracy of the mixed reality-guided glioma resection system when the patient's head is fixed to the bed frame. Several experiments were designed to validate the accuracy and efficacy of this system. RESULTS Phantom experiments were performed to test the efficacy and accuracy of this system under ideal conditions, and clinical tests were conducted to assess the performance of this system in clinical application. The accuracy of spatial registration was 1.18 mm in the phantom experiments and 1.86 mm in the clinical application. CONCLUSIONS Herein, we present a mixed reality-based multimodality fused surgical navigation system for assisting surgeons in intuitively identifying the glioma boundary intraoperatively. The experimental results indicate that this system has suitable accuracy and efficacy for clinical usage.
Affiliation(s)
- Zeyang Zhou
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
| | - Zhiyong Yang
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
| | - Shan Jiang
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China; Centre for Advanced Mechanisms and Robotics, Tianjin University, Tianjin, 300350, China
| | - Jie Zhuo
- Department of Neurosurgery, Tianjin Huanhu hospital, Tianjin, 300200, China
| | - Tao Zhu
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
| | - Shixing Ma
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
| |
|
20
|
Real-time augmented reality application in presurgical planning and lesion scalp localization by a smartphone. Acta Neurochir (Wien) 2022; 164:1069-1078. [PMID: 34448914 DOI: 10.1007/s00701-021-04968-z] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/19/2021] [Accepted: 08/08/2021] [Indexed: 10/20/2022]
Abstract
OBJECTIVE A smartphone augmented reality (AR) application (app) was explored for clinical use in presurgical planning and lesion scalp localization. METHODS We programmed an AR app on a smartphone. The accuracy of the AR app was tested on a 3D-printed head model, using the Euclidean distance of displacement of virtual objects. For clinical validation, 14 patients with brain tumors were included in the study. Preoperative MRI images were used to generate 3D models for AR contents. The 3D models were then transferred to the smartphone AR app. Tumor scalp localization was marked, and a surgical corridor was planned on the patient's head by viewing AR images on the smartphone screen. Standard neuronavigation was applied to evaluate the accuracy of the smartphone. Max-margin distance (MMD) and area overlap ratio (AOR) were measured to quantitatively validate the clinical accuracy of the smartphone AR technique. RESULTS In model validation, the total mean Euclidean distance of virtual object displacement using the smartphone AR app was 4.7 ± 2.3 mm. In clinical validation, the mean duration of AR app usage was 168.5 ± 73.9 s. The total mean MMD was 6.7 ± 3.7 mm, and the total mean AOR was 79%. CONCLUSIONS The smartphone AR app provides a new way to observe intracranial anatomy in situ, and it makes surgical planning more intuitive and efficient. Localization accuracy is satisfactory for lesions larger than 15 mm.
|
21
|
Peng C, Yang L, Yi W, Yidan L, Yanglingxi W, Qingtao Z, Xiaoyong T, Tang Y, Jia W, Xing Y, Zhiqin Z, Yongbing D. Application of Fused Reality Holographic Image and Navigation Technology in the Puncture Treatment of Hypertensive Intracerebral Hemorrhage. Front Neurosci 2022; 16:850179. [PMID: 35360174 PMCID: PMC8963409 DOI: 10.3389/fnins.2022.850179] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/07/2022] [Accepted: 02/08/2022] [Indexed: 11/13/2022] Open
Abstract
Objective Minimally invasive puncture and drainage (MIPD) of hematomas is the preferred option for appropriate patients with hypertensive intracerebral hemorrhage (HICH). The goal of our research was to introduce MIPD surgery using mixed reality holographic navigation technology (MRHNT). Method We describe the complete workflow for hematoma puncture using MRHNT, which includes three-dimensional model reconstruction from the preoperative CT examination, puncture trajectory design, immersive presentation of the model within the real environment, and hematoma puncture using dual-plane navigation while wearing special equipment. We collected clinical data on eight patients with HICH who underwent MIPD using MRHNT from March 2021 to August 2021, including the hematoma evacuation rate, operation time, deviation of the drainage tube from its target, postoperative complications, and 2-week postoperative GCS. Result The hematoma puncture workflow using MRHNT was performed in all eight cases; the average hematoma evacuation rate was 47.36±9.16%, the average operation time was 82.14±15.74 min, and the average deviation of the drainage tube target was 5.76±0.80 mm. There was no delayed bleeding, acute ischemic stroke, intracranial infection, or epilepsy within 2 weeks after surgery. The 2-week postoperative GCS was improved compared with the preoperative GCS. Conclusion It is feasible to perform MIPD with MRHNT in patients with HICH. However, the risk of general anesthesia and the highly specialized holographic information processing restrict wider adoption of the technology; further technical innovation, the accumulation of more case experience, and verification of its superiority are needed.
Affiliation(s)
- Chen Peng
- Department of Neurosurgery, Chongqing Emergency Medical Center, Chongqing University Central Hospital, Chongqing, China
- Liu Yang
- Department of Neurosurgery, Chongqing Emergency Medical Center, Chongqing University Central Hospital, Chongqing, China
- Wang Yi
- QINYING Technology Co., Ltd., Chongqing, China
- Liang Yidan
- Department of Neurosurgery, Chongqing Emergency Medical Center, Chongqing University Central Hospital, Chongqing, China
- Wang Yanglingxi
- Department of Neurosurgery, Chongqing Emergency Medical Center, Chongqing University Central Hospital, Chongqing, China
- Zhang Qingtao
- Department of Neurosurgery, Chongqing Emergency Medical Center, Chongqing University Central Hospital, Chongqing, China
- Tang Xiaoyong
- Department of Neurosurgery, Chongqing Emergency Medical Center, Chongqing University Central Hospital, Chongqing, China
- Yongbing Tang
- Department of Neurosurgery, Chongqing Emergency Medical Center, Chongqing University Central Hospital, Chongqing, China
- Wang Jia
- Department of Neurosurgery, Chongqing Emergency Medical Center, Chongqing University Central Hospital, Chongqing, China
- Yu Xing
- Department of Neurosurgery, Chongqing Emergency Medical Center, Chongqing University Central Hospital, Chongqing, China
- Zhu Zhiqin
- College of Automation, Chongqing University of Posts and Telecommunications, Chongqing, China
- Deng Yongbing
- Department of Neurosurgery, Chongqing Emergency Medical Center, Chongqing University Central Hospital, Chongqing, China
- *Correspondence: Deng Yongbing
22. Visualization, navigation, augmentation. The ever-changing perspective of the neurosurgeon. Brain Spine 2022; 2:100926. [PMID: 36248169 PMCID: PMC9560703 DOI: 10.1016/j.bas.2022.100926] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Received: 06/06/2022] [Revised: 07/23/2022] [Accepted: 08/10/2022] [Indexed: 11/22/2022]
23. Satoh M, Nakajima T, Yamaguchi T, Watanabe E, Kawai K. Evaluation of augmented-reality based navigation for brain tumor surgery. J Clin Neurosci 2021; 94:305-314. [PMID: 34863455 DOI: 10.1016/j.jocn.2021.10.033] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Received: 06/09/2021] [Revised: 09/30/2021] [Accepted: 10/24/2021] [Indexed: 11/26/2022]
Abstract
To date, several researchers have introduced augmented reality navigation (ARN) into neurological surgery. While its application in brain tumor surgery seems promising, reports on its utility have been limited, thus warranting further evaluation. To clarify the stages and approaches in which ARN is useful and assess the effect of presurgical discussion with surgeons, we assessed usefulness using a hand-held ARN system we had developed, which displays three-dimensional (3D) virtual structures overlaid on a real-time image of the surgical field via a tablet PC monitor. The system was tested in 20 patients undergoing various procedures, with the first 10 consecutive cases being unselected and the following 10 cases being selected, for whom 3D models were prepared per the surgeons' request. Thereafter, the surgeons ranked its usefulness during each stage of surgery. Consequently, case selection and presurgical discussions with surgeons considerably improved the usefulness, with the "useful" gradings improving from 50% to 88% across all surgical stages. Accordingly, usefulness improved from 50% to 90%, 67% to 100%, and 40% to 80% during the skin incision and craniotomy, dura incision, and intradural procedure stages, respectively. ARN was useful for superficial tumor resection, but less so for deep-seated tumor resection, except when using the transcortical and interhemispheric approaches. In conclusion, a tablet-type ARN can be useful during skin incisions, craniotomy and dura incisions, superficial tumor resections, and transcortical and interhemispheric approaches for deep-seated tumors. Case selection and presurgical discussions with surgeons were essential for the efficacy of ARN.
Affiliation(s)
- Makoto Satoh
- Department of Neurosurgery, Jichi Medical University, Shimotsuke-City, Japan.
- Takeshi Nakajima
- Department of Neurosurgery, Jichi Medical University, Shimotsuke-City, Japan.
- Takashi Yamaguchi
- Department of Neurosurgery, Jichi Medical University, Shimotsuke-City, Japan.
- Eiju Watanabe
- Department of Neurosurgery, Jichi Medical University, Shimotsuke-City, Japan.
- Kensuke Kawai
- Department of Neurosurgery, Jichi Medical University, Shimotsuke-City, Japan.
24. Montemurro N, Condino S, Cattari N, D’Amato R, Ferrari V, Cutolo F. Augmented Reality-Assisted Craniotomy for Parasagittal and Convexity En Plaque Meningiomas and Custom-Made Cranio-Plasty: A Preliminary Laboratory Report. Int J Environ Res Public Health 2021; 18:ijerph18199955. [PMID: 34639256 PMCID: PMC8507881 DOI: 10.3390/ijerph18199955] [Citation(s) in RCA: 21] [Impact Index Per Article: 7.0] [Received: 08/18/2021] [Revised: 09/10/2021] [Accepted: 09/17/2021] [Indexed: 12/23/2022]
Abstract
BACKGROUND This report discusses the utility of a wearable augmented reality platform in neurosurgery for parasagittal and convexity en plaque meningiomas with bone flap removal and custom-made cranioplasty. METHODS A real patient with en plaque cranial vault meningioma with diffuse and extensive dural involvement, extracranial extension into the calvarium, and homogeneous contrast enhancement on gadolinium-enhanced T1-weighted MRI, was selected for this case study. A patient-specific manikin was designed starting with the segmentation of the patient's preoperative MRI images to simulate a craniotomy procedure. Surgical planning was performed according to the segmented anatomy, and customized bone flaps were designed accordingly. During the surgical simulation stage, the VOSTARS head-mounted display was used to accurately display the planned craniotomy trajectory over the manikin skull. The precision of the craniotomy was assessed based on the evaluation of previously prepared custom-made bone flaps. RESULTS A bone flap with a radius 0.5 mm smaller than the radius of an ideal craniotomy fitted perfectly over the performed craniotomy, demonstrating an error of less than ±1 mm in the task execution. The results of this laboratory-based experiment suggest that the proposed augmented reality platform helps in simulating convexity en plaque meningioma resection and custom-made cranioplasty, as carefully planned in the preoperative phase. CONCLUSIONS Augmented reality head-mounted displays have the potential to be a useful adjunct in tumor surgical resection, cranial vault lesion craniotomy and also skull base surgery, but more study with large series is needed.
Affiliation(s)
- Nicola Montemurro
- Department of Neurosurgery, Azienda Ospedaliera Universitaria Pisana (AOUP), University of Pisa, 56100 Pisa, Italy
- Correspondence:
- Sara Condino
- Department of Information Engineering, University of Pisa, 56100 Pisa, Italy
- EndoCAS Center for Computer-Assisted Surgery, 56100 Pisa, Italy
- Nadia Cattari
- EndoCAS Center for Computer-Assisted Surgery, 56100 Pisa, Italy
- Department of Translational Research, University of Pisa, 56100 Pisa, Italy
- Renzo D’Amato
- Department of Information Engineering, University of Pisa, 56100 Pisa, Italy
- EndoCAS Center for Computer-Assisted Surgery, 56100 Pisa, Italy
- Vincenzo Ferrari
- Department of Information Engineering, University of Pisa, 56100 Pisa, Italy
- EndoCAS Center for Computer-Assisted Surgery, 56100 Pisa, Italy
- Fabrizio Cutolo
- Department of Information Engineering, University of Pisa, 56100 Pisa, Italy
- EndoCAS Center for Computer-Assisted Surgery, 56100 Pisa, Italy
25. Fick T, van Doormaal JAM, Tosic L, van Zoest RJ, Meulstee JW, Hoving EW, van Doormaal TPC. Fully automatic brain tumor segmentation for 3D evaluation in augmented reality. Neurosurg Focus 2021; 51:E14. [PMID: 34333477 DOI: 10.3171/2021.5.focus21200] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Received: 03/31/2021] [Accepted: 05/18/2021] [Indexed: 11/06/2022]
Abstract
OBJECTIVE For currently available augmented reality workflows, 3D models need to be created with manual or semiautomatic segmentation, which is a time-consuming process. The authors created an automatic segmentation algorithm that generates 3D models of skin, brain, ventricles, and contrast-enhancing tumor from a single T1-weighted MR sequence and embedded this model into an automatic workflow for 3D evaluation of anatomical structures with augmented reality in a cloud environment. In this study, the authors validated the accuracy and efficiency of this automatic segmentation algorithm for brain tumors and compared it with a manually segmented ground truth set. METHODS Fifty contrast-enhanced T1-weighted sequences of patients with contrast-enhancing lesions measuring at least 5 cm3 were included. All slices of the ground truth set were manually segmented. The same scans were subsequently run in the cloud environment for automatic segmentation. Segmentation times were recorded. The accuracy of the algorithm was compared with that of manual segmentation in terms of the Sørensen-Dice similarity coefficient (DSC), average symmetric surface distance (ASSD), and 95th percentile of Hausdorff distance (HD95). RESULTS The mean ± SD computation time of the automatic segmentation algorithm was 753 ± 128 seconds. The mean ± SD DSC was 0.868 ± 0.07, ASSD was 1.31 ± 0.63 mm, and HD95 was 4.80 ± 3.18 mm. Meningiomas showed a greater DSC (mean 0.89, median 0.92) than metastases (mean 0.84, median 0.85). Accuracy was greater for supratentorial metastases (DSC: mean 0.86, median 0.87; HD95: mean 3.62 mm, median 3.11 mm) than for infratentorial metastases (DSC: mean 0.82, median 0.81; HD95: mean 5.26 mm, median 4.72 mm).
CONCLUSIONS The automatic cloud-based segmentation algorithm is reliable, accurate, and fast enough to aid neurosurgeons in everyday clinical practice by providing 3D augmented reality visualization of contrast-enhancing intracranial lesions measuring at least 5 cm3. The next steps involve incorporation of other sequences and improving accuracy with 3D fine-tuning in order to expand the scope of augmented reality workflow.
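For intuition, the DSC and HD95 metrics used above can be computed for two binary masks with numpy/scipy; a minimal sketch on toy masks (the helper names and the surface-based HD95 implementation are illustrative assumptions, not the authors' code):

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Sørensen-Dice similarity coefficient of two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def surface_distances(a: np.ndarray, b: np.ndarray, spacing=(1.0, 1.0, 1.0)):
    """Distances (in mm, given voxel spacing) from each surface voxel
    of mask a to the nearest surface voxel of mask b."""
    a, b = a.astype(bool), b.astype(bool)
    surf_a = a & ~binary_erosion(a)
    surf_b = b & ~binary_erosion(b)
    # EDT of the complement of b's surface = distance map to b's surface
    dist_to_b = distance_transform_edt(~surf_b, sampling=spacing)
    return dist_to_b[surf_a]

def hd95(a, b, spacing=(1.0, 1.0, 1.0)) -> float:
    """95th percentile of the symmetric surface distances."""
    d = np.concatenate([surface_distances(a, b, spacing),
                        surface_distances(b, a, spacing)])
    return float(np.percentile(d, 95))

# Toy example: two overlapping cubes in a 32^3 volume (1 mm isotropic)
vol = (32, 32, 32)
a = np.zeros(vol, bool); a[8:20, 8:20, 8:20] = True
b = np.zeros(vol, bool); b[10:22, 8:20, 8:20] = True
print(f"DSC = {dice(a, b):.3f}, HD95 = {hd95(a, b):.1f} mm")
```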
Affiliation(s)
- Tim Fick
- Department of Neuro-oncology, Princess Máxima Center for Pediatric Oncology, Utrecht, The Netherlands
- Jesse A M van Doormaal
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Lazar Tosic
- Department of Neurosurgery, University Hospital of Zürich, Zürich, Switzerland
- Renate J van Zoest
- Department of Neurology and Neurosurgery, Curaçao Medical Center, Willemstad, Curaçao
- Jene W Meulstee
- Department of Neuro-oncology, Princess Máxima Center for Pediatric Oncology, Utrecht, The Netherlands
- Eelco W Hoving
- Department of Neuro-oncology, Princess Máxima Center for Pediatric Oncology, Utrecht, The Netherlands; Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Tristan P C van Doormaal
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands; Department of Neurosurgery, University Hospital of Zürich, Zürich, Switzerland
26. Ivan ME, Eichberg DG, Di L, Shah AH, Luther EM, Lu VM, Komotar RJ, Urakov TM. Augmented reality head-mounted display-based incision planning in cranial neurosurgery: a prospective pilot study. Neurosurg Focus 2021; 51:E3. [PMID: 34333466 DOI: 10.3171/2021.5.focus20735] [Citation(s) in RCA: 24] [Impact Index Per Article: 8.0] [Received: 08/12/2020] [Accepted: 05/13/2021] [Indexed: 11/06/2022]
Abstract
OBJECTIVE Monitor and wand-based neuronavigation stations (MWBNSs) for frameless intraoperative neuronavigation are routinely used in cranial neurosurgery. However, they are temporally and spatially cumbersome; the OR must be arranged around the MWBNS, at least one hand must be used to manipulate the MWBNS wand (interrupting a bimanual surgical technique), and the surgical workflow is interrupted as the surgeon stops to "check the navigation" on a remote monitor. Thus, there is a need for continuous, real-time, hands-free neuronavigation solutions. Augmented reality (AR) is poised to address these issues. The authors present the first reported prospective pilot study investigating the feasibility of using the OpenSight application with an AR head-mounted display to map out the borders of tumors in patients undergoing elective craniotomy for tumor resection, and to compare the degree of correspondence with MWBNS tracing. METHODS Eleven consecutive patients undergoing elective craniotomy for brain tumor resection were prospectively identified and underwent circumferential tumor border tracing at the time of incision planning by a surgeon wearing HoloLens AR glasses running the commercially available OpenSight application registered to the patient and preoperative MRI. Then, the same patient underwent circumferential tumor border tracing using the StealthStation S8 MWBNS. Postoperatively, both tumor border tracings were compared by two blinded board-certified neurosurgeons and rated as having an excellent, adequate, or poor correspondence degree based on a subjective sense of the overlap. Objective overlap area measurements were also determined. RESULTS Eleven patients undergoing craniotomy were included in the study. Five patient procedures were rated as having an excellent correspondence degree, 5 had an adequate correspondence degree, and 1 had poor correspondence. Both raters agreed on the rating in all cases. AR tracing was possible in all cases.
CONCLUSIONS In this small pilot study, the authors found that AR was implementable in the workflow of a neurosurgery OR, and was a feasible method of preoperative tumor border identification for incision planning. Future studies are needed to identify strategies to improve and optimize AR accuracy.
Affiliation(s)
- Michael E Ivan
- Department of Neurological Surgery, University of Miami Miller School of Medicine; Sylvester Comprehensive Cancer Center, Miami, Florida
- Daniel G Eichberg
- Department of Neurological Surgery, University of Miami Miller School of Medicine
- Long Di
- Department of Neurological Surgery, University of Miami Miller School of Medicine
- Ashish H Shah
- Department of Neurological Surgery, University of Miami Miller School of Medicine
- Evan M Luther
- Department of Neurological Surgery, University of Miami Miller School of Medicine
- Victor M Lu
- Department of Neurological Surgery, University of Miami Miller School of Medicine
- Ricardo J Komotar
- Department of Neurological Surgery, University of Miami Miller School of Medicine; Sylvester Comprehensive Cancer Center, Miami, Florida
27. Qi Z, Li Y, Xu X, Zhang J, Li F, Gan Z, Xiong R, Wang Q, Zhang S, Chen X. Holographic mixed-reality neuronavigation with a head-mounted device: technical feasibility and clinical application. Neurosurg Focus 2021; 51:E22. [PMID: 34333462 DOI: 10.3171/2021.5.focus21175] [Citation(s) in RCA: 21] [Impact Index Per Article: 7.0] [Received: 03/31/2021] [Accepted: 05/13/2021] [Indexed: 11/06/2022]
Abstract
OBJECTIVE The authors aimed to evaluate the technical feasibility of a mixed-reality neuronavigation (MRN) system with a wearable head-mounted device (HMD) and to determine its clinical application and accuracy. METHODS A semiautomatic registration MRN system on HoloLens smart glasses was developed and tested for accuracy and feasibility. Thirty-seven patients with intracranial lesions were prospectively identified. For each patient, multimodal imaging-based holograms of lesions, markers, and surrounding eloquent structures were created and then imported to the MRN HMD. After a point-based registration, the holograms were projected onto the patient's head and observed through the HMD. The contour of the holograms was compared with standard neuronavigation (SN). The projection of the lesion boundaries perceived by the neurosurgeon on the patient's scalp was then marked with MRN and SN. The distance between the two contours generated by MRN and SN was measured so that the accuracy of MRN could be assessed. RESULTS MRN localization was achieved in all patients. The mean additional time required for MRN was 36.3 ± 6.3 minutes, in which the mean registration time was 2.6 ± 0.9 minutes. A trend toward a shorter time required for preparation was observed with the increase of neurosurgeon experience with the MRN system. The overall median deviation was 4.1 mm (IQR 3.0 mm-4.7 mm), and 81.1% of the lesions localized by MRN were found to be highly consistent with SN (deviation < 5.0 mm). There was a significant difference between the supine position and the prone position (3.7 ± 1.1 mm vs 5.4 ± 0.9 mm, p = 0.001). The magnitudes of deviation vectors did not correlate with lesion volume (p = 0.126) or depth (p = 0.128). There was no significant difference in additional operating time between different operators (37.4 ± 4.8 minutes vs 34.6 ± 4.8 minutes, p = 0.237) or in localization deviation (3.7 ± 1.0 mm vs 4.6 ± 1.5 mm, p = 0.070). 
CONCLUSIONS This study presented a complete, clinically applicable workflow for an easy-to-use MRN system on a wearable HMD and demonstrated its technical feasibility and accuracy. Further development is required to improve the accuracy and clinical efficacy of this system.
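The deviation between the MRN- and SN-marked contours can be quantified in several ways; one simple proxy (an illustrative assumption, not necessarily the authors' exact metric) is the median symmetric nearest-neighbour distance between the two sampled contours:

```python
import numpy as np
from scipy.spatial import cKDTree

def contour_deviation_mm(contour_a: np.ndarray, contour_b: np.ndarray) -> float:
    """Median of symmetric nearest-neighbour distances between two
    sampled contours (N x 3 arrays of points, in mm)."""
    d_ab = cKDTree(contour_b).query(contour_a)[0]  # a -> nearest point of b
    d_ba = cKDTree(contour_a).query(contour_b)[0]  # b -> nearest point of a
    return float(np.median(np.concatenate([d_ab, d_ba])))

# Toy example: a 30 mm-radius circle traced twice, second trace shifted 4 mm
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
ring = np.stack([30 * np.cos(t), 30 * np.sin(t), np.zeros_like(t)], axis=1)
shifted = ring + np.array([4.0, 0.0, 0.0])
dev = contour_deviation_mm(ring, shifted)
print(f"median deviation: {dev:.1f} mm")
```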
Affiliation(s)
- Ziyu Qi
- Department of Neurosurgery, Chinese PLA General Hospital; School of Medicine, Nankai University, Tianjin, China
- Ye Li
- Department of Neurosurgery, Xuanwu Hospital, Capital Medical University, Beijing
- Xinghua Xu
- Department of Neurosurgery, Chinese PLA General Hospital
- Jiashu Zhang
- Department of Neurosurgery, Chinese PLA General Hospital
- Fangye Li
- Department of Neurosurgery, Chinese PLA General Hospital
- Zhichao Gan
- Department of Neurosurgery, Chinese PLA General Hospital; School of Medicine, Nankai University, Tianjin, China
- Ruochu Xiong
- Department of Neurosurgery, Chinese PLA General Hospital
- Qun Wang
- Department of Neurosurgery, Chinese PLA General Hospital
- Shiyu Zhang
- Department of Neurosurgery, Chinese PLA General Hospital
- Xiaolei Chen
- Department of Neurosurgery, Chinese PLA General Hospital
28. Van Gestel F, Frantz T, Vannerom C, Verhellen A, Gallagher AG, Elprama SA, Jacobs A, Buyl R, Bruneau M, Jansen B, Vandemeulebroucke J, Scheerlinck T, Duerinck J. The effect of augmented reality on the accuracy and learning curve of external ventricular drain placement. Neurosurg Focus 2021; 51:E8. [PMID: 34333479 DOI: 10.3171/2021.5.focus21215] [Citation(s) in RCA: 19] [Impact Index Per Article: 6.3] [Received: 03/31/2021] [Accepted: 05/13/2021] [Indexed: 11/06/2022]
Abstract
OBJECTIVE The traditional freehand technique for external ventricular drain (EVD) placement is most frequently used, but remains the primary risk factor for inaccurate drain placement. As this procedure could benefit from image guidance, the authors set forth to demonstrate the impact of augmented-reality (AR) assistance on the accuracy and learning curve of EVD placement compared with the freehand technique. METHODS Sixteen medical students performed a total of 128 EVD placements on a custom-made phantom head, both before and after receiving a standardized training session. They were guided by either the freehand technique or by AR, which provided an anatomical overlay and tailored guidance for EVD placement through inside-out infrared tracking. The outcome was quantified by the metric accuracy of EVD placement as well as by its clinical quality. RESULTS The mean target error was significantly impacted by either AR (p = 0.003) or training (p = 0.02) in a direct comparison with the untrained freehand performance. Both untrained (11.9 ± 4.5 mm) and trained (12.2 ± 4.7 mm) AR performances were significantly better than the untrained freehand performance (19.9 ± 4.2 mm), which improved after training (13.5 ± 4.7 mm). The quality of EVD placement as assessed by the modified Kakarla scale (mKS) was significantly impacted by AR guidance (p = 0.005) but not by training (p = 0.07). Both untrained and trained AR performances (59.4% mKS grade 1 for both) were significantly better than the untrained freehand performance (25.0% mKS grade 1). Spatial aptitude testing revealed a correlation between perceptual ability and untrained AR-guided performance (r = 0.63). CONCLUSIONS Compared with the freehand technique, AR guidance for EVD placement yielded a higher outcome accuracy and quality for procedure novices. 
With AR, untrained individuals performed as well as trained individuals, which indicates that AR guidance not only improved performance but also positively impacted the learning curve. Future efforts will focus on the translation and evaluation of AR for EVD placement in the clinical setting.
Affiliation(s)
- Frederick Van Gestel
- Department of Neurosurgery, Vrije Universiteit Brussel, Universitair Ziekenhuis Brussel, Brussels; Research Group Center For Neurosciences (C4N-NEUR), Vrije Universiteit Brussel, Brussels
- Taylor Frantz
- Department of Electronics and Informatics (ETRO), Vrije Universiteit Brussel, Brussels; imec, Leuven
- Cédric Vannerom
- Department of Neurosurgery, Vrije Universiteit Brussel, Universitair Ziekenhuis Brussel, Brussels; Research Group Center For Neurosciences (C4N-NEUR), Vrije Universiteit Brussel, Brussels
- Anouk Verhellen
- Department of Studies on Media, Innovation & Technology (SMIT), Vrije Universiteit Brussel, Brussels
- Shirley A Elprama
- Department of Studies on Media, Innovation & Technology (SMIT), Vrije Universiteit Brussel, Brussels
- An Jacobs
- Department of Studies on Media, Innovation & Technology (SMIT), Vrije Universiteit Brussel, Brussels
- Ronald Buyl
- Department of Public Health, Research Group Biostatistics and Medical Informatics (BISI), Vrije Universiteit Brussel, Brussels
- Michaël Bruneau
- Department of Neurosurgery, Vrije Universiteit Brussel, Universitair Ziekenhuis Brussel, Brussels
- Bart Jansen
- Department of Electronics and Informatics (ETRO), Vrije Universiteit Brussel, Brussels; imec, Leuven
- Jef Vandemeulebroucke
- Department of Electronics and Informatics (ETRO), Vrije Universiteit Brussel, Brussels; imec, Leuven
- Thierry Scheerlinck
- Department of Orthopedic Surgery and Traumatology, Vrije Universiteit Brussel, Universitair Ziekenhuis Brussel, Brussels; Research Group Beeldvorming en Fysische wetenschappen (BEFY-ORTHO), Vrije Universiteit Brussel, Brussels, Belgium
- Johnny Duerinck
- Department of Neurosurgery, Vrije Universiteit Brussel, Universitair Ziekenhuis Brussel, Brussels; Research Group Center For Neurosciences (C4N-NEUR), Vrije Universiteit Brussel, Brussels
29. Evaluation of a Wearable AR Platform for Guiding Complex Craniotomies in Neurosurgery. Ann Biomed Eng 2021; 49:2590-2605. [PMID: 34297263 DOI: 10.1007/s10439-021-02834-8] [Citation(s) in RCA: 23] [Impact Index Per Article: 7.7] [Received: 03/23/2021] [Accepted: 07/12/2021] [Indexed: 10/20/2022]
Abstract
Today, neuronavigation is widely used in daily clinical routine to perform safe and efficient surgery. Augmented reality (AR) interfaces can provide anatomical models and preoperative planning contextually blended with the real surgical scenario, overcoming the limitations of traditional neuronavigators. This study aims to demonstrate the reliability of a new-concept AR headset in navigating complex craniotomies, and to prove the efficacy of a patient-specific template-based methodology for fast, non-invasive, and fully automatic planning-to-patient registration. The AR platform's navigation performance was assessed with an in-vitro study whose goal was twofold: to measure the real-to-virtual 3D target visualization error (TVE), and to assess navigation accuracy through a user study involving 10 subjects tracing a complex craniotomy. The feasibility of the template-based registration was preliminarily tested on a volunteer. The TVE mean and standard deviation were 1.3 and 0.6 mm. The results of the user study, over 30 traced craniotomies, showed that 97% of the trajectory length was traced within an error margin of 1.5 mm, and 92% within a margin of 1 mm. The in-vivo test confirmed the feasibility and reliability of the patient-specific template for registration. The proposed AR headset allows ergonomic and intuitive use of preoperative planning, and it can represent a valid option to support neurosurgical tasks.
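A per-trajectory accuracy figure like "97% of the trajectory length within a 1.5 mm margin" can be approximated from points sampled uniformly along the trace; a minimal sketch under that sampling assumption (synthetic errors, not the study's data):

```python
import numpy as np

def fraction_within_margin(errors_mm, margin_mm: float) -> float:
    """Fraction of sampled trace points whose deviation from the planned
    craniotomy path is within the margin; with uniform sampling along the
    trace this approximates the fraction of trajectory length."""
    errors_mm = np.asarray(errors_mm, dtype=float)
    return float((errors_mm <= margin_mm).mean())

rng = np.random.default_rng(1)
errors = np.abs(rng.normal(0.0, 0.6, 1000))  # synthetic tracing errors (mm)
for margin in (1.0, 1.5):
    print(f"within {margin} mm: {100 * fraction_within_margin(errors, margin):.0f}%")
```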
30. Chidambaram S, Stifano V, Demetres M, Teyssandier M, Palumbo MC, Redaelli A, Olivi A, Apuzzo MLJ, Pannullo SC. Applications of augmented reality in the neurosurgical operating room: A systematic review of the literature. J Clin Neurosci 2021; 91:43-61. [PMID: 34373059 DOI: 10.1016/j.jocn.2021.06.032] [Citation(s) in RCA: 18] [Impact Index Per Article: 6.0] [Received: 12/14/2020] [Revised: 06/17/2021] [Accepted: 06/18/2021] [Indexed: 12/15/2022]
Abstract
Advancements in imaging techniques are key forces of progress in neurosurgery. The importance of accurate visualization of intraoperative anatomy cannot be overemphasized and is commonly delivered through traditional neuronavigation. Augmented Reality (AR) technology has been tested and applied widely in various neurosurgical subspecialties in intraoperative, clinical use and shows promise for the future. This systematic review of the literature explores the ways in which AR technology has been successfully brought into the operating room (OR) and incorporated into clinical practice. A comprehensive literature search was performed in the following databases from inception to April 2020: Ovid MEDLINE, Ovid EMBASE, and The Cochrane Library. Studies retrieved were then screened for eligibility against predefined inclusion/exclusion criteria. A total of 54 articles were included in this systematic review. The studies were subgrouped into brain and spine subspecialties and analyzed for their incorporation of AR in the neurosurgical clinical setting. AR technology has the potential to greatly enhance intraoperative visualization and guidance in neurosurgery beyond the traditional neuronavigation systems. However, there are several key challenges to scaling the use of this technology and bringing it into standard operative practice, including accurate and efficient brain segmentation of magnetic resonance imaging (MRI) scans, accounting for brain shift, reducing coregistration errors, and improving the AR device hardware. There is also an exciting potential for future work combining AR with multimodal imaging techniques and artificial intelligence to further enhance its impact in neurosurgery.
Affiliation(s)
- Vito Stifano
- Department of Neurosurgery, Fondazione Policlinico Universitario A. Gemelli IRCCS, Rome, Italy; Institute of Neurosurgery, Catholic University, Rome, Italy
- Michelle Demetres
- Samuel J. Wood & C.V. Starr Biomedical Information Center, Weill Cornell Medical College/New York Presbyterian Hospital, New York, NY, USA
- Maria Chiara Palumbo
- Department of Electronics, Information and Bioengineering, Politecnico di Milano, Milan, Italy
- Alberto Redaelli
- Department of Electronics, Information and Bioengineering, Politecnico di Milano, Milan, Italy
- Alessandro Olivi
- Department of Neurosurgery, Fondazione Policlinico Universitario A. Gemelli IRCCS, Rome, Italy; Institute of Neurosurgery, Catholic University, Rome, Italy
- Susan C Pannullo
- Department of Neurosurgery, Weill Cornell Medical College, NY, USA.
31. Applications of Smart Glasses in Applied Sciences: A Systematic Review. Appl Sci (Basel) 2021. [DOI: 10.3390/app11114956] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Indexed: 11/16/2022]
Abstract
The aim of this study is to review academic papers on the applications of smart glasses. Among 82 surveyed papers, 57 were selected through filtering. The papers were published from January 2014 to October 2020. Four research questions were set up using the systematic review method, and conclusions were drawn focusing on the research trends by year and application fields; product and operating system; sensors depending on the application purpose; and data visualization, processing, and transfer methods. It was found that the most popular commercial smart glass products are Android-based Google products. In addition, smart glasses are most often used in the healthcare field, particularly for clinical and surgical assistance or for assisting mentally or physically disabled persons. For visual data transfer, 90% of the studies conducted used a camera sensor. Smart glasses have mainly been used to visualize data based on augmented reality, in contrast with the use of mixed reality. The results of this review indicate that research related to smart glasses is steadily increasing, and technological research into the development of smart glasses is being actively conducted.
32. Fick T, van Doormaal JAM, Hoving EW, Regli L, van Doormaal TPC. Holographic patient tracking after bed movement for augmented reality neuronavigation using a head-mounted display. Acta Neurochir (Wien) 2021; 163:879-884. [PMID: 33515122 PMCID: PMC7966201 DOI: 10.1007/s00701-021-04707-4] [Citation(s) in RCA: 17] [Impact Index Per Article: 5.7] [Received: 10/26/2020] [Accepted: 01/04/2021] [Indexed: 11/27/2022]
Abstract
BACKGROUND Holographic neuronavigation has several potential advantages compared to conventional neuronavigation systems. We present the first report of a holographic neuronavigation system with patient-to-image registration and patient tracking with a reference array using an augmented reality head-mounted display (AR-HMD). METHODS Three patients undergoing an intracranial neurosurgical procedure were included in this pilot study. The relevant anatomy was first segmented in 3D and then uploaded as holographic scene in our custom neuronavigation software. Registration was performed using point-based matching using anatomical landmarks. We measured the fiducial registration error (FRE) as the outcome measure for registration accuracy. A custom-made reference array with QR codes was integrated in the neurosurgical setup and used for patient tracking after bed movement. RESULTS Six registrations were performed with a mean FRE of 8.5 mm. Patient tracking was achieved with no visual difference between the registration before and after movement. CONCLUSIONS This first report shows a proof of principle of intraoperative patient tracking using a standalone holographic neuronavigation system. The navigation accuracy should be further optimized to be clinically applicable. However, it is likely that this technology will be incorporated in future neurosurgical workflows because the system improves spatial anatomical understanding for the surgeon.
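Point-based registration as described above is commonly implemented as a least-squares rigid fit (the Kabsch/Procrustes method), with the fiducial registration error (FRE) defined as the RMS residual at the landmarks; a minimal sketch under those standard definitions (synthetic landmarks, not the authors' software):

```python
import numpy as np

def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (rotation R, translation t) mapping
    src points onto dst points (both N x 3), via the Kabsch/SVD method."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Correct a possible reflection so R is a proper rotation
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

def fre_mm(src: np.ndarray, dst: np.ndarray) -> float:
    """Fiducial registration error: RMS residual after rigid alignment."""
    R, t = rigid_register(src, dst)
    resid = dst - (src @ R.T + t)
    return float(np.sqrt((resid ** 2).sum(axis=1).mean()))

rng = np.random.default_rng(0)
image_pts = rng.uniform(0, 100, (6, 3))          # landmarks in image space
# Simulate patient-space landmarks: rotated, translated, plus 2 mm noise
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
patient_pts = image_pts @ R_true.T + np.array([10.0, -5.0, 20.0])
patient_pts += rng.normal(0.0, 2.0, patient_pts.shape)
print(f"FRE = {fre_mm(image_pts, patient_pts):.1f} mm")
```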
Affiliation(s)
- T Fick
- Department of Neuro-oncology, Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS Utrecht, The Netherlands
- J A M van Doormaal
- Department of Oral and Maxillofacial Surgery, University Medical Centre Utrecht, Heidelberglaan 100, 3584 CX Utrecht, The Netherlands
- E W Hoving
- Department of Neuro-oncology, Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS Utrecht, The Netherlands
- Department of Neurosurgery, University Medical Centre Utrecht, Heidelberglaan 100, 3584 CX Utrecht, The Netherlands
- L Regli
- Department of Neurosurgery, Clinical Neuroscience Center, University Hospital Zurich, University of Zurich, Rämistrasse 100, 8091 Zürich, Switzerland
- T P C van Doormaal
- Department of Neurosurgery, University Medical Centre Utrecht, Heidelberglaan 100, 3584 CX Utrecht, The Netherlands
- Department of Neurosurgery, Clinical Neuroscience Center, University Hospital Zurich, University of Zurich, Rämistrasse 100, 8091 Zürich, Switzerland

33
Fernandes de Oliveira Santos B, de Araujo Paz D, Fernandes VM, Dos Santos JC, Chaddad-Neto FEA, Sousa ACS, Oliveira JLM. Minimally invasive supratentorial neurosurgical approaches guided by Smartphone app and compass. Sci Rep 2021; 11:6778. [PMID: 33762597 PMCID: PMC7991647 DOI: 10.1038/s41598-021-85472-3] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/13/2020] [Accepted: 03/02/2021] [Indexed: 01/19/2023] Open
Abstract
Precisely locating specifically planned points on the scalp can help achieve less invasive approaches. This study aims to develop a smartphone app, evaluate the precision and accuracy of the developed tool, and describe a series of cases using the referred technique. The application was developed with the React Native framework for Android and iOS. A phantom was printed based on a patient's CT scan and used to calculate the accuracy and precision of the method. The points of interest were marked with an "x" on the patient's head, with the aid of the app and a compass attached to a skin-marker pen. Two experienced neurosurgeons then checked the plausibility of the demarcations against the anatomical references. Both evaluators marked the frontal, temporal, and parietal targets within 5 mm of the corresponding intended point in all cases. The overall average accuracy observed was 1.6 ± 1.0 mm. The app was used in the surgical planning of trepanations for ventriculoperitoneal (VP) shunts and for drainage of abscesses, and in the definition of craniotomies for meningiomas, gliomas, brain metastases, intracranial hematomas, cavernomas, and arteriovenous malformation. The sample consisted of 88 volunteers with the following pathologies: 41 (46.6%) brain tumors, 17 (19.3%) traumatic brain injuries, 16 (18.2%) spontaneous intracerebral hemorrhages, 2 (2.3%) cavernomas, 1 (1.1%) arteriovenous malformation (AVM), 4 (4.5%) brain abscesses, and 7 (7.9%) VP shunt placements. In cases approached by craniotomy, with the exception of the AVM, straight incisions and minicraniotomies were performed. Surgical planning with the aid of the NeuroKeypoint app is feasible and reliable. It has enabled neurological surgeries by craniotomy and trepanation in an accurate, precise, and less invasive manner.
Affiliation(s)
- Bruno Fernandes de Oliveira Santos
- Health Sciences Graduate Program, Federal University of Sergipe, Aracaju, SE, Brazil; Unimed Sergipe Hospital, Aracaju, SE, Brazil; Clinic and Hospital São Lucas / Rede D'Or São Luiz, Aracaju, SE, Brazil; Department of Neurosurgery, Hospital de Cirurgia, Aracaju, SE, Brazil
- Daniel de Araujo Paz
- Department of Neurology and Neurosurgery, Universidade Federal de São Paulo, São Paulo, SP, Brazil
- Antonio Carlos Sobral Sousa
- Health Sciences Graduate Program, Federal University of Sergipe, Aracaju, SE, Brazil; Department of Internal Medicine, Federal University of Sergipe, Aracaju, SE, Brazil; Division of Cardiology, University Hospital, Federal University of Sergipe, Aracaju, SE, Brazil; Clinic and Hospital São Lucas / Rede D'Or São Luiz, Aracaju, SE, Brazil
- Joselina Luzia Menezes Oliveira
- Health Sciences Graduate Program, Federal University of Sergipe, Aracaju, SE, Brazil; Department of Internal Medicine, Federal University of Sergipe, Aracaju, SE, Brazil; Division of Cardiology, University Hospital, Federal University of Sergipe, Aracaju, SE, Brazil; Clinic and Hospital São Lucas / Rede D'Or São Luiz, Aracaju, SE, Brazil

34
Gsaxner C, Pepe A, Li J, Ibrahimpasic U, Wallner J, Schmalstieg D, Egger J. Augmented Reality for Head and Neck Carcinoma Imaging: Description and Feasibility of an Instant Calibration, Markerless Approach. COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE 2021; 200:105854. [PMID: 33261944 DOI: 10.1016/j.cmpb.2020.105854] [Citation(s) in RCA: 14] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/10/2020] [Accepted: 11/16/2020] [Indexed: 06/12/2023]
Abstract
BACKGROUND AND OBJECTIVE Augmented reality (AR) can help overcome current limitations in computer-assisted head and neck surgery by granting "X-ray vision" to physicians. Still, the acceptance of AR in clinical applications is limited by technical and clinical challenges. We aim to demonstrate the benefit of a marker-free, instant-calibration AR system for head and neck cancer imaging, which we hypothesize to be acceptable and practical for clinical use. METHODS We implemented a novel AR system for visualization of medical image data registered with the head or face of the patient prior to intervention. Our system allows the localization of head and neck carcinoma in relation to the outer anatomy. It does not require markers or stationary infrastructure, provides instant calibration, and allows 2D and 3D multimodal visualization for head and neck surgery planning via an AR head-mounted display. We evaluated the system in a pre-clinical user study with eleven medical experts. RESULTS Medical experts rated our application with a System Usability Scale score of 74.8 ± 15.9, which signifies above-average, good usability and clinical acceptance. An average of 12.7 ± 6.6 minutes of training time was needed by physicians before they were able to navigate the application without assistance. CONCLUSIONS Our AR system is characterized by a slim and easy setup, short training time, and high usability and acceptance. It therefore presents a promising, novel tool for visualizing head and neck cancer imaging and pre-surgically localizing target structures.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 5, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
- Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
- Jianning Li
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
- Una Ibrahimpasic
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria
- Jürgen Wallner
- Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 5, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria; Department of Cranio-Maxillofacial Surgery, AZ Monica Hospital Antwerp and Antwerp University Hospital, Antwerp, Belgium
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 5, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria

35
Schneider M, Kunz C, Pal'a A, Wirtz CR, Mathis-Ullrich F, Hlaváč M. Augmented reality-assisted ventriculostomy. Neurosurg Focus 2021; 50:E16. [PMID: 33386016 DOI: 10.3171/2020.10.focus20779] [Citation(s) in RCA: 25] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/31/2020] [Accepted: 10/22/2020] [Indexed: 11/06/2022]
Abstract
OBJECTIVE Placement of a ventricular drain is one of the most common neurosurgical procedures. However, a higher rate of successful placements with this freehand procedure is desirable. The authors' objective was to develop a compact navigational augmented reality (AR)-based tool that does not require rigid patient head fixation, to support the surgeon during the operation. METHODS Segmentation and tracking algorithms were developed. A commercially available Microsoft HoloLens AR headset in conjunction with Vuforia marker-based tracking was used to provide guidance for ventriculostomy in a custom-made 3D-printed head model. Eleven surgeons conducted a series of tests to place a total of 110 external ventricular drains under holographic guidance. The HoloLens was the sole active component; no rigid head fixation was necessary. CT was used to obtain puncture results and quantify success rates as well as precision of the suggested setup. RESULTS In the proposed setup, the system worked reliably and performed well. The reported application showed an overall ventriculostomy success rate of 68.2%. The offset from the reference trajectory as displayed in the hologram was 5.2 ± 2.6 mm (mean ± standard deviation). A subgroup conducted a second series of punctures in which results and precision improved significantly. For most participants it was their first encounter with AR headset technology and the overall feedback was positive. CONCLUSIONS To the authors' knowledge, this is the first report on marker-based, AR-guided ventriculostomy. The results from this first application are encouraging. The authors would expect good acceptance of this compact navigation device in a supposed clinical implementation and assume a steep learning curve in the application of this technique. To achieve this translation, further development of the marker system and implementation of the new hardware generation are planned. Further testing to address visuospatial issues is needed prior to application in humans.
Affiliation(s)
- Max Schneider
- Department of Neurosurgery, University of Ulm, Günzburg, Germany
- Christian Kunz
- Health Robotics and Automation Lab, Institute for Anthropomatics and Robotics, Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany
- Andrej Pal'a
- Department of Neurosurgery, University of Ulm, Günzburg, Germany
- Franziska Mathis-Ullrich
- Health Robotics and Automation Lab, Institute for Anthropomatics and Robotics, Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany
- Michal Hlaváč
- Department of Neurosurgery, University of Ulm, Günzburg, Germany

36
Mondal SB, Achilefu S. Virtual and Augmented Reality Technologies in Molecular and Anatomical Imaging. Mol Imaging 2021. [DOI: 10.1016/b978-0-12-816386-3.00066-1] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022] Open
37
Buch VP, Mensah-Brown KG, Germi JW, Park BJ, Madsen PJ, Borja AJ, Haldar D, Basenfelder P, Yoon JW, Schuster JM, Chen HCI. Development of an Intraoperative Pipeline for Holographic Mixed Reality Visualization During Spinal Fusion Surgery. Surg Innov 2020; 28:427-437. [PMID: 33382008 DOI: 10.1177/1553350620984339] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Objective. Holographic mixed reality (HMR) allows for the superimposition of computer-generated virtual objects onto the operator's view of the world. Innovative solutions can be developed to enable the use of this technology during surgery. The authors developed and iteratively optimized a pipeline to construct, visualize, and register intraoperative holographic models of patient landmarks during spinal fusion surgery. Methods. The study was carried out in two phases. In phase 1, the custom intraoperative pipeline to generate patient-specific holographic models was developed across 7 patients. In phase 2, registration accuracy was optimized iteratively for 6 patients in a real-time operative setting. Results. In phase 1, an intraoperative pipeline was successfully employed to generate and deploy patient-specific holographic models. In phase 2, the registration error with the native hand-gesture registration was 20.2 ± 10.8 mm (n = 7 test points). Custom controller-based registration significantly reduced the mean registration error to 4.18 ± 2.83 mm (n = 24 test points, P < .01). Accuracy improved over time (B = -0.69, P < .0001), with the final patient achieving a registration error of 2.30 ± 0.58 mm. Across both phases, the average model generation time was 18.0 ± 6.1 minutes (n = 6) for isolated spinal hardware and 33.8 ± 8.6 minutes (n = 6) for spinal anatomy. Conclusions. A custom pipeline is described for the generation of intraoperative 3D holographic models during spine surgery. Registration accuracy improved dramatically with iterative optimization of the pipeline and technique. While significant improvements and advancements still need to be made to enable clinical utility, HMR shows significant potential as the next frontier of intraoperative visualization.
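The error-over-time trend quantified above (B = -0.69) is a regression slope. As a minimal sketch, an ordinary-least-squares slope over (case index, error) pairs can be computed as follows; the case numbers and error values below are made up for demonstration and are not the study's data.

```python
# Illustrative only: OLS slope of registration error versus case order.
def ols_slope(x, y):
    """Slope of the least-squares line through the points (x_i, y_i)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Hypothetical per-case registration errors (mm) over successive cases:
cases = [1, 2, 3, 4, 5]
errors = [20.0, 12.0, 9.0, 5.0, 3.0]
trend = ols_slope(cases, errors)  # negative slope indicates improving accuracy
```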
Affiliation(s)
- Vivek P Buch
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- Kobina G Mensah-Brown
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- James W Germi
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- Brian J Park
- Department of Radiology, University of Pennsylvania Perelman School of Medicine, Philadelphia, PA, USA
- Peter J Madsen
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- Austin J Borja
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- Debanjan Haldar
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- Patricia Basenfelder
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- Jang W Yoon
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- James M Schuster
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- Han-Chiao I Chen
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA

38
Andrews CM, Henry AB, Soriano IM, Southworth MK, Silva JR. Registration Techniques for Clinical Applications of Three-Dimensional Augmented Reality Devices. IEEE JOURNAL OF TRANSLATIONAL ENGINEERING IN HEALTH AND MEDICINE 2020; 9:4900214. [PMID: 33489483 PMCID: PMC7819530 DOI: 10.1109/jtehm.2020.3045642] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/28/2020] [Revised: 11/13/2020] [Accepted: 12/03/2020] [Indexed: 12/15/2022]
Abstract
Many clinical procedures would benefit from direct and intuitive real-time visualization of anatomy, surgical plans, or other information crucial to the procedure. Three-dimensional augmented reality (3D-AR) is an emerging technology that has the potential to assist physicians with spatial reasoning during clinical interventions. The most intriguing applications of 3D-AR involve visualizations of anatomy or surgical plans that appear directly on the patient. However, commercially available 3D-AR devices have spatial localization errors that are too large for many clinical procedures. For this reason, a variety of approaches for improving 3D-AR registration accuracy have been explored. The focus of this review is on the methods, accuracy, and clinical applications of registering 3D-AR devices with the clinical environment. The works cited represent a variety of approaches for registering holograms to patients, including manual registration, computer vision-based registration, and registrations that incorporate external tracking systems. Evaluations of user accuracy when performing clinically relevant tasks suggest that accuracies of approximately 2 mm are feasible. 3D-AR device limitations due to the vergence-accommodation conflict or other factors attributable to the headset hardware add on the order of 1.5 mm of error compared to conventional guidance. Continued improvements to 3D-AR hardware will decrease these sources of error.
Affiliation(s)
- Christopher M. Andrews
- Department of Biomedical Engineering, McKelvey School of Engineering, Washington University in St. Louis, St. Louis, MO 63130, USA
- SentiAR, Inc., St. Louis, MO 63108, USA
- Jonathan R. Silva
- Department of Biomedical Engineering, McKelvey School of Engineering, Washington University in St. Louis, St. Louis, MO 63130, USA

39
Fick T, van Doormaal JAM, Hoving EW, Willems PWA, van Doormaal TPC. Current Accuracy of Augmented Reality Neuronavigation Systems: Systematic Review and Meta-Analysis. World Neurosurg 2020; 146:179-188. [PMID: 33197631 DOI: 10.1016/j.wneu.2020.11.029] [Citation(s) in RCA: 19] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/19/2020] [Revised: 11/04/2020] [Accepted: 11/05/2020] [Indexed: 12/17/2022]
Abstract
BACKGROUND Augmented reality neuronavigation (ARN) systems can overlay three-dimensional anatomy and disease without the need for a two-dimensional external monitor. Accuracy is crucial for their clinical applicability. We performed a systematic review of the reported accuracy of ARN systems and compared them with the accuracy of conventional infrared neuronavigation (CIN). METHODS PubMed and Embase were searched for ARN and CIN systems. For ARN, the type of system, method of patient-to-image registration, accuracy method, and accuracy of the system were noted. For CIN, navigation accuracy, expressed as target registration error (TRE), was noted. A meta-analysis was performed comparing the TRE of ARN and CIN systems. RESULTS Thirty-five studies were included, 12 for ARN and 23 for CIN. ARN systems could be divided into head-mounted displays and heads-up displays. In ARN, 4 methods were encountered for patient-to-image registration, of which point-pair matching was the most frequently used. Five methods for assessing accuracy were described. Ninety-four TRE measurements of ARN systems were compared with 9058 TRE measurements of CIN systems. Mean TRE was 2.5 mm (95% confidence interval, 0.7-4.4) for ARN systems and 2.6 mm (95% confidence interval, 2.1-3.1) for CIN systems. CONCLUSIONS In ARN, there seems to be a lack of agreement regarding the best method to assess accuracy. Nevertheless, ARN systems seem able to achieve an accuracy comparable to that of CIN systems. Future studies should be prospective and compare TREs, which should be measured in a standardized fashion.
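A mean TRE with a 95% confidence interval, as summarized above, can be illustrated with a simple normal-approximation interval over per-measurement TREs. This is a didactic sketch, not the meta-analytic pooling used in the review, and the sample values in the usage line are invented.

```python
# Illustrative sketch: mean TRE and a normal-approximation 95% CI (z = 1.96).
import math
import statistics as stats

def mean_with_ci(tre_mm, z=1.96):
    """Return (mean, (lower, upper)) for a list of TRE measurements in mm."""
    m = stats.mean(tre_mm)
    half = z * stats.stdev(tre_mm) / math.sqrt(len(tre_mm))
    return m, (m - half, m + half)

# Hypothetical TRE samples (mm), not data from the included studies:
mean_tre, ci = mean_with_ci([2.1, 3.0, 2.4, 2.8, 1.9])
```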
Affiliation(s)
- Tim Fick
- Department of Neuro-oncology, Princess Máxima Center for Pediatric Oncology, Utrecht, The Netherlands
- Jesse A M van Doormaal
- Department of Oral and Maxillofacial Surgery, University Medical Centre Utrecht, Utrecht, The Netherlands
- Eelco W Hoving
- Department of Neuro-oncology, Princess Máxima Center for Pediatric Oncology, Utrecht, The Netherlands
- Peter W A Willems
- Department of Neurosurgery, University Medical Centre Utrecht, Utrecht, The Netherlands
- Tristan P C van Doormaal
- Department of Neurosurgery, University Medical Centre Utrecht, Utrecht, The Netherlands; Department of Neurosurgery, Clinical Neuroscience Center, University Hospital Zurich, University of Zurich, Switzerland

40
Park BJ, Hunt SJ, Nadolski GJ, Gade TP. Augmented reality improves procedural efficiency and reduces radiation dose for CT-guided lesion targeting: a phantom study using HoloLens 2. Sci Rep 2020; 10:18620. [PMID: 33122766 PMCID: PMC7596500 DOI: 10.1038/s41598-020-75676-4] [Citation(s) in RCA: 22] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/30/2020] [Accepted: 10/19/2020] [Indexed: 12/14/2022] Open
Abstract
Out-of-plane lesions pose challenges for CT-guided interventions. Augmented reality (AR) headsets can provide holographic 3D guidance to assist CT-guided targeting. A prospective trial was performed assessing CT-guided lesion targeting on an abdominal phantom with and without AR guidance using HoloLens 2. Eight operators performed a cumulative total of 86 needle passes. Total needle redirections, radiation dose, procedure time, and puncture rates of nontargeted lesions were compared with and without AR. The mean number of needle passes to reach the target was reduced from 7.4 passes without AR to 3.4 passes with AR (p = 0.011). Mean CT dose index decreased from 28.7 mGy without AR to 16.9 mGy with AR (p = 0.009). Mean procedure time was reduced from 8.93 min without AR to 4.42 min with AR (p = 0.027). The puncture rate of a nontargeted lesion decreased from 11.9% without AR (7/59 passes) to 0% with AR (0/27 passes). First needle passes were closer to the ideal target trajectory with AR than without AR (4.6° vs. 8.0° offset, respectively, p = 0.018). AR reduced variability and elevated the performance of all operators to the same level, irrespective of prior clinical experience. AR guidance can provide significant improvements in procedural efficiency and radiation dose savings for targeting out-of-plane lesions.
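The trajectory offset reported above (4.6° vs. 8.0°) is an angle between two direction vectors. A minimal sketch of that geometry, under the assumption that the planned and achieved needle trajectories are available as 3D vectors (not the authors' analysis code):

```python
# Sketch only: angular offset in degrees between a planned and an achieved
# needle trajectory, each given as a 3D direction vector.
import math

def angle_offset_deg(v_planned, v_actual):
    """Angle in degrees between two nonzero 3D direction vectors."""
    dot = sum(a * b for a, b in zip(v_planned, v_actual))
    na = math.sqrt(sum(a * a for a in v_planned))
    nb = math.sqrt(sum(b * b for b in v_actual))
    c = max(-1.0, min(1.0, dot / (na * nb)))  # clamp for numerical safety
    return math.degrees(math.acos(c))
```

For example, a pass tilted 45° off a vertical plan would give `angle_offset_deg((0, 0, 1), (0, 1, 1))` of 45°.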
Affiliation(s)
- Brian J Park
- Oregon Health and Science University School of Medicine, 3181 SW Sam Jackson Park Rd, Portland, OR 97239, USA
- Stephen J Hunt
- Perelman School of Medicine, University of Pennsylvania, 3400 Civic Center Blvd, Philadelphia, PA 19104, USA
- Gregory J Nadolski
- Perelman School of Medicine, University of Pennsylvania, 3400 Civic Center Blvd, Philadelphia, PA 19104, USA
- Terence P Gade
- Perelman School of Medicine, University of Pennsylvania, 3400 Civic Center Blvd, Philadelphia, PA 19104, USA

41
Value of the surgeon's sightline on hologram registration and targeting in mixed reality. Int J Comput Assist Radiol Surg 2020; 15:2027-2039. [PMID: 32984934 PMCID: PMC7671978 DOI: 10.1007/s11548-020-02263-3] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/28/2020] [Accepted: 09/14/2020] [Indexed: 12/12/2022]
Abstract
Purpose Mixed reality (MR) is being evaluated as a visual tool for surgical navigation. Current literature presents unclear results on intraoperative accuracy using the Microsoft HoloLens 1®. This study aims to assess the impact of the surgeon's sightline in an inside-out, marker-based MR navigation system for open surgery. Methods Surgeons at Akershus University Hospital tested this system. A custom-made phantom was used, containing 18 wire target crosses within its inner walls. A CT scan was obtained in order to segment all wire targets into a single 3D model (hologram). An in-house software application (CTrue), developed for the Microsoft HoloLens 1, uploaded 3D models and automatically registered the 3D model with the phantom. Based on the surgeon's sightline while registering and targeting (free sightline [F] or strictly perpendicular sightline [P]), 4 scenarios were developed (FF, PF, FP, PP). Target error distance (TED) was obtained in three different working axes (X, Y, Z). Results Six surgeons (5 males, age 29–62) were enrolled. A total of 864 measurements were collected in the 4 scenarios, each tested twice. Scenario PP showed the smallest TED in the X, Y, and Z axes (mean = 2.98 mm ± SD 1.33; 2.28 mm ± SD 1.45; 2.78 mm ± SD 1.91, respectively). Scenario FF showed the largest TED in the X, Y, and Z axes (mean = 10.03 mm ± SD 3.19; 6.36 mm ± SD 3.36; 16.11 mm ± SD 8.91, respectively). Multiple comparison tests, grouped by scenario and axis, showed that the majority of scenario comparisons had significantly different TED values (p < 0.05). The Y-axis always presented the smallest TED regardless of the scenario tested. Conclusion A strictly perpendicular working sightline in relation to the 3D model achieves the best accuracy results. Shortcomings in this technology as an intraoperative visual cue can be overcome by sightline correction. Incidentally, this is the preferred working angle for open surgery.
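The study above reports TED separately per working axis. If a single combined 3D error were wanted from the per-axis offsets, the Euclidean norm is the natural combination; this is our assumption for illustration, not a metric the study itself reports.

```python
# Sketch under a stated assumption: combine per-axis target error distances (mm)
# into one Euclidean 3D error. The study reports X, Y, and Z separately.
import math

def combined_ted(dx, dy, dz):
    """Euclidean norm of per-axis offsets, all in mm."""
    return math.sqrt(dx * dx + dy * dy + dz * dz)
```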
42
Scherl C, Stratemeier J, Karle C, Rotter N, Hesser J, Huber L, Dias A, Hoffmann O, Riffel P, Schoenberg SO, Schell A, Lammert A, Affolter A, Männle D. Augmented reality with HoloLens in parotid surgery: how to assess and to improve accuracy. Eur Arch Otorhinolaryngol 2020; 278:2473-2483. [PMID: 32910225 DOI: 10.1007/s00405-020-06351-7] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/26/2020] [Accepted: 08/31/2020] [Indexed: 11/28/2022]
Abstract
PURPOSE Augmented reality improves the planning and execution of surgical procedures. The aim of this study was to evaluate the feasibility of a 3D augmented reality hologram in live parotid surgery. Another goal was to develop an accuracy-measuring instrument and to determine the accuracy of the system. METHODS We created software to build and manually align 2D and 3D augmented reality models generated from MRI data onto the patient during surgery using the HoloLens® 1 (Microsoft Corporation, Redmond, USA). To assess the accuracy of the system, we developed a specific measuring tool based on a standard electromagnetic navigation device (Fiagon GmbH, Hennigsdorf, Germany). RESULTS The accuracy of our system was measured during real surgical procedures. Training of the experimenters and the use of fiducial markers significantly reduced the error of the holographic system (p = 0.0166 and p = 0.0132, respectively). The precision of the developed measuring system was very high, with a mean error of the basic system of 1.3 mm. Feedback evaluation demonstrated that 86% of participants agreed or strongly agreed that the HoloLens will play a role in surgical education. Furthermore, 80% of participants agreed or strongly agreed that the HoloLens is suitable for introduction into clinical routine and will play a role within surgery in the future. CONCLUSION The use of fiducial markers and repeated training reduces the positional error between the hologram and the real structures. The developed measuring device, used with the Fiagon navigation system, is suitable for measuring the accuracy of holographic augmented reality images from the HoloLens.
Affiliation(s)
- Claudia Scherl
- Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany
- Johanna Stratemeier
- Institute of Experimental Radiation Oncology, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany
- Celine Karle
- Institute of Experimental Radiation Oncology, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany
- Nicole Rotter
- Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany
- Jürgen Hesser
- Institute of Experimental Radiation Oncology, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany
- Lena Huber
- Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany
- Andre Dias
- Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany
- Oliver Hoffmann
- Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany
- Philipp Riffel
- Department of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany
- Stefan O Schoenberg
- Department of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany
- Angela Schell
- Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany
- Anne Lammert
- Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany
- Annette Affolter
- Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany
- David Männle
- Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany

43
Maloca PM, Faludi B, Zelechowski M, Jud C, Vollmar T, Hug S, Müller PL, de Carvalho ER, Zarranz-Ventura J, Reich M, Lange C, Egan C, Tufail A, Hasler PW, Scholl HPN, Cattin PC. Validation of virtual reality orbitometry bridges digital and physical worlds. Sci Rep 2020; 10:11815. [PMID: 32678297 PMCID: PMC7366721 DOI: 10.1038/s41598-020-68867-6] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/29/2020] [Accepted: 06/22/2020] [Indexed: 11/09/2022] Open
Abstract
Clinical science and medical imaging technology are traditionally displayed in two dimensions (2D) on a computer monitor. In contrast, three-dimensional (3D) virtual reality (VR) expands the realm of 2D image visualization, enabling an immersive VR experience with unhindered spatial interaction by the user. Thus far, analysis of data extracted from VR applications has been mainly qualitative. In this study, we enhance VR and provide evidence for quantitative VR research by validating the digital VR display of computed tomography (CT) data of the orbit. Volumetric CT data were transferred and rendered into a VR environment. Subsequently, seven graders performed repeated and blinded diameter measurements. The intergrader variability of the measurements in VR was much lower than that of measurements in the physical world, and the measurements were reasonably consistent with their corresponding elements in the real context. The overall VR measurements were 5.49% higher. As such, this study attests to the ability of VR to provide comparable quantitative data alongside the added benefit of VR interfaces. VR holds substantial potential for future research in ophthalmology and in any scientific field that uses three-dimensional data.
Affiliation(s)
- Peter M Maloca
- Institute of Molecular and Clinical Ophthalmology Basel (IOB), 4031, Basel, Switzerland; OCTlab, Department of Ophthalmology, University Hospital Basel, 4031, Basel, Switzerland; Department of Ophthalmology, University of Basel, 4031, Basel, Switzerland; Moorfields Eye Hospital NHS Foundation Trust, London, EC1V 2PD, UK
- Balázs Faludi
- Center for Medical Image Analysis & Navigation, University of Basel, 4031, Basel, Switzerland
- Marek Zelechowski
- Center for Medical Image Analysis & Navigation, University of Basel, 4031, Basel, Switzerland
- Christoph Jud
- Center for Medical Image Analysis & Navigation, University of Basel, 4031, Basel, Switzerland
- Theo Vollmar
- MRZ Medical Radiology Center, 6004, Lucerne, Switzerland
- Sibylle Hug
- MRZ Medical Radiology Center, 6004, Lucerne, Switzerland
- Philipp L Müller
- Moorfields Eye Hospital NHS Foundation Trust, London, EC1V 2PD, UK
- Michael Reich
- Faculty of Medicine, Eye Center, Albert-Ludwigs University Freiburg, 79085, Freiburg, Germany
- Clemens Lange
- Faculty of Medicine, Eye Center, Albert-Ludwigs University Freiburg, 79085, Freiburg, Germany
- Catherine Egan
- Moorfields Eye Hospital NHS Foundation Trust, London, EC1V 2PD, UK
- Adnan Tufail
- Moorfields Eye Hospital NHS Foundation Trust, London, EC1V 2PD, UK
- Pascal W Hasler
- OCTlab, Department of Ophthalmology, University Hospital Basel, 4031, Basel, Switzerland; Department of Ophthalmology, University of Basel, 4031, Basel, Switzerland
- Hendrik P N Scholl
- Institute of Molecular and Clinical Ophthalmology Basel (IOB), 4031, Basel, Switzerland; Department of Ophthalmology, University of Basel, 4031, Basel, Switzerland; Wilmer Eye Institute, Johns Hopkins University, Baltimore, 21287, USA
- Philippe C Cattin
- Center for Medical Image Analysis & Navigation, University of Basel, 4031, Basel, Switzerland
44
Fredrickson VL, Lin M, Catapano JS, Attenello FJ. Commentary: Clinical Accuracy of Holographic Navigation Using Point-Based Registration on Augmented-Reality Glasses. Oper Neurosurg (Hagerstown) 2020; 17:E229-E230. [PMID: 31515566] [DOI: 10.1093/ons/opz266] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Received: 06/10/2019] [Accepted: 06/16/2019] Open
Affiliation(s)
- Vance L Fredrickson
- Department of Neurological Surgery, Keck School of Medicine, University of Southern California, Los Angeles, California; Department of Neurosurgery, Barrow Neurological Institute, St. Joseph's Hospital and Medical Center, Phoenix, Arizona
- Michelle Lin
- Department of Neurological Surgery, Keck School of Medicine, University of Southern California, Los Angeles, California
- Joshua S Catapano
- Department of Neurosurgery, Barrow Neurological Institute, St. Joseph's Hospital and Medical Center, Phoenix, Arizona
- Frank J Attenello
- Department of Neurological Surgery, Keck School of Medicine, University of Southern California, Los Angeles, California