1
Qi Z, Jin H, Xu X, Wang Q, Gan Z, Xiong R, Zhang S, Liu M, Wang J, Ding X, Chen X, Zhang J, Nimsky C, Bopp MHA. Head model dataset for mixed reality navigation in neurosurgical interventions for intracranial lesions. Sci Data 2024; 11:538. PMID: 38796526; PMCID: PMC11127921; DOI: 10.1038/s41597-024-03385-y.
Abstract
Mixed reality navigation (MRN) technology is emerging as an increasingly significant and interesting topic in neurosurgery. MRN enables neurosurgeons to "see through" the head with an interactive, hybrid visualization environment that merges virtual- and physical-world elements. Offering immersive, intuitive, and reliable guidance for preoperative and intraoperative intervention of intracranial lesions, MRN showcases its potential as an economically efficient and user-friendly alternative to standard neuronavigation systems. However, the clinical research and development of MRN systems present challenges: recruiting a sufficient number of patients within a limited timeframe is difficult, and acquiring low-cost, commercially available, medically significant head phantoms is equally challenging. To accelerate the development of novel MRN systems and surmount these obstacles, the study presents a dataset designed for MRN system development and testing in neurosurgery. It includes CT and MRI data from 19 patients with intracranial lesions and derived 3D models of anatomical structures and validation references. The models are available in Wavefront object (OBJ) and Stereolithography (STL) formats, supporting the creation and assessment of neurosurgical MRN applications.
Affiliation(s)
- Ziyu Qi
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043, Marburg, Germany
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Haitao Jin
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- NCO School, Army Medical University, 050081, Shijiazhuang, China
- Xinghua Xu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Qun Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Zhichao Gan
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Ruochu Xiong
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Department of Neurosurgery, Division of Medicine, Graduate School of Medical Sciences, Kanazawa University, Takara-machi 13-1, 920-8641, Kanazawa, Ishikawa, Japan
- Shiyu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Minghang Liu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Jingyue Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Xinyu Ding
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Xiaolei Chen
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Jiashu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043, Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043, Marburg, Germany
- Miriam H A Bopp
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043, Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043, Marburg, Germany
2
Huerta Osnaya JR, Gonzalez Carranza V, Chico-Ponce de León F, Pérez-Escamirosa F, Lorias-Espinoza D. Image Guided Interpedicular Screw Placement Simulation System for Training and Skill Evaluation. Proof of Concept. World Neurosurg 2024:S1878-8750(24)00849-0. PMID: 38768749; DOI: 10.1016/j.wneu.2024.05.087.
Abstract
BACKGROUND The SpineST-01 system is an image-guided vertebral cannulation training system. During task execution, the computer calculates performance-based metrics and displays different visual perspectives (lateral, axial, and anteroposterior views) showing the position of the instrument inside the vertebra. Finally, a report with the metrics is generated as performance feedback. METHODS A training box holds a 3D-printed spine section. The computer works with 2 orthogonally disposed cameras that track passive markers placed on the instrument. Eight metrics were proposed to evaluate execution of the surgical task. A preliminary study with 25 participants divided into 3 groups (12 novices, 10 intermediates, and 3 experts) was conducted to determine the feasibility of the system and to assess the performance differences between groups using Kruskal-Wallis and Mann-Whitney U analyses. In both analyses, a P value ≤ 0.05 was considered statistically significant. RESULTS When comparing experts versus novices and all 3 groups, statistical analysis showed significant differences in 6 of the 8 metrics: axial angle error (°), lateral angle error (°), average speed (mm/second), progress between shots (mm), time (seconds), and number of shots. The metrics that did not show a statistically significant difference were time between shots (seconds) and speed between shots (mm/second). The comparison of average results also placed the experts as the best-performing group. CONCLUSIONS Initial testing of the SpineST-01 demonstrated the system's potential for practicing image-guided cannulation tasks on lumbar vertebrae. Results showed objective differences between experts, intermediates, and novices in the proposed metrics, making this system a feasible option for developing basic navigation skills without the risk of radiation exposure, while objectively evaluating task performance.
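The angular-error and average-speed metrics described in this abstract reduce to simple vector arithmetic over tracked instrument poses. A minimal sketch follows; the function names and sampling scheme are hypothetical illustrations, not the SpineST-01 implementation:

```python
import math

def angle_error_deg(planned, executed):
    """Angle (degrees) between a planned and an executed trajectory vector."""
    dot = sum(p * e for p, e in zip(planned, executed))
    norm = (math.sqrt(sum(p * p for p in planned))
            * math.sqrt(sum(e * e for e in executed)))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def average_speed_mm_s(positions, timestamps):
    """Average tip speed (mm/s) from sampled 3D positions (mm) and timestamps (s)."""
    path = sum(math.dist(positions[i], positions[i + 1])
               for i in range(len(positions) - 1))
    return path / (timestamps[-1] - timestamps[0])
```

For example, an executed trajectory tilted 45° off the planned axial direction yields `angle_error_deg((0, 0, 1), (0, 1, 1)) ≈ 45.0`.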
Affiliation(s)
- José Rubén Huerta Osnaya
- Departamento de Ingeniería Eléctrica, Sección de Bioelectrónica, Centro de Investigación y de Estudios Avanzados del Instituto Politécnico Nacional (Cinvestav), México, Mexico
- Fernando Pérez-Escamirosa
- Instituto de Ciencias Aplicadas y Tecnología (ICAT), Universidad Nacional Autónoma de México (UNAM), México, Mexico
- Daniel Lorias-Espinoza
- Departamento de Ingeniería Eléctrica, Sección de Bioelectrónica, Centro de Investigación y de Estudios Avanzados del Instituto Politécnico Nacional (Cinvestav), México, Mexico
3
Polt M, Viehöfer AF, Casari FA, Imhoff FB, Wirth SH, Zimmermann SM. Conventional vs Augmented Reality-Guided Lateral Calcaneal Lengthening Simulated in a Foot Bone Model. Foot Ankle Int 2024:10711007241237532. PMID: 38501722; DOI: 10.1177/10711007241237532.
Abstract
BACKGROUND Acquired adult flatfoot deformity (AAFD) results in a loss of the medial longitudinal arch of the foot and dysfunction of the posteromedial soft tissues. Hintermann osteotomy (H-O) is often used to treat stage II AAFD. The procedure is challenging because of variations in the subtalar facets and limited intraoperative visibility. We aimed to assess the impact of augmented reality (AR) guidance on surgical accuracy and the rate of facet violation. METHODS Sixty AR-guided and 60 conventional osteotomies were performed on foot bone models. For the AR osteotomies, the ideal osteotomy plane was uploaded to a Microsoft HoloLens 1 headset and the cut was carried out in strict accordance with the superimposed holographic plane. The conventional osteotomies were performed relying solely on the anatomy of the calcaneal lateral column. The rate and severity of facet joint violation were measured, as well as the accuracy of entry and exit points. The results were compared between AR-guided and conventional osteotomies, and between experienced and inexperienced surgeons. RESULTS Experienced surgeons showed significantly greater accuracy for the osteotomy entry point using AR, with a mean deviation of 1.6 ± 0.9 mm (95% CI 1.26, 1.93) compared with 2.3 ± 1.3 mm (95% CI 1.87, 2.79) for the conventional method (P = .035). Inexperienced surgeons also showed improved accuracy, although not statistically significant (P = .064), with a mean deviation of 2.0 ± 1.5 mm (95% CI 1.47, 2.55) using AR compared with 2.7 ± 1.6 mm (95% CI 2.18, 3.32) for the conventional method. AR helped the experienced surgeons avoid full violation of the posterior facet (P = .011). Inexperienced surgeons had a higher rate of middle and posterior facet injury with both methods (P = .005 and .021). CONCLUSION Application of AR guidance during H-O was associated with improved accuracy for experienced surgeons, demonstrated by better accuracy of the osteotomy entry point. More crucially, AR guidance prevented full violation of the posterior facet in the experienced group. Further research is needed to address limitations and to test this technology on cadaver feet. Ultimately, the use of AR in surgery has the potential to improve patient and surgeon safety while minimizing radiation exposure. CLINICAL RELEVANCE Subtalar facet injury during lateral column lengthening osteotomy represents a real problem in clinical orthopaedic practice. Because of limited intraoperative visibility and variable anatomy, this issue is hard to resolve with conventional means. This study suggests the potential of augmented reality to improve osteotomy accuracy.
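The mean ± SD and 95% CI figures reported above are related by standard formulas. A minimal sketch, assuming a normal-approximation interval with z = 1.96 (the study may instead have used a t-based interval):

```python
import math

def mean_sd_ci95(errors):
    """Mean, sample SD, and an approximate 95% CI for a list of deviations (mm)."""
    n = len(errors)
    mean = sum(errors) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in errors) / (n - 1))
    half = 1.96 * sd / math.sqrt(n)  # normal approximation (assumption)
    return mean, sd, (mean - half, mean + half)
```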
Affiliation(s)
- Maksym Polt
- Department of Orthopaedics, Balgrist University Hospital, University of Zürich, Zürich, Switzerland
- Arnd F Viehöfer
- Department of Orthopaedics, Balgrist University Hospital, University of Zürich, Zürich, Switzerland
- Fabio A Casari
- Department of Orthopaedics, Balgrist University Hospital, University of Zürich, Zürich, Switzerland
- Florian B Imhoff
- Department of Orthopaedics, Balgrist University Hospital, University of Zürich, Zürich, Switzerland
- Stephan H Wirth
- Department of Orthopaedics, Balgrist University Hospital, University of Zürich, Zürich, Switzerland
- Stefan M Zimmermann
- Department of Orthopaedics, Balgrist University Hospital, University of Zürich, Zürich, Switzerland
4
Youssef S, McDonnell JM, Wilson KV, Turley L, Cunniffe G, Morris S, Darwish S, Butler JS. Accuracy of augmented reality-assisted pedicle screw placement: a systematic review. Eur Spine J 2024; 33:974-984. PMID: 38177834; DOI: 10.1007/s00586-023-08094-5.
Abstract
OBJECTIVE Conventional freehand methods of pedicle screw placement are associated with significant complications due to the close proximity of neural and vascular structures. Recent advances in augmented reality surgical navigation (ARSN) have led to its adoption into spine surgery. However, little is known regarding its overall accuracy. The purpose of this study is to delineate the overall accuracy of ARSN pedicle screw placement across various models. METHODS A systematic review was conducted of the Medline/PubMed, Cochrane, and Embase Library databases according to the PRISMA guidelines. Relevant data extracted included reports of pedicle screw placement accuracy and breaches, as defined by the Gertzbein-Robbins classification, in addition to deviation from the pre-planned trajectory and entry point. Accuracy was defined as the summation of grade 0 and grade 1 events per the Gertzbein-Robbins classification. RESULTS Twenty studies reported clinically accurate screw placement. The rate of clinically accurately placed screws ranged from 26.3% to 100%, with 2095 screws (93.1%) deemed clinically accurate. Furthermore, 5.4% (112/2088) of screws were reported as grade 2 breaches, 1.6% (33/2088) as grade 3 breaches, 3.1% (29/926) as medial breaches, and 2.3% (21/926) as lateral breaches. Mean linear deviation ranged from 1.3 to 5.99 mm, while mean angular/trajectory deviation ranged from 1.6° to 5.88°. CONCLUSION The results of this study highlight the overall accuracy of ARSN pedicle screw placement. However, further robust prospective studies are needed to accurately compare it with conventional methods of pedicle screw placement.
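The review's accuracy definition (grade 0 plus grade 1 on the Gertzbein-Robbins scale) can be expressed directly. The grade counts in the example below are hypothetical, not the review's pooled data:

```python
def gertzbein_robbins_accuracy(grade_counts):
    """Clinical accuracy (%) per the Gertzbein-Robbins scale: grades 0
    (no breach) and 1 (breach < 2 mm) count as clinically accurate."""
    total = sum(grade_counts.values())
    accurate = grade_counts.get(0, 0) + grade_counts.get(1, 0)
    return 100.0 * accurate / total
```

For instance, with hypothetical counts `{0: 180, 1: 10, 2: 7, 3: 3}` the clinical accuracy is 190/200 = 95.0%.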
Affiliation(s)
- Salma Youssef
- School of Medicine, University College Dublin, Belfield, Dublin, Ireland
- Jake M McDonnell
- National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
- Trinity Biomedical Sciences Institute, Trinity College Dublin, Dublin, Ireland
- Kielan V Wilson
- School of Medicine, University College Dublin, Belfield, Dublin, Ireland
- National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
- Luke Turley
- Department of Orthopaedics, Tallaght University Hospital, Tallaght, Dublin, Ireland
- Gráinne Cunniffe
- National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
- Seamus Morris
- School of Medicine, University College Dublin, Belfield, Dublin, Ireland
- National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
- Stacey Darwish
- National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
- Department of Orthopaedics, St. Vincent's University Hospital, Dublin, Ireland
- Joseph S Butler
- School of Medicine, University College Dublin, Belfield, Dublin, Ireland
- National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
5
Hoch A, Liebmann F, Farshad M, Fürnstahl P, Rahm S, Zingg PO. Augmented reality-guided pelvic osteotomy of Ganz: feasibility in cadavers. Arch Orthop Trauma Surg 2024; 144:1077-1089. PMID: 38133802; PMCID: PMC10896923; DOI: 10.1007/s00402-023-05167-4.
Abstract
INTRODUCTION The periacetabular osteotomy (PAO) is a technically demanding procedure with the goal of improving the osseous containment of the femoral head. The options for controlled execution of the osteotomies and verification of the acetabular reorientation are limited. With the assistance of augmented reality (AR), new possibilities are emerging to guide this intervention. However, scientific knowledge regarding AR navigation for PAO is sparse. METHODS In this cadaveric study, we aimed to determine whether execution of this complex procedure is feasible with AR guidance, to quantify the accuracy with which the three-dimensional plan is executed, and to identify what must be done to proceed to real surgery. To this end, AR guidance for the PAO was developed and applied to 14 human hip cadavers. The guidance included performance of the four osteotomies and reorientation of the acetabular fragment. The osteotomy starting points, the orientation of the osteotomy planes, and the reorientation of the acetabular fragment were compared with the 3D plan. RESULTS The mean 3D distance between planned and performed starting points was between 9 and 17 mm. The mean angle between planned and performed osteotomies was between 6° and 7°. The mean reorientation error between the planned and performed rotation of the acetabular fragment was between 2° and 11°. CONCLUSION The planned correction can be achieved with promising accuracy and without serious errors. Further steps for a translation from the cadaver to the patient have been identified and must be addressed in future work.
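The reported comparisons (3D distance between starting points, angle between osteotomy planes) reduce to standard vector arithmetic. A minimal sketch with hypothetical function names, not the study's implementation:

```python
import math

def point_error_mm(planned, performed):
    """3D Euclidean distance (mm) between planned and performed osteotomy start points."""
    return math.dist(planned, performed)

def plane_angle_deg(n1, n2):
    """Angle (degrees) between two osteotomy planes given their normal vectors.
    The absolute dot product makes the result independent of normal orientation."""
    dot = abs(sum(a * b for a, b in zip(n1, n2)))
    norm = (math.sqrt(sum(a * a for a in n1))
            * math.sqrt(sum(b * b for b in n2)))
    return math.degrees(math.acos(min(1.0, dot / norm)))
```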
Affiliation(s)
- Armando Hoch
- Department of Orthopaedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
- Florentin Liebmann
- Research in Orthopaedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Mazda Farshad
- Department of Orthopaedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
- Philipp Fürnstahl
- Research in Orthopaedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Stefan Rahm
- Department of Orthopaedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
- Patrick O Zingg
- Department of Orthopaedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
6
Qi Z, Jin H, Wang Q, Gan Z, Xiong R, Zhang S, Liu M, Wang J, Ding X, Chen X, Zhang J, Nimsky C, Bopp MHA. The Feasibility and Accuracy of Holographic Navigation with Laser Crosshair Simulator Registration on a Mixed-Reality Display. Sensors (Basel) 2024; 24:896. PMID: 38339612; PMCID: PMC10857152; DOI: 10.3390/s24030896.
Abstract
Addressing conventional neurosurgical navigation systems' high costs and complexity, this study explores the feasibility and accuracy of a simplified, cost-effective mixed reality navigation (MRN) system based on a laser crosshair simulator (LCS). A new automatic registration method was developed, featuring coplanar laser emitters and a recognizable target pattern. The workflow was integrated into Microsoft's HoloLens-2 for practical application. The study assessed the system's precision by utilizing life-sized 3D-printed head phantoms based on computed tomography (CT) or magnetic resonance imaging (MRI) data from 19 patients (female/male: 7/12, average age: 54.4 ± 18.5 years) with intracranial lesions. Six to seven CT/MRI-visible scalp markers were used as reference points per case. The LCS-MRN's accuracy was evaluated through landmark-based and lesion-based analyses, using metrics such as target registration error (TRE) and Dice similarity coefficient (DSC). The system demonstrated immersive capabilities for observing intracranial structures across all cases. Analysis of 124 landmarks showed a TRE of 3.0 ± 0.5 mm, consistent across various surgical positions. The DSC of 0.83 ± 0.12 correlated significantly with lesion volume (Spearman rho = 0.813, p < 0.001). Therefore, the LCS-MRN system is a viable tool for neurosurgical planning, highlighting its low user dependency, cost-efficiency, and accuracy, with prospects for future clinical application enhancements.
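The two evaluation metrics above, TRE and DSC, can be sketched in a few lines. The landmark-list and voxel-set representations below are assumptions for illustration, not the study's implementation:

```python
import math

def target_registration_error(registered, reference):
    """Mean TRE (mm): average distance between corresponding registered
    and reference landmark positions."""
    return sum(math.dist(a, b)
               for a, b in zip(registered, reference)) / len(reference)

def dice_coefficient(a, b):
    """Dice similarity coefficient between two voxel sets
    (e.g. holographic lesion overlay vs. ground-truth segmentation)."""
    a, b = set(a), set(b)
    return 2.0 * len(a & b) / (len(a) + len(b))
```

A DSC of 1.0 means perfect overlap; the 0.83 ± 0.12 reported above indicates substantial but imperfect hologram-to-lesion agreement.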
Affiliation(s)
- Ziyu Qi
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Haitao Jin
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- NCO School, Army Medical University, Shijiazhuang 050081, China
- Qun Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Zhichao Gan
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Ruochu Xiong
- Department of Neurosurgery, Division of Medicine, Graduate School of Medical Sciences, Kanazawa University, Takara-machi 13-1, Kanazawa 920-8641, Japan
- Shiyu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Minghang Liu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Jingyue Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Xinyu Ding
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Xiaolei Chen
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Jiashu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Miriam H. A. Bopp
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
7
Zhao X, Zhao H, Zheng W, Gohritz A, Shen Y, Xu W. Clinical evaluation of augmented reality-based 3D navigation system for brachial plexus tumor surgery. World J Surg Oncol 2024; 22:20. PMID: 38233922; PMCID: PMC10792838; DOI: 10.1186/s12957-023-03288-z.
Abstract
BACKGROUND Augmented reality (AR), a form of 3D imaging technology, has been preliminarily applied in tumor surgery of the head and spine, both of which are rigid bodies. However, research evaluating the clinical value of AR in tumor surgery of the brachial plexus, a non-rigid body whose anatomical position varies with patient posture, is lacking. METHODS Prior to surgery in 8 patients diagnosed with brachial plexus tumors, conventional MRI scans were performed to obtain conventional 2D MRI images. The MRI data were then differentiated automatically and converted into AR-based 3D models. After point-to-point relocation and registration, the 3D models were projected onto the patient's body using a head-mounted display for navigation. To evaluate the clinical value of the AR-based 3D models compared with the conventional 2D MRI images, 2 senior hand surgeons completed questionnaires on the evaluation of anatomical structures (tumor, arteries, veins, nerves, bones, and muscles), scored from 1 (strongly disagree) to 5 (strongly agree). RESULTS Surgeons rated the AR-based 3D models as superior to conventional MRI images for all anatomical structures, including tumors. Furthermore, the AR-based 3D models were preferred for preoperative planning and intraoperative navigation, demonstrating their added value. The mean positional error between the 3D models and intraoperative findings was approximately 1 cm. CONCLUSIONS This study evaluated, for the first time, the clinical value of an AR-based 3D navigation system in preoperative planning and intraoperative navigation for brachial plexus tumor surgery. By providing more direct spatial visualization than conventional 2D MRI images, this 3D navigation system improved the clinical accuracy and safety of tumor surgery in non-rigid bodies.
Affiliation(s)
- Xuanyu Zhao
- Department of Hand and Upper Extremity Surgery, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Department of Hand Surgery, Huashan Hospital, Fudan University, Shanghai, China
- Huali Zhao
- Department of Radiology, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Wanling Zheng
- Department of Hand and Upper Extremity Surgery, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Department of Hand Surgery, Huashan Hospital, Fudan University, Shanghai, China
- Andreas Gohritz
- Department of Plastic, Reconstructive, Aesthetic and Hand Surgery, University Hospital Basel, University of Basel, Basel, Switzerland
- Yundong Shen
- Department of Hand and Upper Extremity Surgery, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Department of Hand Surgery, Huashan Hospital, Fudan University, Shanghai, China
- The National Clinical Research Center for Aging and Medicine, Fudan University, Shanghai, China
- Wendong Xu
- Department of Hand and Upper Extremity Surgery, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Department of Hand Surgery, Huashan Hospital, Fudan University, Shanghai, China
- The National Clinical Research Center for Aging and Medicine, Fudan University, Shanghai, China
- Institute of Brain Science, State Key Laboratory of Medical Neurobiology and Collaborative Innovation Center for Brain Science, Fudan University, Shanghai, China
- Research Unit of Synergistic Reconstruction of Upper and Lower Limbs after Brain Injury, Chinese Academy of Medical Sciences, Beijing, China
8
Adida S, Legarreta AD, Hudson JS, McCarthy D, Andrews E, Shanahan R, Taori S, Lavadi RS, Buell TJ, Hamilton DK, Agarwal N, Gerszten PC. Machine Learning in Spine Surgery: A Narrative Review. Neurosurgery 2024; 94:53-64. PMID: 37930259; DOI: 10.1227/neu.0000000000002660.
Abstract
Artificial intelligence and machine learning (ML) can offer revolutionary advances in their application to the field of spine surgery. Within the past 5 years, novel applications of ML have assisted in surgical decision-making, intraoperative imaging and navigation, and optimization of clinical outcomes. ML has the capacity to address many different clinical needs and improve diagnostic and surgical techniques. This review will discuss current applications of ML in the context of spine surgery by breaking down its implementation preoperatively, intraoperatively, and postoperatively. Ethical considerations to ML and challenges in ML implementation must be addressed to maximally benefit patients, spine surgeons, and the healthcare system. Areas for future research in augmented reality and mixed reality, along with limitations in generalizability and bias, will also be highlighted.
Affiliation(s)
- Samuel Adida
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Andrew D Legarreta
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Joseph S Hudson
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- David McCarthy
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Edward Andrews
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Regan Shanahan
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Suchet Taori
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Raj Swaroop Lavadi
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Thomas J Buell
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- D Kojo Hamilton
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Nitin Agarwal
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Department of Neurosurgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania, USA
- Peter C Gerszten
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
9
Liebmann F, von Atzigen M, Stütz D, Wolf J, Zingg L, Suter D, Cavalcanti NA, Leoty L, Esfandiari H, Snedeker JG, Oswald MR, Pollefeys M, Farshad M, Fürnstahl P. Automatic registration with continuous pose updates for marker-less surgical navigation in spine surgery. Med Image Anal 2024; 91:103027. PMID: 37992494; DOI: 10.1016/j.media.2023.103027.
Abstract
Established surgical navigation systems for pedicle screw placement have been proven to be accurate, but still reveal limitations in registration or surgical guidance. Registration of preoperative data to the intraoperative anatomy remains a time-consuming, error-prone task that includes exposure to harmful radiation. Surgical guidance through conventional displays has well-known drawbacks, as information cannot be presented in-situ and from the surgeon's perspective. Consequently, radiation-free and more automatic registration methods with subsequent surgeon-centric navigation feedback are desirable. In this work, we present a marker-less approach that automatically solves the registration problem for lumbar spinal fusion surgery in a radiation-free manner. A deep neural network was trained to segment the lumbar spine and simultaneously predict its orientation, yielding an initial pose for preoperative models, which is then refined for each vertebra individually and updated in real-time with GPU acceleration while handling surgeon occlusions. Intuitive surgical guidance is provided thanks to the integration into an augmented reality based navigation system. The registration method was verified on a public dataset with a median registration success rate of 100%, a median target registration error of 2.7 mm, a median screw trajectory error of 1.6°, and a median screw entry point error of 2.3 mm. Additionally, the whole pipeline was validated in an ex-vivo surgery, yielding 100% screw accuracy and a median target registration error of 1.0 mm. Our results meet clinical demands and emphasize the potential of RGB-D data for fully automatic registration approaches in combination with augmented reality guidance.
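A per-case summary of the kind reported above (registration success rate plus median TRE) might be sketched as follows. The 3 mm success threshold is a hypothetical placeholder; the paper's actual success criterion is not stated in this abstract:

```python
import statistics

def registration_summary(results, tre_threshold_mm=3.0):
    """Summarize per-case registration outcomes: percentage of cases with
    TRE at or below a (hypothetical) clinical threshold, and median TRE (mm)."""
    tres = [r["tre_mm"] for r in results]
    success_rate = 100.0 * sum(t <= tre_threshold_mm for t in tres) / len(tres)
    return success_rate, statistics.median(tres)
```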
Affiliation(s)
- Florentin Liebmann
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland.
- Marco von Atzigen
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland
- Dominik Stütz
- Computer Vision and Geometry Group, ETH Zurich, Zurich, Switzerland
- Julian Wolf
- Product Development Group, ETH Zurich, Zurich, Switzerland
- Lukas Zingg
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Daniel Suter
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Nicola A Cavalcanti
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Laura Leoty
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Hooman Esfandiari
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Jess G Snedeker
- Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland
- Martin R Oswald
- Computer Vision and Geometry Group, ETH Zurich, Zurich, Switzerland; Computer Vision Lab, University of Amsterdam, Amsterdam, Netherlands
- Marc Pollefeys
- Computer Vision and Geometry Group, ETH Zurich, Zurich, Switzerland; Microsoft Mixed Reality and AI Zurich Lab, Zurich, Switzerland
- Mazda Farshad
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Philipp Fürnstahl
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
10
Qi Z, Bopp MHA, Nimsky C, Chen X, Xu X, Wang Q, Gan Z, Zhang S, Wang J, Jin H, Zhang J. A Novel Registration Method for a Mixed Reality Navigation System Based on a Laser Crosshair Simulator: A Technical Note. Bioengineering (Basel) 2023; 10:1290. [PMID: 38002414 PMCID: PMC10669875 DOI: 10.3390/bioengineering10111290] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/14/2023] [Accepted: 11/01/2023] [Indexed: 11/26/2023] Open
Abstract
Mixed Reality Navigation (MRN) is pivotal in augmented reality-assisted intelligent neurosurgical interventions. However, existing MRN registration methods face challenges in concurrently achieving low user dependency, high accuracy, and clinical applicability. This study proposes and evaluates a novel registration method based on a laser crosshair simulator, designed to replicate the scanner frame's position on the patient, and assesses its feasibility and accuracy. The system autonomously calculates the transformation mapping coordinates from the tracking space to the reference image space. A mathematical model and workflow for registration were designed, and a Universal Windows Platform (UWP) application was developed on HoloLens 2. Finally, a head phantom was used to measure the system's target registration error (TRE). The proposed method was successfully implemented, obviating the need for user interaction with virtual objects during the registration process. Regarding accuracy, the average deviation was 3.7 ± 1.7 mm. This method shows encouraging results in efficiency and intuitiveness and marks a valuable advancement in low-cost, easy-to-use MRN systems. Its potential for enhancing accuracy and adaptability in interventional procedures makes this approach promising for improving surgical outcomes.
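The tracking-to-image mapping described above can be sketched as a chain of rigid transforms through the simulated scanner frame: image ← frame ← tracking. The 4×4 matrices and probe point below are hypothetical placeholders, not the study's actual calibration:

```python
def matmul4(a, b):
    """Multiply two 4x4 homogeneous transforms (row-major nested lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)] for i in range(4)]

def invert_rigid(t):
    """Invert a rigid transform: R' = R^T, t' = -R^T t."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]
    tr = [-sum(r[i][k] * t[k][3] for k in range(3)) for i in range(3)]
    return [r[0] + [tr[0]], r[1] + [tr[1]], r[2] + [tr[2]], [0.0, 0.0, 0.0, 1.0]]

def apply(t, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    v = (p[0], p[1], p[2], 1.0)
    return tuple(sum(t[i][k] * v[k] for k in range(4)) for i in range(3))

# Hypothetical calibration: pose of the simulated scanner frame in tracking space,
# and the known pose of the same frame in the reference image space (identity here).
T_tracking_from_frame = [
    [0.0, -1.0, 0.0, 100.0],
    [1.0,  0.0, 0.0,  50.0],
    [0.0,  0.0, 1.0,  20.0],
    [0.0,  0.0, 0.0,   1.0],
]
T_image_from_frame = [
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
]

# Chain: image <- frame <- tracking
T_image_from_tracking = matmul4(T_image_from_frame, invert_rigid(T_tracking_from_frame))

probe_in_tracking = (120.0, 60.0, 30.0)
print(apply(T_image_from_tracking, probe_in_tracking))
```

Because both poses of the frame are known (one tracked, one fixed by the scan geometry), no manual alignment of virtual objects is needed, which matches the hands-free registration the abstract describes.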
Affiliation(s)
- Ziyu Qi
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Miriam H. A. Bopp
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Xiaolei Chen
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Xinghua Xu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Qun Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Zhichao Gan
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Shiyu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Jingyue Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Haitao Jin
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- NCO School, Army Medical University, Shijiazhuang 050081, China
- Jiashu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
11
Pose-Díez-de-la-Lastra A, Ungi T, Morton D, Fichtinger G, Pascau J. Real-time integration between Microsoft HoloLens 2 and 3D Slicer with demonstration in pedicle screw placement planning. Int J Comput Assist Radiol Surg 2023; 18:2023-2032. [PMID: 37310561 PMCID: PMC10589185 DOI: 10.1007/s11548-023-02977-0] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/14/2023] [Accepted: 05/23/2023] [Indexed: 06/14/2023]
Abstract
PURPOSE To date, there has been a lack of software infrastructure to connect 3D Slicer to any augmented reality (AR) device. This work describes a novel connection approach using Microsoft HoloLens 2 and OpenIGTLink, with a demonstration in pedicle screw placement planning. METHODS We developed an AR application in Unity that is wirelessly rendered onto Microsoft HoloLens 2 using Holographic Remoting. Simultaneously, Unity connects to 3D Slicer using the OpenIGTLink communication protocol. Geometrical transform and image messages are transferred between the two platforms in real time. Through the AR glasses, a user visualizes a patient's computed tomography overlaid onto virtual 3D models of anatomical structures. We technically evaluated the system by measuring message transfer latency between the platforms. Its functionality was assessed in pedicle screw placement planning: six volunteers planned the position and orientation of pedicle screws with the AR system and on a 2D desktop planner, and we compared the placement accuracy of each screw with both methods. Finally, we administered a questionnaire to all participants to assess their experience with the AR system. RESULTS The latency in message exchange is sufficiently low to enable real-time communication between the platforms. The AR method was non-inferior to the 2D desktop planner, with a mean error of 2.1 ± 1.4 mm. Moreover, 98% of the screw placements performed with the AR system were successful according to the Gertzbein-Robbins scale. The average questionnaire score was 4.5/5. CONCLUSIONS Real-time communication between Microsoft HoloLens 2 and 3D Slicer is feasible and supports accurate planning for pedicle screw placement.
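The latency evaluation described above can be illustrated with a generic round-trip timing harness. The no-op exchange below is a stand-in for a real OpenIGTLink transform/echo cycle (the actual protocol API is not reproduced here), so only the measurement pattern is shown:

```python
import statistics
import time

def measure_roundtrip(send_fn, n=100):
    """Time n request/response exchanges and summarize latency in milliseconds."""
    samples = []
    for _ in range(n):
        t0 = time.perf_counter()
        send_fn()  # send one message and wait for its acknowledgement/echo
        samples.append((time.perf_counter() - t0) * 1000.0)
    return {
        "mean_ms": statistics.fmean(samples),
        "median_ms": statistics.median(samples),
        "max_ms": max(samples),
    }

# Stand-in for a real OpenIGTLink exchange: a no-op round trip.
stats = measure_roundtrip(lambda: None, n=50)
print(stats)
```

Reporting median and maximum alongside the mean is useful here, since occasional wireless stalls inflate the tail without moving the mean much.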
Affiliation(s)
- Tamas Ungi
- Laboratory for Percutaneous Surgery, School of Computing, Queen's University, Kingston, ON, K7M2N8, Canada
- David Morton
- Laboratory for Percutaneous Surgery, School of Computing, Queen's University, Kingston, ON, K7M2N8, Canada
- Gabor Fichtinger
- Laboratory for Percutaneous Surgery, School of Computing, Queen's University, Kingston, ON, K7M2N8, Canada
- Javier Pascau
- Departamento de Bioingeniería, Universidad Carlos III de Madrid, 28911, Leganés, Spain
12
Jabbary Aslany F, McBain K, Chen L, O'Brien J, Noel GPJC. Comparison between pre-mortem and post-mortem cadaveric images for use with augmented reality headsets during dissection. Surg Radiol Anat 2023; 45:1311-1319. [PMID: 37698598 DOI: 10.1007/s00276-023-03239-z] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/10/2023] [Accepted: 08/28/2023] [Indexed: 09/13/2023]
Abstract
PURPOSE Medical training has undergone many transformations to incorporate diagnostic imaging alongside anatomical education. Post-mortem computed tomography (CT) scanning of body donors prior to dissection has been proposed; however, it poses challenges secondary to the embalming process and other post-mortem physiological changes that significantly alter imaging quality. The purposes of this study were to compare the accuracy of pathology identification on pre- and post-mortem CT scans of body donors and to assess the integration of those scans into a dissection-based course, where the images were overlaid onto body donors using augmented reality (AR). METHODS Participants in this study included 35 fourth-year medical students, 5 radiology residents, and 3 radiologists. A convergent, parallel mixed-methods design was employed, with quantitative measures that included statistical analyses of a double-blinded comparison of pathological lesion recognition on both image sets, the group responses to a participant survey, and the login access data from the imaging repository. The study also included qualitative analysis of post-elective structured interviews. RESULTS The double-blinded comparison revealed that staff radiologists could identify, on post-mortem images, only 54.8% of the pathologies that they were able to detect on the pre-mortem scans. Analyses of the surveys and login access data revealed that 60% of radiology residents and 56% of students preferred pre-mortem scans and used those scans more often than post-mortem scans (67 accesses vs. 36, respectively). However, post-mortem scans were significantly preferred when overlaid onto body donors using AR (p = 0.0047). CONCLUSION These results show that post-mortem imaging can be valuable alongside pre-mortem imaging, as it offers the greatest concordance between the anatomical structures and pathologies seen on the images and what is being dissected.
Affiliation(s)
- Kimberly McBain
- School of Physical and Occupational Therapy, McGill University, Montreal, QC, Canada
- Liang Chen
- Faculty of Medicine and Health Sciences, McGill University, Montréal, QC, Canada
- Jeremy O'Brien
- Department of Diagnostic Radiology, McGill University, Montreal, QC, Canada
- Geoffroy P J C Noel
- Division of Anatomical Sciences, Department of Anatomy and Cell Biology, McGill University, Montreal, QC, Canada
- Institute of Health Sciences Education, Faculty of Medicine, McGill University, Montreal, QC, Canada
- Division of Anatomy, Department of Surgery, School of Medicine, Medical Teaching Facility, University of California, 9500 Gilman Dr., La Jolla, San Diego, CA, 92093-0604, USA
13
Shahzad H, Bhatti NS, Phillips FM, Khan SN. Applications of Augmented Reality in Orthopaedic Spine Surgery. J Am Acad Orthop Surg 2023; 31:e601-e609. [PMID: 37105182 DOI: 10.5435/jaaos-d-23-00023] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/10/2023] [Accepted: 03/27/2023] [Indexed: 04/29/2023] Open
Abstract
The application of augmented reality (AR) in surgical settings has primarily been as a navigation tool in the operating room because of its ease of use and minimal effect on surgical procedures. The surgeon can directly face the surgical field while viewing 3D anatomy virtually, reducing the need to look at an external display such as a navigation system. Applications of AR are being explored in spine surgery. The basic principles of AR include data preparation, registration, tracking, and visualization. The current literature provides sufficient preclinical and clinical evidence for the use of AR technology in spine surgery. AR systems are efficient assistive devices, providing greater accuracy for insertion points, more comfort for surgeons, and reduced operating time. AR technology also has beneficial applications in surgical training, education, and telementorship for spine surgery. However, the costs associated with specially designed imaging equipment and physicians' comfort in using this technology remain barriers to its adoption. As this technology evolves toward more widespread use, future applications will be directed by the cost-effectiveness of AR-assisted surgeries.
Affiliation(s)
- Hania Shahzad
- From the Department of Orthopedics, The Ohio State University, Wexner Medical Center, Columbus, OH (Shahzad, Bhatti, and Khan) and Department of Orthopedics, Rush University Medical Center, Chicago, IL (Phillips)
14
Bhatt FR, Orosz LD, Tewari A, Boyd D, Roy R, Good CR, Schuler TC, Haines CM, Jazini E. Augmented Reality-Assisted Spine Surgery: An Early Experience Demonstrating Safety and Accuracy with 218 Screws. Global Spine J 2023; 13:2047-2052. [PMID: 35000409 PMCID: PMC10556900 DOI: 10.1177/21925682211069321] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/17/2022] Open
Abstract
STUDY DESIGN Prospective cohort study. OBJECTIVES In spine surgery, accurate screw guidance is critical to achieving satisfactory fixation. Augmented reality (AR) is a novel technology to assist in screw placement and has shown promising results in early studies. This study aims to provide our early experience evaluating safety and efficacy with a Food and Drug Administration-approved head-mounted device augmented reality (HMD-AR) system. METHODS Consecutive adult patients undergoing AR-assisted thoracolumbar fusion between October 2020 and August 2021 with 2-week follow-up were included. Preoperative, intraoperative, and postoperative data were collected, including demographics, complications, revision surgeries, and AR performance. Intraoperative 3D imaging was used to assess screw accuracy using the Gertzbein-Robbins (G-R) grading scale. RESULTS Thirty-two patients (40.6% male) were included, with a total of 222 screws executed using HMD-AR. Intraoperatively, 4 (1.8%) were deemed misplaced and revised using AR or freehand; the remaining 218 (98.2%) screws were placed accurately. There were no intraoperative adverse events or complications, and AR was not abandoned in any case. Of the 208 AR-placed screws with 3D imaging confirmation, 97.1% were considered clinically accurate (91.8% Grade A, 5.3% Grade B). There were no early postoperative surgical complications or revision surgeries during the 2-week follow-up. CONCLUSIONS This early experience study reports an overall G-R accuracy of 97.1% across 218 AR-guided screws with no intraoperative or early postoperative complications. This shows that HMD-AR-assisted spine surgery is a safe and accurate tool for pedicle, cortical, and pelvic fixation. Larger studies are needed to continue to support this compelling evolution in spine surgery.
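A minimal sketch of the Gertzbein-Robbins (G-R) grading referenced above, using the conventional 2-mm breach bands with grades A and B counted as "clinically accurate". Grade-boundary conventions vary slightly between reports, and the breach values below are hypothetical, not the study's data:

```python
def gertzbein_robbins_grade(breach_mm):
    """Grade a pedicle screw by cortical breach distance (mm)."""
    if breach_mm <= 0:
        return "A"  # fully intra-pedicular, no breach
    if breach_mm < 2:
        return "B"
    if breach_mm < 4:
        return "C"
    if breach_mm < 6:
        return "D"
    return "E"

def clinical_accuracy(breaches):
    """Fraction of screws graded A or B (the usual clinical-accuracy cutoff)."""
    grades = [gertzbein_robbins_grade(b) for b in breaches]
    return sum(g in ("A", "B") for g in grades) / len(grades)

# Hypothetical breach measurements (mm) for a small series of screws.
breaches = [0.0, 0.0, 1.2, 0.0, 2.5, 0.0, 0.8]
print(f"A/B accuracy: {clinical_accuracy(breaches):.1%}")
```

The study's 97.1% figure corresponds to this A/B fraction computed over the 3D-imaging-confirmed screws.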
Affiliation(s)
- Anant Tewari
- National Spine Health Foundation, Reston, VA, USA
- David Boyd
- Reston Radiology Consultants, Reston, VA, USA
- Rita Roy
- National Spine Health Foundation, Reston, VA, USA
15
Pierzchajlo N, Stevenson TC, Huynh H, Nguyen J, Boatright S, Arya P, Chakravarti S, Mehrki Y, Brown NJ, Gendreau J, Lee SJ, Chen SG. Augmented Reality in Minimally Invasive Spinal Surgery: A Narrative Review of Available Technology. World Neurosurg 2023; 176:35-42. [PMID: 37059357 DOI: 10.1016/j.wneu.2023.04.030] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/07/2023] [Accepted: 04/08/2023] [Indexed: 04/16/2023]
Abstract
INTRODUCTION Spine surgery has undergone significant changes in approach and technique. With the adoption of intraoperative navigation, minimally invasive spinal surgery (MISS) has arguably become the gold standard. Augmented reality (AR) has now emerged as a front-runner in anatomical visualization and narrower operative corridors. In effect, AR is poised to revolutionize surgical training and operative outcomes. Our study examines the current literature on AR-assisted MISS, synthesizes findings, and creates a narrative highlighting the history and future of AR in spine surgery. MATERIAL AND METHODS Relevant literature was gathered using the PubMed (Medline) database from 1975 to 2023. Pedicle screw placement models were the primary intervention in AR and were compared with the outcomes of traditional MISS. RESULTS We found that AR devices on the market show promising clinical outcomes in preoperative training and intraoperative use. Three prominent systems were XVision, HoloLens, and ImmersiveTouch. In the studies, surgeons, residents, and medical students had opportunities to operate AR systems, showcasing their educational potential across each phase of learning. Specifically, one facet described training with cadaver models to gauge accuracy in pedicle screw placement. AR-assisted MISS exceeded free-hand methods without unique complications or contraindications. CONCLUSIONS While still in its infancy, AR has already proven beneficial for educational training and intraoperative MISS applications. We believe that with continued research and advancement of this technology, AR is poised to become a dominant player in the fundamentals of surgical education and MISS operative technique.
Affiliation(s)
- Huey Huynh
- Mercer University, School of Medicine, Savannah, GA, USA
- Jimmy Nguyen
- Mercer University, School of Medicine, Savannah, GA, USA
- Priya Arya
- Mercer University, School of Medicine, Savannah, GA, USA
- Yusuf Mehrki
- Department of Neurosurgery, University of Florida, Jacksonville, FL, USA
- Nolan J Brown
- Department of Neurosurgery, University of California Irvine, Orange, CA, USA
- Julian Gendreau
- Department of Biomedical Engineering, Johns Hopkins Whiting School of Engineering, Baltimore, MD, USA
- Seung Jin Lee
- Department of Neurosurgery, Mayo Clinic, Jacksonville, FL, USA
- Selby G Chen
- Department of Neurosurgery, Mayo Clinic, Jacksonville, FL, USA
16
Cao B, Yuan B, Xu G, Zhao Y, Sun Y, Wang Z, Zhou S, Xu Z, Wang Y, Chen X. A Pilot Human Cadaveric Study on Accuracy of the Augmented Reality Surgical Navigation System for Thoracolumbar Pedicle Screw Insertion Using a New Intraoperative Rapid Registration Method. J Digit Imaging 2023; 36:1919-1929. [PMID: 37131064 PMCID: PMC10406793 DOI: 10.1007/s10278-023-00840-x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/02/2023] [Revised: 04/20/2023] [Accepted: 04/21/2023] [Indexed: 05/04/2023] Open
Abstract
To evaluate the feasibility and accuracy of AR-assisted pedicle screw placement in cadavers using a new intraoperative rapid registration method combining preoperative CT scanning with intraoperative C-arm 2D fluoroscopy. Five cadavers with intact thoracolumbar spines were employed in this study. Intraoperative registration was performed using anteroposterior and lateral views from preoperative CT scanning and intraoperative 2D fluoroscopic images. Patient-specific targeting guides were used for pedicle screw placement from Th1 to L5, totaling 166 screws. Instrumentation for each side was randomized (augmented reality surgical navigation (ARSN) vs. C-arm), with an equal distribution of 83 screws in each group. CT was performed to evaluate the accuracy of both techniques by assessing the screw positions and the deviations between the inserted screws and planned trajectories. Postoperative CT showed that 98.80% (82/83) of screws in the ARSN group and 72.29% (60/83) in the C-arm group were within the 2-mm safe zone (p < 0.001). The mean instrumentation time per level in the ARSN group was significantly shorter than that in the C-arm group (56.17 ± 3.33 s vs. 99.22 ± 9.03 s, p < 0.001). The overall intraoperative registration time was 17.2 ± 3.5 s per segment. AR-based navigation technology can provide surgeons with accurate guidance for pedicle screw insertion and save operation time by using this intraoperative rapid registration method.
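The safe-zone comparison above (82/83 vs. 60/83, p < 0.001) can be checked approximately with a pooled two-proportion z-test using only the standard library. The abstract does not name the exact statistical test used, so this is an illustrative sketch, not a reproduction of the study's analysis:

```python
import math

def safe_zone_rate(deviations_mm, threshold=2.0):
    """Fraction of screws whose deviation lies within the safe-zone threshold."""
    return sum(d <= threshold for d in deviations_mm) / len(deviations_mm)

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided pooled two-proportion z-test; returns (z, p)."""
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (successes_a / n_a - successes_b / n_b) / se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value from the normal tail
    return z, p

# Counts reported in the study: 82/83 in-zone with ARSN vs. 60/83 with C-arm.
z, p = two_proportion_z(82, 83, 60, 83)
print(f"z = {z:.2f}, p = {p:.2e}")
```

With these counts the test gives z ≈ 4.9 and p well below 0.001, consistent with the significance reported in the abstract.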
Affiliation(s)
- Bing Cao
- Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Bo Yuan
- Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Guofeng Xu
- Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Yin Zhao
- Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Yanqing Sun
- Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Zhiwei Wang
- Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Shengyuan Zhou
- Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Zheng Xu
- Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Yao Wang
- Linyan Medical Technology Company Limited, 528 Ruiqing Road, Pudong New District, Shanghai, China
- Xiongsheng Chen
- Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China.
17
Taghian A, Abo-Zahhad M, Sayed MS, Abd El-Malek AH. Virtual and augmented reality in biomedical engineering. Biomed Eng Online 2023; 22:76. [PMID: 37525193 PMCID: PMC10391968 DOI: 10.1186/s12938-023-01138-3] [Citation(s) in RCA: 3] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/28/2022] [Accepted: 07/12/2023] [Indexed: 08/02/2023] Open
Abstract
BACKGROUND In the future, extended reality technology will be widely used. People will come to use virtual reality (VR) and augmented reality (AR) technologies in their daily lives, hobbies, many types of entertainment, and employment. Medical augmented reality has evolved with applications ranging from medical education to image-guided surgery. Moreover, the bulk of research is focused on clinical applications, with the majority devoted to surgery or intervention, followed by rehabilitation and treatment applications. Numerous studies have also looked into the use of augmented reality in medical education and training. METHODS Using the databases Semantic Scholar, Web of Science, Scopus, IEEE Xplore, and ScienceDirect, a scoping review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) criteria. A manual search was also carried out in Google Scholar to find additional articles. This study presents in detail studies carried out over the previous 14 years (from 2009 to 2023). We classify this area of study into the following categories: (1) AR and VR in surgery, with subsections on MR in neurosurgery, spine surgery, oral and maxillofacial surgery, and AR-enhanced human-robot interaction; (2) AR and VR in medical education, with subsections on medical training, schools and curricula, and XR in biomedicine; (3) AR and VR for rehabilitation, with subsections on stroke rehabilitation during COVID-19 and cancer and VR; and (4) millimeter-wave and MIMO systems for AR and VR. RESULTS In total, 77 publications were selected based on the inclusion criteria. Four distinct AR and/or VR application groups could be differentiated: AR and VR in surgery (N = 21), VR and AR in medical education (N = 30), AR and VR for rehabilitation (N = 15), and millimeter-wave and MIMO systems for AR and VR (N = 7), where N is the number of cited studies. We found that the majority of research is devoted to medical training and education, with surgical or interventional applications coming in second. The research is mostly focused on rehabilitation, therapy, and clinical applications, and the application of XR in MIMO systems has been the subject of numerous studies. CONCLUSION Examples of these diverse fields of application are presented in this review as follows: (1) AR and VR in surgery; (2) AR and VR in medical education; (3) AR and VR for rehabilitation; and (4) millimeter-wave and MIMO systems for AR and VR.
Affiliation(s)
- Aya Taghian
- Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt.
- Mohammed Abo-Zahhad
- Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt
- Department of Electrical Engineering, Assiut University, Assiut, Egypt
- Mohammed S Sayed
- Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt
- Department of Electronics and Communications Engineering, Zagazig University, Zagazig, Ash Sharqia, Egypt
- Ahmed H Abd El-Malek
- Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt
18
Wu J, Gao L, Shi Q, Qin C, Xu K, Jiang Z, Zhang X, Li M, Qiu J, Gu W. Accuracy Evaluation Trial of Mixed Reality-Guided Spinal Puncture Technology. Ther Clin Risk Manag 2023; 19:599-609. [PMID: 37484696 PMCID: PMC10361284 DOI: 10.2147/tcrm.s416918] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/12/2023] [Accepted: 07/03/2023] [Indexed: 07/25/2023] Open
Abstract
Purpose To evaluate the accuracy of mixed reality (MR)-guided visualization technology for spinal puncture (MRsp). Methods MRsp involved the following three steps: (1) lumbar spine computed tomography (CT) data were obtained to reconstruct virtual 3D images, which were imported into a HoloLens (2nd gen); (2) the patented MR system quickly recognized the spatial orientation and superimposed the virtual image over the real spine in the HoloLens; (3) the operator performed the spinal puncture with structural information provided by the virtual image. A posture fixation cushion was used to keep the subjects' lateral decubitus position consistent. Twelve subjects were recruited to verify the setup error and the registration error. The setup error was calculated from the first two CT scans by measuring the displacement of two location markers. The projection points of the upper edge of the L3 spinous process (L3↑), the lower edge of the L3 spinous process (L3↓), and the lower edge of the L4 spinous process (L4↓) in the virtual image were positioned and marked on the skin as the registration markers. A third CT scan was performed to determine the registration error by measuring the displacement between the three registration markers and the corresponding real spinous process edges. Results The setup errors in the position of the cranial location marker between CT scans along the left-right (LR), anterior-posterior (AP), and superior-inferior (SI) axes of the CT bed measured 0.09 ± 0.06 cm, 0.30 ± 0.28 cm, and 0.22 ± 0.12 cm, respectively, while those of the caudal location marker measured 0.08 ± 0.06 cm, 0.29 ± 0.18 cm, and 0.18 ± 0.10 cm, respectively. The registration errors between the three registration markers and the subject's real L3↑, L3↓, and L4↓ were 0.11 ± 0.09 cm, 0.15 ± 0.13 cm, and 0.13 ± 0.10 cm, respectively, in the SI direction. Conclusion This MR-guided visualization technology for spinal puncture can accurately and quickly superimpose reconstructed 3D CT images over a real human spine.
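The per-axis error summaries above (mean ± SD along the LR, AP, and SI axes) can be sketched as follows; the displacement values below are hypothetical stand-ins, not the study's measurements:

```python
import statistics

def axis_error_summary(displacements):
    """Mean and SD of absolute marker displacements (cm) along one axis."""
    mags = [abs(d) for d in displacements]
    return statistics.fmean(mags), statistics.stdev(mags)

# Hypothetical per-subject marker displacements (cm) along the SI axis,
# signed so that positive means superior.
si = [0.10, -0.05, 0.20, 0.15, -0.10, 0.08, 0.12, -0.18, 0.05, 0.14, 0.09, 0.11]
mean, sd = axis_error_summary(si)
print(f"SI error: {mean:.2f} ± {sd:.2f} cm")
```

Taking absolute values before averaging matches the convention of reporting error magnitudes per axis rather than signed bias.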
Affiliation(s)
- Jiajun Wu
- Department of Anesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, People’s Republic of China
- Shanghai Key Laboratory of Clinical Geriatric Medicine, Shanghai, 200040, People’s Republic of China
- Lei Gao
- Department of Anesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, People’s Republic of China
- Shanghai Key Laboratory of Clinical Geriatric Medicine, Shanghai, 200040, People’s Republic of China
- Qiao Shi
- Department of Anesthesiology, International Peace Maternity and Child Health Hospital of China, School of Medicine, Shanghai Jiao Tong University, Shanghai, 200030, People’s Republic of China
- Chunhui Qin
- Department of Pain Management, Yueyang Integrated Traditional Chinese Medicine and Western Medicine Hospital Affiliated to Shanghai University of Traditional Chinese Medicine, Shanghai, 200437, People’s Republic of China
- Kai Xu
- Department of Anesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, People’s Republic of China
- Shanghai Key Laboratory of Clinical Geriatric Medicine, Shanghai, 200040, People’s Republic of China
- Zhaoshun Jiang
- Department of Anesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, People’s Republic of China
- Shanghai Key Laboratory of Clinical Geriatric Medicine, Shanghai, 200040, People’s Republic of China
- Xixue Zhang
- Department of Anesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, People’s Republic of China
- Shanghai Key Laboratory of Clinical Geriatric Medicine, Shanghai, 200040, People’s Republic of China
- Ming Li
- Department of Radiology, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, People’s Republic of China
- Jianjian Qiu
- Department of Radiation Oncology, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, People’s Republic of China
- Weidong Gu
- Department of Anesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, People’s Republic of China
- Shanghai Key Laboratory of Clinical Geriatric Medicine, Shanghai, 200040, People’s Republic of China
Collapse
|
19
|
Onuma H, Sakai K, Arai Y, Torigoe I, Tomori M, Sakaki K, Hirai T, Egawa S, Kobayashi Y, Okawa A, Yoshii T. Augmented Reality Support for Anterior Decompression and Fusion Using Floating Method for Cervical Ossification of the Posterior Longitudinal Ligament. J Clin Med 2023; 12:jcm12082898. [PMID: 37109235 PMCID: PMC10143834 DOI: 10.3390/jcm12082898] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/30/2023] [Revised: 04/06/2023] [Accepted: 04/14/2023] [Indexed: 04/29/2023] Open
Abstract
Anterior decompression and fusion (ADF) using the floating method for cervical ossification of the posterior longitudinal ligament (OPLL) is an ideal surgical technique, but it has a specific risk of insufficient decompression caused by the impingement of residual ossification. Augmented reality (AR) support is a novel technology that enables the superimposition of images onto the view of a surgical field. AR technology was applied to ADF for cervical OPLL to facilitate intraoperative anatomical orientation and OPLL identification. In total, 14 patients with cervical OPLL underwent ADF with microscopic AR support. The outline of the OPLL and the bilateral vertebral arteries was marked after intraoperative CT, and the reconstructed 3D image data were transferred and linked to the microscope. The AR microscopic view enabled us to visualize the ossification outline, which could not be seen directly in the surgical field, and allowed sufficient decompression of the ossification. Neurological disturbances were improved in all patients. No cases of serious complications, such as major intraoperative bleeding or reoperation due to the postoperative impingement of the floating OPLL, were registered. To our knowledge, this is the first report of the introduction of microscopic AR into ADF using the floating method for cervical OPLL with favorable clinical results.
Affiliation(s)
- Hiroaki Onuma
- Department of Orthopedic Surgery, Saiseikai Kawaguchi General Hospital, 5-11-5 Nishikawaguchi, Kawaguchi-shi 332-8558, Japan
- Kenichiro Sakai
- Department of Orthopedic Surgery, Saiseikai Kawaguchi General Hospital, 5-11-5 Nishikawaguchi, Kawaguchi-shi 332-8558, Japan
- Yoshiyasu Arai
- Department of Orthopedic Surgery, Saiseikai Kawaguchi General Hospital, 5-11-5 Nishikawaguchi, Kawaguchi-shi 332-8558, Japan
- Ichiro Torigoe
- Department of Orthopedic Surgery, Saiseikai Kawaguchi General Hospital, 5-11-5 Nishikawaguchi, Kawaguchi-shi 332-8558, Japan
- Masaki Tomori
- Department of Orthopedic Surgery, Saiseikai Kawaguchi General Hospital, 5-11-5 Nishikawaguchi, Kawaguchi-shi 332-8558, Japan
- Kyohei Sakaki
- Department of Orthopedic Surgery, Saiseikai Kawaguchi General Hospital, 5-11-5 Nishikawaguchi, Kawaguchi-shi 332-8558, Japan
- Takashi Hirai
- Department of Orthopedic Surgery, Tokyo Medical and Dental University, 1-5-45 Yushima, Bunkyo Ward, Tokyo 113-8519, Japan
- Satoru Egawa
- Department of Orthopedic Surgery, Tokyo Medical and Dental University, 1-5-45 Yushima, Bunkyo Ward, Tokyo 113-8519, Japan
- Yutaka Kobayashi
- Department of Orthopedic Surgery, Saiseikai Kawaguchi General Hospital, 5-11-5 Nishikawaguchi, Kawaguchi-shi 332-8558, Japan
- Atsushi Okawa
- Department of Orthopedic Surgery, Tokyo Medical and Dental University, 1-5-45 Yushima, Bunkyo Ward, Tokyo 113-8519, Japan
- Toshitaka Yoshii
- Department of Orthopedic Surgery, Tokyo Medical and Dental University, 1-5-45 Yushima, Bunkyo Ward, Tokyo 113-8519, Japan

20
Medress ZA, Bobrow A, Tigchelaar SS, Henderson T, Parker JJ, Desai A. Augmented Reality-Assisted Resection of a Large Presacral Ganglioneuroma: 2-Dimensional Operative Video. Oper Neurosurg (Hagerstown) 2023; 24:e284-e285. [PMID: 36701554 DOI: 10.1227/ons.0000000000000542] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/12/2022] [Accepted: 09/22/2022] [Indexed: 01/27/2023] Open
Affiliation(s)
- Zachary A Medress
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Seth S Tigchelaar
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Jonathon J Parker
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Atman Desai
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA

21
Bounajem MT, Cameron B, Sorensen K, Parr R, Gibby W, Prashant G, Evans JJ, Karsy M. Improved Accuracy and Lowered Learning Curve of Ventricular Targeting Using Augmented Reality-Phantom and Cadaveric Model Testing. Neurosurgery 2023; 92:884-891. [PMID: 36562619 DOI: 10.1227/neu.0000000000002293] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/16/2022] [Accepted: 09/23/2022] [Indexed: 12/24/2022] Open
Abstract
BACKGROUND Augmented reality (AR) has demonstrated significant potential in neurosurgical cranial, spine, and teaching applications. External ventricular drain (EVD) placement remains a common procedure, but with error rates in targeting between 10% and 40%. OBJECTIVE To evaluate the Novarad VisAR guidance system for the placement of EVDs in phantom and cadaveric models. METHODS Two synthetic ventricular phantom models and a third cadaver model underwent computerized tomography imaging and registration with the VisAR system (Novarad). Root mean square (RMS) error, angular error (γ), and Euclidean distance were measured by multiple methods for various standard EVD placements. RESULTS Computerized tomography measurements on a phantom model (0.5-mm targets) showed a mean Euclidean distance error of 1.20 ± 0.98 mm and γ of 1.25° ± 1.02°. Eight participants placed EVDs in lateral and occipital burr holes using VisAR in a second phantom anatomic ventricular model (mean RMS: 3.9 ± 1.8 mm, γ: 3.95° ± 1.78°). There were no statistically significant differences in accuracy for postgraduate year level, prior AR experience, prior EVD experience, or experience with video games (P > .05). In comparing EVDs placed with anatomic landmarks vs VisAR navigation in a cadaver, VisAR demonstrated significantly better RMS and γ, 7.47 ± 0.94 mm and 7.12° ± 0.97°, respectively (P ≤ .05). CONCLUSION The novel VisAR AR system resulted in accurate placement of EVDs with a rapid learning curve, which may improve clinical treatment and patient safety. Future applications of VisAR can be expanded to other cranial procedures.
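The two accuracy metrics named in this abstract, Euclidean distance error between planned and actual catheter tips and the angular error γ between planned and actual trajectory directions, can be sketched as below. The coordinates are invented for illustration and are not the study's measurements:

```python
import math

def tip_error(planned_tip, actual_tip):
    """Euclidean distance (mm) between planned and actual catheter tip points."""
    return math.dist(planned_tip, actual_tip)

def angular_error(planned_vec, actual_vec):
    """Angle (degrees) between planned and actual trajectory direction vectors."""
    dot = sum(p * a for p, a in zip(planned_vec, actual_vec))
    norm = math.hypot(*planned_vec) * math.hypot(*actual_vec)
    # clamp the cosine for floating-point safety before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Illustrative values (mm), not the study's data
d = tip_error((10.0, 20.0, 30.0), (11.0, 22.0, 32.0))  # sqrt(1 + 4 + 4) = 3.0
g = angular_error((0.0, 0.0, 1.0), (0.0, 1.0, 1.0))    # 45 degrees
```

RMS error over a set of placements is then the root of the mean squared tip errors across trials.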
Affiliation(s)
- Michael T Bounajem
- Department of Neurosurgery, Clinical Neurosciences Center, University of Utah, Salt Lake City, Utah, USA
- Wendell Gibby
- Novarad, Provo, Utah, USA
- Department of Radiology, University of California-San Diego, San Diego, California, USA
- Giyarpuram Prashant
- Department of Neurosurgery, Thomas Jefferson University Hospital, Philadelphia, Pennsylvania, USA
- James J Evans
- Department of Neurosurgery, Thomas Jefferson University Hospital, Philadelphia, Pennsylvania, USA
- Michael Karsy
- Department of Neurosurgery, Clinical Neurosciences Center, University of Utah, Salt Lake City, Utah, USA

22
Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023; 85:102757. [PMID: 36706637 DOI: 10.1016/j.media.2023.102757] [Citation(s) in RCA: 16] [Impact Index Per Article: 16.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/22/2022] [Revised: 01/05/2023] [Accepted: 01/18/2023] [Indexed: 01/22/2023]
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main player in the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016, until the year of 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy including use case, technical methodology for registration and tracking, data sources, visualization as well as validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, where AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interactions, usability and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, nevertheless only few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, to pave the way for novel, innovative directions and translation into the medical routine.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany

23
Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023; 68. [PMID: 36580681 DOI: 10.1088/1361-6560/acaf23] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/12/2022] [Accepted: 12/29/2022] [Indexed: 12/31/2022]
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these AR systems in different surgical fields. The types of AR visualization are divided into two categories: in situ visualization and non-in situ visualization. The rendering contents of AR visualization are various. The registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. The tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in surgical fields. However, most AR applications were evaluated through model experiments and animal experiments, and there are relatively few clinical experiments, indicating that the current AR navigation methods are still in the early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as the future development trend. Despite the fact that AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
Affiliation(s)
- Longfei Ma
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Tianqi Huang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Jie Wang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China

24
Clinical applications of augmented reality in orthopaedic surgery: a comprehensive narrative review. INTERNATIONAL ORTHOPAEDICS 2023; 47:375-391. [PMID: 35852653 DOI: 10.1007/s00264-022-05507-w] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/15/2022] [Accepted: 07/04/2022] [Indexed: 01/28/2023]
Abstract
PURPOSE The development of augmented reality (AR) technology allows orthopaedic surgeons to incorporate and visualize surgical data, assisting the execution of both routine and complex surgical operations. Uniquely, AR technology allows a surgeon to view the surgical field and superimpose peri-operative imaging, anatomical landmarks, navigation guidance, and more, all in one view without the need for conjugate gaze between multiple screens. The aim of this literature review was to introduce the fundamental requirements for an augmented reality system and to assess the current applications, outcomes, and potential limitations to this technology. METHODS A literature search was performed using MEDLINE and Embase databases, by two independent reviewers, who then collaboratively synthesized and collated the results of the literature search into a narrative review focused on the applications of augmented reality in major orthopaedic sub-specialties. RESULTS Current technology requires that pre-operative patient data be acquired, and AR-compatible models constructed. Intra-operatively, to produce manipulatable virtual images into the user's view in real time, four major components are required including a camera, computer image processing technology, tracking tools, and an output screen. The user is provided with a heads-up display, which is a transparent display, enabling the user to look at both their natural view and the computer-generated images. Currently, high-quality evidence for clinical implementation of AR technology in the orthopaedic surgery operating room is lacking; however, growing in vitro literature highlights a multitude of potential applications, including increasing operative accuracy, improved biomechanical angular and alignment parameters, and potentially reduced operative time. 
CONCLUSION While the application of AR systems in surgery is currently in its infancy, we anticipate rapid and widespread implementation of this technology in various orthopaedic sub-specialties.
25
Fan X, Zhu Q, Tu P, Joskowicz L, Chen X. A review of advances in image-guided orthopedic surgery. Phys Med Biol 2023; 68. [PMID: 36595258 DOI: 10.1088/1361-6560/acaae9] [Citation(s) in RCA: 3] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2022] [Accepted: 12/12/2022] [Indexed: 12/15/2022]
Abstract
Orthopedic surgery remains technically demanding due to the complex anatomical structures and cumbersome surgical procedures. The introduction of image-guided orthopedic surgery (IGOS) has significantly decreased the surgical risk and improved the operation results. This review focuses on the application of recent advances in artificial intelligence (AI), deep learning (DL), augmented reality (AR) and robotics in image-guided spine surgery, joint arthroplasty, fracture reduction and bone tumor resection. For the pre-operative stage, key technologies of AI and DL based medical image segmentation, 3D visualization and surgical planning procedures are systematically reviewed. For the intra-operative stage, the development of novel image registration, surgical tool calibration and real-time navigation are reviewed. Furthermore, the combination of the surgical navigation system with AR and robotic technology is also discussed. Finally, the current issues and prospects of the IGOS system are discussed, with the goal of establishing a reference and providing guidance for surgeons, engineers, and researchers involved in the research and development of this area.
Affiliation(s)
- Xingqi Fan
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Qiyang Zhu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Leo Joskowicz
- School of Computer Science and Engineering, The Hebrew University of Jerusalem, Jerusalem, Israel
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, People's Republic of China

26
McBain K, Chen L, Lee A, O'Brien J, Ventura NM, Noël GPJC. Evaluating the integration of body donor imaging into anatomical dissection using augmented reality. ANATOMICAL SCIENCES EDUCATION 2023; 16:71-86. [PMID: 34850590 DOI: 10.1002/ase.2157] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/24/2021] [Accepted: 11/26/2021] [Indexed: 06/13/2023]
Abstract
Augmented reality (AR) has recently been utilized as an integrative teaching tool in medical curricula given its ability to view virtual objects while interacting with the physical environment. The evidence for AR in medical training, however, is limited. For this reason, the purpose of this mixed method study was to evaluate the implementation of overlaying donor-specific diagnostic imaging (DSDI) onto corresponding body donors in a fourth-year, dissection-based, medical elective course entitled anatomy for surgeons (AFS). Students registered in the AFS course were separated into groups, receiving either DSDI displayed on a Microsoft HoloLens AR head-mounted display (n = 12) or DSDI displayed on an iPad (n = 15). To test for the change in spatial ability, students completed an anatomical mental rotation test (AMRT) prior to and following the AFS course. Students also participated in a focus group discussion and completed a survey at the end of AFS, analyzed through thematic triangulation and an unpaired Mann-Whitney U test, respectively, both addressing dissection experience, DSDI relevancy to dissection, and use of AR in anatomical education. Although statistically significant differences were not found when comparing student group AMRT scores, survey and discussion data suggest that the HoloLens improved the students' understanding and spatial orientation of anatomical relationships. Trunk dissection quality grades were significantly higher for students using the HoloLens. Although students mentioned difficulties with the HoloLens software, with faculty assistance, training, and enhanced software development, there is potential for this AR tool to contribute to improved dissection quality and an immersive learning experience.
Affiliation(s)
- Kimberly McBain
- School of Physical and Occupational Therapy, McGill University, Montreal, Québec, Canada
- Liang Chen
- Postgraduate Medical Education, McGill University, Montreal, Québec, Canada
- Angela Lee
- Division of Experimental Medicine, McGill University, Montreal, Québec, Canada
- Jeremy O'Brien
- Department of Diagnostic Radiology, McGill University, Montreal, Québec, Canada
- Nicole M Ventura
- Division of Anatomical Sciences, Department of Anatomy and Cell Biology, McGill University, Montreal, Québec, Canada
- Institute of Health Sciences Education, Faculty of Medicine, McGill University, Montreal, Québec, Canada
- Geoffroy P J C Noël
- Division of Anatomical Sciences, Department of Anatomy and Cell Biology, McGill University, Montreal, Québec, Canada
- Institute of Health Sciences Education, Faculty of Medicine, McGill University, Montreal, Québec, Canada
- Division of Anatomy, Department of Surgery, University of California San Diego, La Jolla, California, USA

27
Chegini S, Edwards E, McGurk M, Clarkson M, Schilling C. Systematic review of techniques used to validate the registration of augmented-reality images using a head-mounted device to navigate surgery. Br J Oral Maxillofac Surg 2023; 61:19-27. [PMID: 36513525 DOI: 10.1016/j.bjoms.2022.08.007] [Citation(s) in RCA: 4] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/13/2021] [Revised: 07/31/2022] [Accepted: 08/17/2022] [Indexed: 12/14/2022]
Abstract
Augmented-reality (AR) head-mounted devices (HMD) allow the wearer to have digital images superposed on to their field of vision. They are being used to superpose annotations on to the surgical field akin to a navigation system. This review examines published validation studies on HMD-AR systems, their reported protocols, and outcomes. The aim was to establish commonalities and an acceptable registration outcome. Multiple databases were systematically searched for relevant articles between January 2015 and January 2021. Studies that examined the registration of AR content using a HMD to guide surgery were eligible for inclusion. The country of origin, year of publication, medical specialty, HMD device, software, and method of registration, were recorded. A meta-analysis of the mean registration error was conducted. A total of 4784 papers were identified, of which 23 met the inclusion criteria. They included studies using HoloLens (Microsoft) (n = 22) and nVisor ST60 (NVIS Inc) (n = 1). Sixty-six per cent of studies were in hard tissue specialties. Eleven studies reported registration errors using pattern markers (mean (SD) 2.6 (1.8) mm), and four reported registration errors using surface markers (mean (SD) 3.8 (3.7) mm). Three studies reported registration errors using manual alignment (mean (SD) 2.2 (1.3) mm). The majority of studies in this review used in-house software with a variety of registration methods and reported errors. The mean registration error calculated in this study can be considered as a minimum acceptable standard. It should be taken into consideration when procedural applications are selected.
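The pooled registration errors this review reports can be derived with the standard combined-group formula for merging per-study means and SDs. The per-study sample sizes below are hypothetical, since only the pooled means and SDs appear in the abstract:

```python
import math

def pooled_mean_sd(groups):
    """Pool per-study (n, mean, sd) error measurements into one overall
    mean and SD using the standard combined-group variance formula."""
    N = sum(n for n, _, _ in groups)
    grand_mean = sum(n * m for n, m, _ in groups) / N
    # within-study variance plus between-study spread about the grand mean
    ss = sum((n - 1) * s ** 2 + n * (m - grand_mean) ** 2 for n, m, s in groups)
    return grand_mean, math.sqrt(ss / (N - 1))

# Hypothetical studies: (number of measurements, mean error mm, SD mm)
studies = [(30, 2.1, 1.2), (25, 3.0, 2.0), (40, 2.8, 1.6)]
mean_mm, sd_mm = pooled_mean_sd(studies)
```

A formal meta-analysis would additionally weight by inverse variance and test heterogeneity; this sketch shows only the arithmetic behind a pooled mean (SD).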
Affiliation(s)
- Soudeh Chegini
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Eddie Edwards
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Mark McGurk
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Matthew Clarkson
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Clare Schilling
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom

28
A novel motionless calibration method for augmented reality surgery navigation system based on optical tracker. Heliyon 2022; 8:e12115. [PMID: 36590529 PMCID: PMC9801086 DOI: 10.1016/j.heliyon.2022.e12115] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/05/2022] [Revised: 09/25/2022] [Accepted: 11/28/2022] [Indexed: 12/13/2022] Open
Abstract
Augmented reality (AR) surgery navigation systems display the pre-operatively planned virtual model at the accurate position in the real surgical scene to assist the operation. Accurate calibration of the mapping relationship between the virtual coordinate system and the real world is the key to the virtual-real fusion effect. Previous calibration methods required the user to perform complex manual procedures before use. This paper introduces a novel motionless virtual-real calibration method. The method only requires taking a mixed-reality image containing both virtual and real marker balls using the built-in forward camera of the AR glasses. The mapping relationship between the virtual and real spaces is calculated by using the camera coordinate system as a transformation medium. The composition and working process of the AR navigation system are introduced, and the mathematical principle of the calibration is then presented. The feasibility of the proposed calibration scheme is verified experimentally, and the average registration error of the scheme is around 5.80 mm, on par with previously reported methods. The proposed method is convenient and rapid to implement, and the calibration accuracy does not depend on user experience. Further, it can potentially enable real-time updates of the registration transformation matrix, which can improve the AR fusion accuracy when the AR glasses move. This motionless calibration method has great potential to be applied in future clinical navigation research.
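At the core of such a virtual-real calibration sits a point-based rigid registration between corresponding virtual and real marker-ball centers. A minimal sketch using the classic Kabsch/SVD least-squares solution, a generic method and not necessarily the paper's patented algorithm:

```python
import numpy as np

def rigid_registration(virtual_pts, real_pts):
    """Least-squares rigid transform (R, t) mapping virtual marker-ball
    centers onto their real-world counterparts via the Kabsch/SVD method.
    Inputs are (N, 3) arrays of corresponding points."""
    v_mean = virtual_pts.mean(axis=0)
    r_mean = real_pts.mean(axis=0)
    H = (virtual_pts - v_mean).T @ (real_pts - r_mean)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # correction term guards against a reflection solution
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = r_mean - R @ v_mean
    return R, t

# Synthetic check: 4 virtual markers moved by a known rotation + translation
rng = np.random.default_rng(0)
virtual = rng.random((4, 3))
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([10.0, -5.0, 2.0])
real = virtual @ R_true.T + t_true
R_est, t_est = rigid_registration(virtual, real)
# fiducial registration error after alignment (should be ~0 without noise)
fre = np.linalg.norm(virtual @ R_est.T + t_est - real, axis=1).mean()
```

In the paper's setup the "real" coordinates would first pass through the camera coordinate system as the transformation medium; the sketch shows only the final rigid-fit step.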
29
Tigchelaar SS, Medress ZA, Quon J, Dang P, Barbery D, Bobrow A, Kin C, Louis R, Desai A. Augmented Reality Neuronavigation for En Bloc Resection of Spinal Column Lesions. World Neurosurg 2022; 167:102-110. [PMID: 36096393 DOI: 10.1016/j.wneu.2022.08.143] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/11/2022] [Revised: 08/28/2022] [Accepted: 08/30/2022] [Indexed: 11/22/2022]
Abstract
BACKGROUND Primary tumors involving the spine are relatively rare but represent surgically challenging procedures with high patient morbidity. En bloc resection of these tumors necessitates large exposures, wide tumor margins, and poses risks to functionally relevant anatomical structures. Augmented reality neuronavigation (ARNV) represents a paradigm shift in neuronavigation, allowing on-demand visualization of 3D navigation data in real-time directly in line with the operative field. METHODS Here, we describe the first application of ARNV to perform distal sacrococcygectomies for the en bloc removal of sacral and retrorectal lesions involving the coccyx in 2 patients, as well as a thoracic 9-11 laminectomy with costotransversectomy for en bloc removal of a schwannoma in a third patient. RESULTS In our experience, ARNV allowed our teams to minimize the length of the incision, reduce the extent of bony resection, and enhanced visualization of critical adjacent anatomy. All tumors were resected en bloc, and the patients recovered well postoperatively, with no known complications. Pathologic analysis confirmed the en bloc removal of these lesions with negative margins. CONCLUSIONS We conclude that ARNV is an effective strategy for the precise, en bloc removal of spinal lesions including both sacrococcygeal tumors involving the retrorectal space and thoracic schwannomas.
Affiliation(s)
- Seth S Tigchelaar
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Zachary A Medress
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Jennifer Quon
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Phuong Dang
- Surgical Theater, Inc., Cleveland, Ohio, USA
- Cindy Kin
- Department of Surgery, Stanford University Medical Center, Stanford, California, USA
- Robert Louis
- The Brain and Spine Center, Hoag Memorial Hospital Presbyterian Newport Beach, Newport Beach, California, USA; Pickup Family Neurosciences Institute, Hoag Memorial Hospital Presbyterian Newport Beach, Newport Beach, California, USA
- Atman Desai
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA

30
Zhang D, Aoude A, Driscoll M. Development and model form assessment of an automatic subject-specific vertebra reconstruction method. Comput Biol Med 2022; 150:106158. [PMID: 37859278 DOI: 10.1016/j.compbiomed.2022.106158] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/26/2022] [Revised: 09/09/2022] [Accepted: 09/24/2022] [Indexed: 11/21/2022]
Abstract
BACKGROUND Current spine models for analog bench models, surgical navigation and training platforms are conventionally based on 3D models from anatomical human-body polygon databases or on time-consuming, manually labelled data. This work proposed a workflow for quick and accurate subject-specific vertebra reconstruction and quantified the reconstructed model accuracy and model form errors. METHODS Four different neural networks were customized for vertebra segmentation. To validate the workflow in clinical applications, an excised human lumbar vertebra was scanned via CT and reconstructed into 3D CAD models using the four refined networks. A reverse engineering solution was proposed to obtain the high-precision geometry of the excised vertebra as the gold standard. The 3D model evaluation metrics and a finite element analysis (FEA) method were designed to reflect the model accuracy and model form errors. RESULTS The automatic segmentation networks achieved the best Dice score of 94.20% on the validation datasets. The accuracy of the reconstructed models was quantified with the best 3D Dice index of 92.80%, 3D IoU of 86.56%, and Hausdorff distance of 1.60 mm; heatmaps and histograms were used for error visualization. The FEA results showed the impact of different geometries and reflected partial surface accuracy of the reconstructed vertebra under biomechanical loads, with the closest percentage error of 4.2710% compared to the gold standard model. CONCLUSIONS In this work, a workflow for automatic subject-specific vertebra reconstruction was proposed and its errors in geometry and FEA were quantified. Such errors should be considered when leveraging subject-specific modelling towards the development and improvement of treatments.
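The evaluation metrics named here (Dice, IoU, Hausdorff distance) are standard segmentation measures and can be sketched on toy data as below; a real evaluation would run on full CT-resolution masks and dense surface samples:

```python
import numpy as np

def dice_iou(a, b):
    """Dice coefficient and IoU between two boolean segmentation masks."""
    inter = np.logical_and(a, b).sum()
    dice = 2.0 * inter / (a.sum() + b.sum())
    iou = inter / np.logical_or(a, b).sum()
    return dice, iou

def hausdorff(pts_a, pts_b):
    """Symmetric Hausdorff distance between two (N, 3) surface point sets
    (brute force; adequate for small samples)."""
    d = np.linalg.norm(pts_a[:, None, :] - pts_b[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

# Toy 2x2x2 masks: two foreground voxels each, one shared
a = np.zeros((2, 2, 2), bool); a[0, 0, 0] = a[0, 0, 1] = True
b = np.zeros((2, 2, 2), bool); b[0, 0, 0] = b[0, 1, 0] = True
dice, iou = dice_iou(a, b)          # 2*1/(2+2) = 0.5 and 1/3

# Toy surface samples (mm)
pa = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
pb = np.array([[0.0, 0.0, 0.0], [3.0, 0.0, 0.0]])
hd = hausdorff(pa, pb)              # 2.0
```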
Affiliation(s)
- Dingzhong Zhang
- Musculoskeletal Biomechanics Research Lab, Department of Mechanical Engineering, McGill University, 845 Sherbrooke St. W, Montréal, Quebec, H3A 0G4, Canada.
- Ahmed Aoude
- Orthopaedic Research Laboratory, Research Institute of McGill University Health Centre, Montreal General Hospital, 1650 Cedar Avenue, Montréal, Québec, H3G 1A4, Canada.
- Mark Driscoll
- Musculoskeletal Biomechanics Research Lab, Department of Mechanical Engineering, McGill University, 845 Sherbrooke St. W, Montréal, Quebec, H3A 0G4, Canada; Orthopaedic Research Laboratory, Research Institute of McGill University Health Centre, Montreal General Hospital, 1650 Cedar Avenue, Montréal, Québec, H3G 1A4, Canada.
31
Gupta A, Ambade R. From Diagnosis to Therapy: The Role of Virtual and Augmented Reality in Orthopaedic Trauma Surgery. Cureus 2022; 14:e29099. [PMID: 36249662 PMCID: PMC9557249 DOI: 10.7759/cureus.29099]
Abstract
Advancements in computer-assisted surgery (CAS) and surgical training aim to boost operative precision and enhance patient safety by reducing procedure-related complications. Orthopaedic training and practice have begun to change as a result of the incorporation of reality technologies such as virtual reality (VR), augmented reality (AR), and mixed reality (MR) into CAS. Today's trainees can engage in realistic and highly interactive operative simulations without supervision. The coronavirus disease 2019 (COVID-19) pandemic has further increased the need to adopt such breakthrough technologies. VR is an interactive technology that enables personalised care and could support effective patient-centered rehabilitation. It is a valid and reliable evaluation method for assessing joint range of motion, function, and balance in physical rehabilitation, and it may make it possible to customise care, motivate patients, boost compliance, and track their progress. AR supplementation in orthopaedic surgery has shown promising results in pre-clinical settings, with improvements in surgical accuracy and reproducibility, decreased operating times, and less radiation exposure. Because little direct patient observation is needed, these technologies may also reduce clinician workload, and commercially available systems often support home-based therapy. The objectives of this review are to evaluate the available technology, appraise the evidence regarding its benefits, and consider the problems of implementation in clinical practice. The use of this technology, its practical and ethical ramifications, and its impact on orthopaedic surgeons and their patients are also covered. This review offers a current and thorough analysis of reality technologies and their uses in orthopaedic surgery.
32
Multicenter assessment of augmented reality registration methods for image-guided interventions. Radiol Med 2022; 127:857-865. [DOI: 10.1007/s11547-022-01515-3]
33
Spijkerboer KG, Fitski M, Siepel FJ, van de Ven CP, van der Steeg AF. Augmented reality-guided localization of a chest wall tumor in a pediatric patient. Eur J Cancer 2022; 170:103-105. [DOI: 10.1016/j.ejca.2022.04.023]
34
Augmented Reality in Orthopedic Surgery and Its Application in Total Joint Arthroplasty: A Systematic Review. Appl Sci (Basel) 2022. [DOI: 10.3390/app12105278]
Abstract
The development of augmented reality (AR) and its application in total joint arthroplasty (TJA) aims at improving the accuracy and precision of implant component positioning, ideally leading to improved outcomes and survivorship. However, this field is far from thoroughly explored. We therefore performed a systematic review of the literature to examine the applications, results, and different AR systems available in TJA. A systematic review according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines was performed. A comprehensive search of PubMed, MEDLINE, EMBASE, and the Cochrane Database of Systematic Reviews was conducted for English articles on the application of augmented reality in total joint arthroplasty, using various combinations of keywords, from the inception of the databases to 31 March 2022. Accuracy was defined as the mean error from the targeted positioning angle and compared as means and standard deviations. In all, 14 articles met the inclusion criteria: four studies reported on the application of AR in total knee arthroplasty, six on total hip arthroplasty, three on reverse shoulder arthroplasty, and one on total elbow arthroplasty. Nine of the included studies were preclinical (sawbones or cadaveric), while five reported results of AR's clinical application. The main common finding was high accuracy and precision when implant positioning was compared with preoperatively targeted angles, with errors ≤2 mm and/or ≤2°. Despite the promising results in terms of increased accuracy and precision, this technology is far from widely adopted in daily clinical practice. However, the recent exponential growth in machine learning techniques and technologies may eventually resolve the ongoing limitations, including depth perception issues and system complexity, encouraging widespread adoption of AR systems.
35
Augmented Reality: Mapping Methods and Tools for Enhancing the Human Role in Healthcare HMI. Appl Sci (Basel) 2022. [DOI: 10.3390/app12094295]
Abstract
Background: Augmented Reality (AR) is an innovative technology for improving data visualization and strengthening human perception. Among Human–Machine Interaction (HMI) domains, medicine stands to benefit most from the adoption of these digital technologies. From this perspective, the literature on AR-based orthopedic surgery techniques was evaluated, focusing on identifying the limitations and challenges of AR-based healthcare applications in order to support further research and development. Methods: Studies published from January 2018 to December 2021 were analyzed after a comprehensive search of the PubMed, Google Scholar, Scopus, IEEE Xplore, Science Direct, and Wiley Online Library databases. To improve the review reporting, the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines were used. Results: The authors selected sixty-two articles meeting the inclusion criteria, which were categorized according to the purpose of the study (intraoperative, training, rehabilitation) and the surgical procedure involved. Conclusions: AR has the potential to improve orthopedic training and practice by providing an increasingly human-centered clinical approach. This review points further research toward problems related to hardware limitations, the lack of accurate registration and tracking systems, and the absence of security protocols.
36
Liu Y, Lee MG, Kim JS. Spine Surgery Assisted by Augmented Reality: Where Have We Been? Yonsei Med J 2022; 63:305-316. [PMID: 35352881 PMCID: PMC8965436 DOI: 10.3349/ymj.2022.63.4.305]
Abstract
This systematic review examines the spine surgery literature supporting augmented reality (AR) technology and summarizes its current status in spinal surgery. PubMed, Web of Science, the Cochrane Library, and Embase were searched from the earliest records to April 1, 2021. Our review briefly examines the history of AR and enumerates the application workflows of different devices in a variety of spinal surgeries. We also weigh the pros and cons of current mainstream AR devices and their latest updates. A total of 45 articles are included in our review. The most prevalent applications are augmented reality surgical navigation systems and head-mounted displays. The most popular application of AR is pedicle screw instrumentation in spine surgery, and the levels most commonly addressed are thoracic and lumbar. AR guidance systems show high potential value for practical clinical application to the spine, but the overall number of cases in AR-related studies remains small compared with traditional surgically assisted techniques, and long-term clinical efficacy data and robust surgery-related statistics are still lacking. Changing healthcare laws and the increasing prevalence of spinal surgery are generating critical data that will determine the value of AR technology.
Affiliation(s)
- Yanting Liu
- Department of Neurosurgery, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
- Min-Gi Lee
- Department of Neurosurgery, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
- Jin-Sung Kim
- Department of Neurosurgery, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea.
37
Birlo M, Edwards PJE, Clarkson M, Stoyanov D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med Image Anal 2022; 77:102361. [PMID: 35168103 PMCID: PMC10466024 DOI: 10.1016/j.media.2022.102361]
Abstract
This article presents a systematic review of optical see-through head mounted display (OST-HMD) usage in augmented reality (AR) surgery applications from 2013 to 2020. Articles were categorised by: OST-HMD device, surgical speciality, surgical application context, visualisation content, experimental design and evaluation, accuracy and human factors of human-computer interaction. 91 articles fulfilled all inclusion criteria. Some clear trends emerge. The Microsoft HoloLens increasingly dominates the field, with orthopaedic surgery being the most popular application (28.6%). By far the most common surgical context is surgical guidance (n=58) and segmented preoperative models dominate visualisation (n=40). Experiments mainly involve phantoms (n=43) or system setup (n=21), with patient case studies ranking third (n=19), reflecting the comparative infancy of the field. Experiments cover issues from registration to perception with very different accuracy results. Human factors emerge as significant to OST-HMD utility. Some factors are addressed by the systems proposed, such as attention shift away from the surgical site and mental mapping of 2D images to 3D patient anatomy. Other persistent human factors remain or are caused by OST-HMD solutions, including ease of use, comfort and spatial perception issues. The significant upward trend in published articles is clear, but such devices are not yet established in the operating room and clinical studies showing benefit are lacking. A focused effort addressing technical registration and perceptual factors in the lab coupled with design that incorporates human factors considerations to solve clear clinical problems should ensure that the significant current research efforts will succeed.
Affiliation(s)
- Manuel Birlo
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK.
- P J Eddie Edwards
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
- Matthew Clarkson
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
- Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
38
Tu P, Qin C, Guo Y, Li D, Lungu AJ, Wang H, Chen X. Ultrasound image guided and mixed reality-based surgical system with real-time soft tissue deformation computing for robotic cervical pedicle screw placement. IEEE Trans Biomed Eng 2022; 69:2593-2603. [PMID: 35157575 DOI: 10.1109/tbme.2022.3150952]
Abstract
Cervical pedicle screw (CPS) placement surgery remains technically demanding due to the complicated anatomy with neurovascular structures. State-of-the-art surgical navigation and robotic systems still suffer from problems of hand-eye coordination and soft tissue deformation. In this study, we aim to track intraoperative soft tissue deformation, construct a virtual-physical fusion surgical scene, and integrate both into a robotic system for CPS placement surgery. First, we propose a real-time deformation computation method based on a prior shape model and intraoperative partial information acquired from ultrasound images; according to the generated posterior shape, the structural representation of the deformed target tissue is updated continuously. Second, a hand-tremor compensation method is proposed to improve the accuracy and robustness of the virtual-physical calibration procedure, and a mixed reality based surgical scene is constructed for CPS placement surgery. Third, we integrate the soft tissue deformation method and the virtual-physical fusion method into our previously proposed surgical robotic system and introduce the surgical workflow for CPS placement surgery. We conducted phantom and animal experiments to evaluate the feasibility and accuracy of the proposed system. Our system yielded a mean surface distance error of 1.52 ± 0.43 mm for soft tissue deformation computing and an average distance deviation of 1.04 ± 0.27 mm for CPS placement. The results demonstrate substantial potential for clinical application, and the proposed system promotes the efficiency and safety of CPS placement surgery.
39
Key Ergonomics Requirements and Possible Mechanical Solutions for Augmented Reality Head-Mounted Displays in Surgery. Multimodal Technol Interact 2022. [DOI: 10.3390/mti6020015]
Abstract
In the context of a European project, we identified over 150 requirements for the development of an augmented reality (AR) head-mounted display (HMD) specifically tailored to support highly challenging manual surgical procedures. The requirements were established by surgeons from different specialties and by industrial players working in the surgical field who had strong commitments to the exploitation of this technology. Some of these requirements were specific to the project, while others can be seen as key requirements for the implementation of an efficient and reliable AR headset to be used to support manual activities in the peripersonal space. The aim of this work is to describe these ergonomic requirements that impact the mechanical design of the HMDs, the possible innovative solutions to these requirements, and how these solutions have been used to implement the AR headset in surgical navigation. We also report the results of a preliminary qualitative evaluation of the AR headset by three surgeons.
40
García-Sevilla M, Moreta-Martinez R, García-Mato D, Arenas de Frutos G, Ochandiano S, Navarro-Cuéllar C, Sanjuán de Moreta G, Pascau J. Surgical Navigation, Augmented Reality, and 3D Printing for Hard Palate Adenoid Cystic Carcinoma En-Bloc Resection: Case Report and Literature Review. Front Oncol 2022; 11:741191. [PMID: 35059309 PMCID: PMC8763795 DOI: 10.3389/fonc.2021.741191]
Abstract
Adenoid cystic carcinoma is a rare and aggressive tumor representing less than 1% of head and neck cancers. This malignancy often arises from the minor salivary glands, the palate being its most common location. Surgical en-bloc resection with clear margins is the primary treatment. However, this location presents a limited line of sight and a high risk of injuries, making the surgical procedure challenging. In this context, technologies such as intraoperative navigation can become an effective tool, reducing morbidity and improving the safety and accuracy of the procedure. Although their use is established in fields such as neurosurgery, their application in maxillofacial surgery has not been widely evidenced. One reason is the need to rigidly fixate a navigation reference to the patient, which often entails an invasive setup. In this work, we studied three alternative, less invasive setups using optical tracking, 3D printing, and augmented reality. We evaluated their precision in a patient-specific phantom, obtaining errors below 1 mm. The optimal setup was then applied in a clinical case, where the navigation software was used to guide tumor resection. Points collected along the surgical margins after resection were compared with the margins identified in the postoperative CT, and distances of less than 2 mm were obtained in 90% of the samples. Moreover, the navigation provided confidence to the surgeons, who could then undertake a less invasive and more conservative approach. The postoperative CT scans showed adequate resection margins and confirmed that the patient remained free of disease after two years of follow-up.
Affiliation(s)
- Mónica García-Sevilla
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Rafael Moreta-Martinez
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- David García-Mato
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Gema Arenas de Frutos
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain; Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Santiago Ochandiano
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain; Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Carlos Navarro-Cuéllar
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain; Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Guillermo Sanjuán de Moreta
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain; Servicio de Otorrinolaringología, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Javier Pascau
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
41
XR (Extended Reality: Virtual Reality, Augmented Reality, Mixed Reality) Technology in Spine Medicine: Status Quo and Quo Vadis. J Clin Med 2022; 11:470. [PMID: 35054164 PMCID: PMC8779726 DOI: 10.3390/jcm11020470]
Abstract
In recent years, with the rapid advancement and consumerization of virtual reality (VR), augmented reality (AR), mixed reality (MR), and extended reality (XR) technology, the use of XR technology in spine medicine has become increasingly popular. This rise has been accelerated by the recent wave of digital transformation (i.e., case-specific three-dimensional medical images and holograms, wearable sensors, video cameras, fifth-generation networks, artificial intelligence, and head-mounted displays), and further accelerated by the COVID-19 pandemic and the increase in minimally invasive spine surgery. The COVID-19 pandemic has had a negative impact on society, but positive effects can also be expected, including the continued spread and adoption of telemedicine services (i.e., tele-education, tele-surgery, tele-rehabilitation) that promote digital transformation. The purpose of this narrative review is to describe the accelerators of XR (VR, AR, MR) technology in spine medicine and then to provide a comprehensive review of its use in spine medicine, including surgery, consultation, education, and rehabilitation, as well as to identify its limitations and future perspectives (status quo and quo vadis).
42
Mixed Reality Needle Guidance Application on Smartglasses Without Pre-procedural CT Image Import with Manually Matching Coordinate Systems. Cardiovasc Intervent Radiol 2022; 45:349-356. [PMID: 35022858 DOI: 10.1007/s00270-021-03029-3]
Abstract
PURPOSE To develop and assess the accuracy of a mixed reality (MR) needle guidance application on smartglasses. MATERIALS AND METHODS An MR needle guidance application was developed on HoloLens 2 that requires no pre-procedural CT image reconstruction or import; instead, the spatial and MR coordinate systems are matched manually. First, the accuracy of the target locations in the image overlay, placed using the MR application, was verified at 63 points arranged on a 45 × 35 × 21 cm box and at needle angles from 0° to 80°. Needle placement errors from 12 different entry points in a phantom by seven operators (four physicians and three non-physicians) were then compared between MR guidance and a conventional method using protractors, with a linear mixed model. RESULTS The average errors of the target locations and needle angles placed using the MR application were 5.9 ± 2.6 mm and 2.3 ± 1.7°, respectively. The average needle insertion error with MR guidance was slightly smaller than with the conventional method (8.4 ± 4.0 mm vs. 9.6 ± 5.1 mm, p = 0.091), particularly in the out-of-plane approach (9.6 ± 3.5 mm vs. 12.3 ± 4.6 mm, p = 0.003). The procedural time was longer with MR guidance than with the conventional method (412 ± 134 s vs. 219 ± 66 s, p < 0.001). CONCLUSION MR needle guidance without pre-procedural CT image import is feasible when the coordinate systems are matched manually, and the accuracy of needle insertion is slightly better than that of the conventional method.
43
Feasibility and Accuracy of Thoracolumbar Pedicle Screw Placement Using an Augmented Reality Head Mounted Device. Sensors (Basel) 2022; 22:522. [PMID: 35062483 PMCID: PMC8779462 DOI: 10.3390/s22020522]
Abstract
Background: To investigate the accuracy of augmented reality (AR) navigation using the Magic Leap head-mounted device (HMD), pedicle screws were placed minimally invasively in four spine phantoms. Methods: AR navigation was provided by a conventional navigation system integrated with the Magic Leap HMD (AR-HMD). Forty-eight screws were planned and inserted into Th11-L4 of the phantoms using the AR-HMD and navigated instruments. Postprocedural CT scans were used to grade the technical (deviation from the plan) and clinical (Gertzbein grade) accuracy of the screws, and the time for each screw placement was recorded. Results: The mean deviation between the navigation plan and the screw position was 1.9 ± 0.7 mm (1.9 [0.3–4.1] mm) at the entry point and 1.4 ± 0.8 mm (1.2 [0.1–3.9] mm) at the screw tip. The angular deviation was 3.0 ± 1.4° (2.7 [0.4–6.2]°), and the mean time for screw placement was 130 ± 55 s (108 [58–437] s). The clinical accuracy was 94% according to the Gertzbein grading scale. Conclusion: Combining an AR-HMD with a conventional navigation system for accurate minimally invasive screw placement is feasible and can exploit the benefits of AR from the surgeon's perspective with the reliability of a conventional navigation system.
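The technical accuracy measures reported above (entry-point deviation, tip deviation, and angular deviation between planned and actual trajectories) follow directly from the planned and postprocedural entry/tip coordinates. The sketch below is illustrative, not the study's code, and assumes all points are already expressed in a common (e.g., postprocedural CT) frame:

```python
import numpy as np

def screw_deviation(plan_entry, plan_tip, act_entry, act_tip):
    """Technical accuracy of a navigated screw versus its plan:
    Euclidean deviations at the entry point and tip (mm) and the
    angle between the planned and actual trajectories (degrees)."""
    plan_entry, plan_tip = np.asarray(plan_entry, float), np.asarray(plan_tip, float)
    act_entry, act_tip = np.asarray(act_entry, float), np.asarray(act_tip, float)
    entry_dev = np.linalg.norm(act_entry - plan_entry)
    tip_dev = np.linalg.norm(act_tip - plan_tip)
    # Angle between the two trajectory direction vectors
    u = plan_tip - plan_entry
    v = act_tip - act_entry
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    ang_dev = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return entry_dev, tip_dev, ang_dev
```

Clamping the cosine into [-1, 1] before `arccos` guards against floating-point round-off when the two trajectories are nearly parallel.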
44
Bori E, Pancani S, Vigliotta S, Innocenti B. Validation and accuracy evaluation of automatic segmentation for knee joint pre-planning. Knee 2021; 33:275-281. [PMID: 34739958 DOI: 10.1016/j.knee.2021.10.016]
Abstract
BACKGROUND Proper use of three-dimensional (3D) models generated from medical imaging data in clinical preoperative planning, training, and consultation depends on first demonstrating that they accurately replicate the patient's anatomy. Therefore, this study investigated the dimensional accuracy of 3D reconstructions of the knee joint generated from computed tomography scans via automatic segmentation by comparing them with 3D models generated through manual segmentation. METHODS Three unpaired, fresh-frozen right legs were investigated. Three-dimensional models of the femur and tibia of each leg were manually segmented using commercial software and compared, in terms of geometrical accuracy, with 3D models automatically segmented using proprietary software. Bony landmarks were identified and used to calculate clinically relevant distances: femoral epicondylar distance; posterior femoral epicondylar distance; femoral trochlear groove length; and tibial knee center tubercle distance (TKCTD). Pearson's correlation coefficient and Bland–Altman plots were used to evaluate the level of agreement between the measured distances. RESULTS Differences between parameters measured on manually and automatically segmented 3D models were below 1 mm (range: -0.06 to 0.72 mm), except for TKCTD (between 1.00 and 1.40 mm in two specimens), and there was a significant, strong correlation between measurements. CONCLUSIONS The results are comparable to those of previous studies investigating the accuracy of bone 3D reconstruction. Automatic segmentation techniques can be used to quickly reconstruct reliable 3D models of bone anatomy, and these results may help extend the adoption of this technology in preoperative and operative settings, where it has shown considerable potential.
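The agreement analysis used above (Pearson correlation plus Bland–Altman limits of agreement) reduces to a few lines. This is an illustrative sketch, not the study's code, using the standard bias ± 1.96 SD formulation of the 95% limits of agreement:

```python
import numpy as np

def bland_altman(x, y):
    """Bland–Altman agreement between two measurement methods:
    returns the mean bias and the 95% limits of agreement
    (bias ± 1.96 × SD of the paired differences)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    diff = x - y
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample SD of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

def pearson_r(x, y):
    """Pearson's correlation coefficient for the same paired measurements."""
    return np.corrcoef(np.asarray(x, float), np.asarray(y, float))[0, 1]
```

Note the complementary roles of the two statistics: a high Pearson's r shows the methods co-vary, while the Bland–Altman bias and limits quantify how far individual paired measurements actually disagree.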
Affiliation(s)
- Edoardo Bori
- BEAMS Department, Université Libre de Bruxelles, Bruxelles, Belgium.
45
Augmented Reality (AR) in Orthopedics: Current Applications and Future Directions. Curr Rev Musculoskelet Med 2021; 14:397-405. [PMID: 34751894 DOI: 10.1007/s12178-021-09728-1]
Abstract
PURPOSE OF REVIEW Imaging technologies (X-ray, CT, MRI, and ultrasound) have revolutionized orthopedic surgery, allowing for more efficient diagnosis, monitoring, and treatment of musculoskeletal ailments. The current review investigates recent literature on the impact of augmented reality (AR) imaging technologies on orthopedic surgery, in particular their effect on provider cognitive burden, operative times, occupational radiation exposure, and surgical precision and outcomes. RECENT FINDINGS Many AR technologies have been shown to lower provider cognitive burden and reduce operative time and radiation exposure while improving surgical precision in pre-clinical cadaveric and sawbones models. So far, only a few platforms focusing on pedicle screw placement have been approved by the FDA. These technologies have been implemented clinically with mixed results when compared to traditional free-hand approaches. It remains to be seen whether current AR technologies can deliver on their many promises, and their ability to do so seems contingent upon continued technological progress. Additionally, the impact of these platforms will likely be highly conditional on clinical indication and provider type. It remains unclear whether AR will be broadly accepted and utilized or reserved for niche indications where it adds significant value. One thing is clear: orthopedics' high utilization of pre- and intraoperative imaging, combined with the relative ease of tracking rigid structures like bone compared to soft tissues, has made it the clear beachhead market for AR technologies in medicine.
46
Augmented and virtual reality in spine surgery, current applications and future potentials. Spine J 2021; 21:1617-1625. [PMID: 33774210 DOI: 10.1016/j.spinee.2021.03.018]
Abstract
BACKGROUND CONTEXT The field of artificial intelligence (AI) is rapidly advancing, especially with recent improvements in deep learning (DL) techniques. Augmented reality (AR) and virtual reality (VR) are finding their place in healthcare, and spine surgery is no exception. The unique capabilities and advantages of AR and VR devices include their low cost, flexible integration with other technologies, user-friendly features, and their application in navigation systems, which makes them beneficial across different aspects of spine surgery. Despite the use of AR for pedicle screw placement, targeted cervical foraminotomy, bone biopsy, osteotomy planning, and percutaneous intervention, the current applications of AR and VR in spine surgery remain limited. PURPOSE The primary goal of this study was to provide spine surgeons and clinical researchers with general information about the current applications, future potential, and accessibility of AR and VR systems in spine surgery. STUDY DESIGN/SETTING We reviewed the titles of more than 250 journal papers from Google Scholar and PubMed with the search words augmented reality, virtual reality, spine surgery, and orthopaedic, of which 89 related papers were selected for abstract review. Finally, the full texts of 67 papers were analyzed and reviewed. METHODS The papers were divided into four groups: technological papers, applications in surgery, applications in spine education and training, and general applications in orthopaedics. A team of two reviewers performed the paper reviews and a thorough web search to ensure that the most updated state of the art in each of the four groups was captured in the review. RESULTS In this review we discuss the current state of the art in AR and VR hardware and their preoperative and surgical applications in spine surgery. Finally, we discuss the future potential of AR and VR and their integration with AI, robotic surgery, gaming, and wearables. CONCLUSIONS AR and VR are promising technologies that will soon become part of the standard of care in spine surgery.
Collapse
|
47
|
Hersh A, Mahapatra S, Weber-Levine C, Awosika T, Theodore JN, Zakaria HM, Liu A, Witham TF, Theodore N. Augmented Reality in Spine Surgery: A Narrative Review. HSS J 2021; 17:351-358. [PMID: 34539277 PMCID: PMC8436352 DOI: 10.1177/15563316211028595] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
Augmented reality (AR) navigation refers to novel technologies that superimpose images, such as radiographs and navigation pathways, onto a view of the operative field. The development of AR navigation has focused on improving the safety and efficacy of neurosurgical and orthopedic procedures. In this review, the authors focus on 3 types of AR technology used in spine surgery: AR surgical navigation, microscope-mediated heads-up display, and AR head-mounted displays. Microscope AR and head-mounted displays offer the advantage of reducing attention shift and line-of-sight interruptions inherent in traditional navigation systems. With the U.S. Food and Drug Administration's recent clearance of the XVision AR system (Augmedics, Arlington Heights, IL), the adoption and refinement of AR technology by spine surgeons will only accelerate.
Collapse
Affiliation(s)
- Andrew Hersh
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
| | - Smruti Mahapatra
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
| | - Carly Weber-Levine
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
| | - Tolulope Awosika
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
| | | | - Hesham M Zakaria
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
| | - Ann Liu
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
| | - Timothy F Witham
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
| | - Nicholas Theodore
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
| |
Collapse
|
48
|
Augmented reality-navigated pedicle screw placement: a cadaveric pilot study. Eur Spine J 2021; 30:3731-3737. [PMID: 34350487 DOI: 10.1007/s00586-021-06950-w] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/08/2020] [Revised: 11/04/2020] [Accepted: 07/25/2021] [Indexed: 10/20/2022]
Abstract
PURPOSE Augmented reality (AR) is an emerging technology with great potential for surgical navigation through its ability to provide 3D holographic projection of otherwise hidden anatomical information. This pilot cadaver study investigated the feasibility and accuracy of one of the first holographic navigation techniques for lumbar pedicle screw placement. METHODS Lumbar computed tomography (CT) scans of two cadaver specimens and their reconstructed 3D models were used for pedicle screw trajectory planning. Planned trajectories and 3D models were subsequently uploaded to an AR head-mounted device. K-wires were randomly placed into either the left or the right pedicle of each vertebra (L1-5), with or without AR navigation (by holographic projection of the planned trajectory). CT scans were subsequently performed to assess the accuracy of both techniques. RESULTS A total of 18 k-wires could be placed (8 navigated, 10 freehand) by two experienced spine surgeons. In two vertebrae, the AR navigation was aborted because the registration of the preoperative plan with the intraoperative anatomy was imprecise due to a technical failure. The average differences of the screw entry points between planning and execution were 4.74 ± 2.37 mm in the freehand technique and 5.99 ± 3.60 mm in the AR-navigated technique (p = 0.39). The average deviation from the planned trajectories was 11.21° ± 7.64° in the freehand technique and 5.88° ± 3.69° in the AR-navigated technique (p = 0.09). CONCLUSION This pilot study demonstrates improved angular precision in one of the first AR-navigated pedicle screw placement studies worldwide. Technical shortcomings need to be eliminated before potential clinical applications.
Collapse
|
49
|
Yanni DS, Ozgur BM, Louis RG, Shekhtman Y, Iyer RR, Boddapati V, Iyer A, Patel PD, Jani R, Cummock M, Herur-Raman A, Dang P, Goldstein IM, Brant-Zawadzki M, Steineke T, Lenke LG. Real-time navigation guidance with intraoperative CT imaging for pedicle screw placement using an augmented reality head-mounted display: a proof-of-concept study. Neurosurg Focus 2021; 51:E11. [PMID: 34333483 DOI: 10.3171/2021.5.focus21209] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/01/2021] [Accepted: 05/17/2021] [Indexed: 11/06/2022]
Abstract
OBJECTIVE Augmented reality (AR) has the potential to improve the accuracy and efficiency of instrumentation placement in spinal fusion surgery, increasing patient safety and outcomes, optimizing ergonomics in the surgical suite, and ultimately lowering procedural costs. The authors sought to describe the use of a commercial prototype Spine AR platform (SpineAR) that provides a commercial AR head-mounted display (ARHMD) user interface for navigation-guided spine surgery, incorporating real-time navigation images from intraoperative imaging with a 3D-reconstructed model in the surgeon's field of view, and to assess screw placement accuracy via this method. METHODS Pedicle screw placement accuracy was assessed and compared with literature-reported data of the freehand (FH) technique. Accuracy with SpineAR was also compared between participants of varying spine surgical experience. Eleven operators without prior experience with AR-assisted pedicle screw placement took part in the study: 5 attending neurosurgeons and 6 trainees (1 neurosurgical fellow, 1 senior orthopedic resident, 3 neurosurgical residents, and 1 medical student). Commercially available 3D-printed lumbar spine models were utilized as surrogates of human anatomy. Among the operators, a total of 192 screws were instrumented bilaterally from L2-5 using SpineAR in 24 lumbar spine models. All but one trainee also inserted 8 screws using the FH method. In addition to accuracy scoring using the Gertzbein-Robbins grading scale, axial trajectory was assessed, and user feedback on experience with SpineAR was collected. RESULTS Based on the Gertzbein-Robbins grading scale, the overall screw placement accuracy using SpineAR among all users was 98.4% (192 screws). Accuracy for attendings and trainees was 99.1% (112 screws) and 97.5% (80 screws), respectively. Accuracy rates were higher compared with literature-reported lumbar screw placement accuracy using FH for attendings (99.1% vs 94.32%; p = 0.0212) and for all users (98.4% vs 94.32%; p = 0.0099). The percentage of total inserted screws with a minimum of 5° medial angulation was 100%. No differences were observed between attendings and trainees or between the two methods. User feedback on SpineAR was generally positive. CONCLUSIONS Screw placement was feasible and accurate using SpineAR, an ARHMD platform with real-time navigation guidance that provided a favorable surgeon-user experience.
Collapse
Affiliation(s)
- Daniel S Yanni
- 1Pickup Family Neurosciences Institute, Hoag Memorial Hospital Presbyterian, Newport Beach; and 2Disc Comfort, Inc., Newport Beach, California
| | - Burak M Ozgur
- 1Pickup Family Neurosciences Institute, Hoag Memorial Hospital Presbyterian Newport Beach; and
| | - Robert G Louis
- 1Pickup Family Neurosciences Institute, Hoag Memorial Hospital Presbyterian Newport Beach; and
| | - Yevgenia Shekhtman
- 3Neuroscience Institute, Hackensack Meridian JFK Medical Center, Edison; and
| | - Rajiv R Iyer
- 4Department of Orthopedic Surgery, Columbia University; and
| | | | - Asha Iyer
- 3Neuroscience Institute, Hackensack Meridian JFK Medical Center, Edison; and
| | - Purvee D Patel
- 5Department of Neurological Surgery, Rutgers New Jersey Medical School, Newark, New Jersey
| | - Raja Jani
- 5Department of Neurological Surgery, Rutgers New Jersey Medical School, Newark, New Jersey
| | - Matthew Cummock
- 5Department of Neurological Surgery, Rutgers New Jersey Medical School, Newark, New Jersey
| | - Aalap Herur-Raman
- 6George Washington University School of Medicine, Washington, DC; and
| | | | - Ira M Goldstein
- 5Department of Neurological Surgery, Rutgers New Jersey Medical School, Newark, New Jersey
| | - Michael Brant-Zawadzki
- 1Pickup Family Neurosciences Institute, Hoag Memorial Hospital Presbyterian Newport Beach; and
| | - Thomas Steineke
- 3Neuroscience Institute, Hackensack Meridian JFK Medical Center, Edison; and
| | - Lawrence G Lenke
- 4Department of Orthopedic Surgery, Columbia University; and 8Department of Neurological Surgery, NewYork-Presbyterian/Allen Hospital, New York, New York
| |
Collapse
|
50
|
Ivan ME, Eichberg DG, Di L, Shah AH, Luther EM, Lu VM, Komotar RJ, Urakov TM. Augmented reality head-mounted display-based incision planning in cranial neurosurgery: a prospective pilot study. Neurosurg Focus 2021; 51:E3. [PMID: 34333466 DOI: 10.3171/2021.5.focus20735] [Citation(s) in RCA: 21] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/12/2020] [Accepted: 05/13/2021] [Indexed: 11/06/2022]
Abstract
OBJECTIVE Monitor and wand-based neuronavigation stations (MWBNSs) for frameless intraoperative neuronavigation are routinely used in cranial neurosurgery. However, they are temporally and spatially cumbersome: the OR must be arranged around the MWBNS, at least one hand must be used to manipulate the MWBNS wand (interrupting a bimanual surgical technique), and the surgical workflow is interrupted as the surgeon stops to "check the navigation" on a remote monitor. Thus, there is a need for continuous, real-time, hands-free neuronavigation solutions. Augmented reality (AR) is poised to streamline these issues. The authors present the first reported prospective pilot study investigating the feasibility of using the OpenSight application with an AR head-mounted display to map out the borders of tumors in patients undergoing elective craniotomy for tumor resection, and to compare the degree of correspondence with MWBNS tracing. METHODS Eleven consecutive patients undergoing elective craniotomy for brain tumor resection were prospectively identified and underwent circumferential tumor border tracing at the time of incision planning by a surgeon wearing HoloLens AR glasses running the commercially available OpenSight application registered to the patient and preoperative MRI. Then, the same patient underwent circumferential tumor border tracing using the StealthStation S8 MWBNS. Postoperatively, both tumor border tracings were compared by two blinded board-certified neurosurgeons and rated as having an excellent, adequate, or poor correspondence degree based on a subjective sense of the overlap. Objective overlap area measurements were also determined. RESULTS Eleven patients undergoing craniotomy were included in the study. Five patient procedures were rated as having an excellent correspondence degree, 5 had an adequate correspondence degree, and 1 had poor correspondence. Both raters agreed on the rating in all cases. AR tracing was possible in all cases. CONCLUSIONS In this small pilot study, the authors found that AR was implementable in the workflow of a neurosurgery OR and was a feasible method of preoperative tumor border identification for incision planning. Future studies are needed to identify strategies to improve and optimize AR accuracy.
Collapse
Affiliation(s)
- Michael E Ivan
- 1Department of Neurological Surgery, University of Miami Miller School of Medicine; and 2Sylvester Comprehensive Cancer Center, Miami, Florida
| | - Daniel G Eichberg
- 1Department of Neurological Surgery, University of Miami Miller School of Medicine; and
| | - Long Di
- 1Department of Neurological Surgery, University of Miami Miller School of Medicine; and
| | - Ashish H Shah
- 1Department of Neurological Surgery, University of Miami Miller School of Medicine; and
| | - Evan M Luther
- 1Department of Neurological Surgery, University of Miami Miller School of Medicine; and
| | - Victor M Lu
- 1Department of Neurological Surgery, University of Miami Miller School of Medicine; and
| | - Ricardo J Komotar
- 1Department of Neurological Surgery, University of Miami Miller School of Medicine; and 2Sylvester Comprehensive Cancer Center, Miami, Florida
| | | |
Collapse
|