1
Cho JS, Jotwani R, Chan S, Thaker DM, On JD, Yong RJ, Hao D. Extended reality navigation for pain procedures: a narrative review. Reg Anesth Pain Med 2024:rapm-2024-105352. [PMID: 38754990] [DOI: 10.1136/rapm-2024-105352]
Abstract
BACKGROUND Extended reality (XR) technology, encompassing virtual reality, augmented reality, and mixed reality, has been widely studied for procedural navigation in surgical specialties. Similar to how ultrasound transformed regional anesthesia, XR has the potential to reshape how anesthesiologists and pain physicians perform procedures to relieve pain. OBJECTIVE This narrative review examines the clinical benefits of XR for navigation in various pain procedures. It defines key terms and concepts related to XR technology and explores characteristics of procedures that are most amenable to XR-based navigation. Finally, it suggests best practices for developing XR navigation systems and discusses the role of emerging technology in the future of XR in regional anesthesia and pain medicine. EVIDENCE REVIEW A search was performed across PubMed, Embase, and Cochrane Central Register of Controlled Trials for primary literature investigating the clinical benefits of XR navigation for pain procedures. FINDINGS Thirteen studies using XR for procedural navigation are included. The evidence includes randomized controlled trials, retrospective studies, and case series. CONCLUSIONS Early randomized controlled trials show potential for XR to improve procedural efficiency, but more comprehensive research is needed to determine if there are significant clinical benefits. Case reports demonstrate XR's utility in generating patient-specific navigation plans when difficult anatomy is encountered. Procedures that facilitate the generation and registration of XR images are most conducive to XR navigation, whereas those that rely on frequent re-imaging will continue to depend on traditional modes of navigation.
Affiliation(s)
- James Sungjai Cho
- Department of Anesthesia, Critical Care and Pain Medicine, Massachusetts General Hospital, Boston, Massachusetts, USA
- Harvard Medical School, Boston, Massachusetts, USA
- Rohan Jotwani
- Department of Anesthesiology, Weill Cornell Medicine, New York, New York, USA
- Devaunsh Manish Thaker
- Department of Anesthesiology, Perioperative Care & Pain Medicine, NYU Langone Health, New York, New York, USA
- Jungmin Daniel On
- Department of Anesthesiology, Rush University Medical Center, Chicago, Illinois, USA
- R Jason Yong
- Department of Anesthesiology, Perioperative and Pain Medicine, Brigham and Women's Hospital, Boston, Massachusetts, USA
- Harvard Medical School, Boston, Massachusetts, USA
- David Hao
- Department of Anesthesia, Critical Care and Pain Medicine, Massachusetts General Hospital, Boston, Massachusetts, USA
- Harvard Medical School, Boston, Massachusetts, USA
2
Bui T, Ruiz-Cardozo MA, Dave HS, Barot K, Kann MR, Joseph K, Lopez-Alviar S, Trevino G, Brehm S, Yahanda AT, Molina CA. Virtual, Augmented, and Mixed Reality Applications for Surgical Rehearsal, Operative Execution, and Patient Education in Spine Surgery: A Scoping Review. Medicina (Kaunas) 2024;60:332. [PMID: 38399619] [PMCID: PMC10890632] [DOI: 10.3390/medicina60020332]
Abstract
Background and Objectives: Advances in virtual reality (VR), augmented reality (AR), and mixed reality (MR) technologies have resulted in their increased application across many medical specialties. VR's main application has been for teaching and preparatory roles, while AR has been mostly used as a surgical adjunct. The objective of this study is to discuss the various applications and prospects for VR, AR, and MR specifically as they relate to spine surgery. Materials and Methods: A systematic review was conducted to examine the current applications of VR, AR, and MR with a focus on spine surgery. A literature search of two electronic databases (PubMed and Scopus) was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). The study quality was assessed using the MERSQI score for educational research studies, QUACS for cadaveric studies, and the JBI critical appraisal tools for clinical studies. Results: A total of 228 articles were identified in the primary literature review. Following title/abstract screening and full-text review, 46 articles were included in the review. These articles comprised nine studies performed in artificial models, nine cadaveric studies, four clinical case studies, nineteen clinical case series, one clinical case-control study, and four clinical parallel control studies. Teaching applications utilizing holographic overlays are the most intensively studied aspect of AR/VR; the most simulated surgical procedure is pedicle screw placement. Conclusions: VR provides a reproducible and robust medium for surgical training through surgical simulations and for patient education through various platforms. Existing AR/MR platforms enhance the accuracy and precision of spine surgeries and show promise as a surgical adjunct.
Affiliation(s)
- Tim Bui
- Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Miguel A. Ruiz-Cardozo
- Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Harsh S. Dave
- Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Karma Barot
- Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Michael Ryan Kann
- Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- University of Pittsburgh School of Medicine, Pittsburgh, PA 15261, USA
- Karan Joseph
- Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Sofia Lopez-Alviar
- Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Gabriel Trevino
- Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Samuel Brehm
- Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Alexander T. Yahanda
- Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Camilo A Molina
- Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
3
Schwendner M, Ille S, Wostrack M, Meyer B. Evaluating a cutting-edge augmented reality-supported navigation system for spinal instrumentation. Eur Spine J 2024;33:282-288. [PMID: 37962688] [DOI: 10.1007/s00586-023-08011-w]
Abstract
OBJECTIVE Dorsal instrumentation using pedicle screws is a standard treatment for multiple spinal pathologies, such as trauma, infection, or degenerative indications. Intraoperative three-dimensional (3D) imaging and navigated pedicle screw placement are used at multiple centers. In the present study, we evaluated a new navigation system enabling augmented reality (AR)-supported pedicle screw placement while integrating navigation cameras into the reference array and drill guide. The study aimed to evaluate the clinical application of this technique with regard to safety, efficacy, and accuracy. METHODS A total of 20 patients were operated on between 06/2021 and 01/2022 using the new technique for intraoperative navigation. Intraoperative data, with a focus on accuracy and patient safety, including patient outcome, were analyzed. The accuracy of pedicle screw placement was evaluated by intraoperative CT imaging. RESULTS A median of 8 (range 4-18) pedicle screws were placed in each case. Percutaneous instrumentation was performed in 14 patients (70%). The duration of pedicle screw placement (scan-to-scan time) was 56 ± 26 (30-107) min. Intraoperative screw revision was necessary for 3 of 180 pedicle screws (1.7%). No major intraoperative complications occurred; one case of delay due to software issues and one case of difficult screw placement were reported. CONCLUSION The results support the use of the present AR-supported system for navigated pedicle screw placement for dorsal instrumentation in clinical routine. It provides a reliable and safe tool for 3D imaging-based pedicle screw placement, requires only a minimal intraoperative setup, and opens new opportunities by integrating AR.
Affiliation(s)
- Maximilian Schwendner
- Department of Neurosurgery, Klinikum Rechts der Isar, Technische Universität München, Ismaninger Str. 22, 81675, Munich, Germany
- TUM Neuroimaging Center, School of Medicine, Klinikum Rechts der Isar, Technical University of Munich, Munich, Germany
- Sebastian Ille
- Department of Neurosurgery, Klinikum Rechts der Isar, Technische Universität München, Ismaninger Str. 22, 81675, Munich, Germany
- TUM Neuroimaging Center, School of Medicine, Klinikum Rechts der Isar, Technical University of Munich, Munich, Germany
- Maria Wostrack
- Department of Neurosurgery, Klinikum Rechts der Isar, Technische Universität München, Ismaninger Str. 22, 81675, Munich, Germany
- Bernhard Meyer
- Department of Neurosurgery, Klinikum Rechts der Isar, Technische Universität München, Ismaninger Str. 22, 81675, Munich, Germany
4
Shahzad H, Bhatti NS, Phillips FM, Khan SN. Applications of Augmented Reality in Orthopaedic Spine Surgery. J Am Acad Orthop Surg 2023;31:e601-e609. [PMID: 37105182] [DOI: 10.5435/jaaos-d-23-00023]
Abstract
The application of augmented reality (AR) in surgical settings has primarily been as a navigation tool in the operating room because of its ease of use and minimal effect on surgical workflow. The surgeon can directly face the surgical field while viewing 3D anatomy virtually, reducing the need to look at an external display such as a navigation system. Applications of AR are being explored in spine surgery. The basic principles of AR include data preparation, registration, tracking, and visualization. Current literature provides sufficient preclinical and clinical evidence for the use of AR technology in spine surgery. AR systems are efficient assistive devices, providing greater accuracy for insertion points, more comfort for surgeons, and reduced operating time. AR technology also has beneficial applications in surgical training, education, and telementorship for spine surgery. However, the costs associated with specially designed imaging equipment and physicians' comfort in using this technology remain barriers to adoption. As the technology evolves toward more widespread use, future applications will be directed by the cost-effectiveness of AR-assisted surgeries.
Affiliation(s)
- Hania Shahzad
- Department of Orthopedics, The Ohio State University, Wexner Medical Center, Columbus, OH (Shahzad, Bhatti, and Khan) and Department of Orthopedics, Rush University Medical Center, Chicago, IL (Phillips)
5
Bounajem MT, Cameron B, Sorensen K, Parr R, Gibby W, Prashant G, Evans JJ, Karsy M. Improved Accuracy and Lowered Learning Curve of Ventricular Targeting Using Augmented Reality-Phantom and Cadaveric Model Testing. Neurosurgery 2023;92:884-891. [PMID: 36562619] [DOI: 10.1227/neu.0000000000002293]
Abstract
BACKGROUND Augmented reality (AR) has demonstrated significant potential in neurosurgical cranial, spine, and teaching applications. External ventricular drain (EVD) placement remains a common procedure, but with targeting error rates between 10% and 40%. OBJECTIVE To evaluate the Novarad VisAR guidance system for the placement of EVDs in phantom and cadaveric models. METHODS Two synthetic ventricular phantom models and a third cadaver model underwent computerized tomography (CT) imaging and registration with the VisAR system (Novarad). Root mean square (RMS) error, angular error (γ), and Euclidean distance were measured by multiple methods for various standard EVD placements. RESULTS CT measurements on a phantom model (0.5-mm targets) showed a mean Euclidean distance error of 1.20 ± 0.98 mm and γ of 1.25° ± 1.02°. Eight participants placed EVDs in lateral and occipital burr holes using VisAR in a second phantom anatomic ventricular model (mean RMS: 3.9 ± 1.8 mm, γ: 3.95° ± 1.78°). There were no statistically significant differences in accuracy by postgraduate year level, prior AR experience, prior EVD experience, or experience with video games (P > .05). In comparing EVDs placed with anatomic landmarks vs VisAR navigation in a cadaver, VisAR demonstrated significantly better RMS and γ (7.47 ± 0.94 mm and 7.12° ± 0.97°, respectively; P ≤ .05). CONCLUSION The novel VisAR AR system resulted in accurate placement of EVDs with a rapid learning curve, which may improve clinical treatment and patient safety. Future applications of VisAR can be expanded to other cranial procedures.
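The radial and angular errors reported in these phantom studies are simple vector computations on CT-derived coordinates. As an illustrative sketch (not the authors' code; the coordinates in the usage example are hypothetical), the two metrics can be computed as:

```python
import numpy as np

def trajectory_errors(planned_tip, planned_entry, actual_tip, actual_entry):
    """Return (radial_error_mm, angle_error_deg) between a planned and an
    actual needle trajectory, from tip and entry coordinates (e.g. from CT)."""
    p_tip, p_ent, a_tip, a_ent = (np.asarray(x, dtype=float)
                                  for x in (planned_tip, planned_entry,
                                            actual_tip, actual_entry))
    radial = float(np.linalg.norm(a_tip - p_tip))        # Euclidean tip offset
    v_planned, v_actual = p_tip - p_ent, a_tip - a_ent   # trajectory vectors
    cos_t = np.dot(v_planned, v_actual) / (
        np.linalg.norm(v_planned) * np.linalg.norm(v_actual))
    angle = float(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))
    return radial, angle

# hypothetical example: a 1-mm lateral miss at 85-mm depth
radial_mm, angle_deg = trajectory_errors((0, 0, 0), (0, 0, 85),
                                         (1, 0, 0), (0, 0, 85))
```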
Affiliation(s)
- Michael T Bounajem
- Department of Neurosurgery, Clinical Neurosciences Center, University of Utah, Salt Lake City, Utah, USA
- Wendell Gibby
- Novarad, Provo, Utah, USA
- Department of Radiology, University of California-San Diego, San Diego, California, USA
- Giyarpuram Prashant
- Department of Neurosurgery, Thomas Jefferson University Hospital, Philadelphia, Pennsylvania, USA
- James J Evans
- Department of Neurosurgery, Thomas Jefferson University Hospital, Philadelphia, Pennsylvania, USA
- Michael Karsy
- Department of Neurosurgery, Clinical Neurosciences Center, University of Utah, Salt Lake City, Utah, USA
6
Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023;85:102757. [PMID: 36706637] [DOI: 10.1016/j.media.2023.102757]
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main player in the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016, until the year of 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy including use case, technical methodology for registration and tracking, data sources, visualization as well as validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, where AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interactions, usability and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, nevertheless only few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, to pave the way for novel, innovative directions and translation into the medical routine.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
7
Remote Interactive Surgery Platform (RISP): Proof of Concept for an Augmented-Reality-Based Platform for Surgical Telementoring. J Imaging 2023;9:jimaging9030056. [PMID: 36976107] [PMCID: PMC10054087] [DOI: 10.3390/jimaging9030056]
Abstract
The “Remote Interactive Surgery Platform” (RISP) is an augmented reality (AR)-based platform for surgical telementoring. It builds upon recent advances of mixed reality head-mounted displays (MR-HMD) and associated immersive visualization technologies to assist the surgeon during an operation. It enables an interactive, real-time collaboration with a remote consultant by sharing the operating surgeon’s field of view through the Microsoft (MS) HoloLens2 (HL2). Development of the RISP started during the Medical Augmented Reality Summer School 2021 and is currently still ongoing. It currently includes features such as three-dimensional annotations, bidirectional voice communication and interactive windows to display radiographs within the sterile field. This manuscript provides an overview of the RISP and preliminary results regarding its annotation accuracy and user experience measured with ten participants.
8
Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023;68. [PMID: 36580681] [DOI: 10.1088/1361-6560/acaf23]
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these AR systems in different surgical fields. AR visualization falls into two categories: in situ visualization and non-in situ visualization. The rendered content of AR visualization varies widely. The registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. The tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in surgical fields. However, most AR applications have been evaluated through model and animal experiments, with relatively few clinical experiments, indicating that current AR navigation methods are still at an early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as future development trends. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
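Of the registration families catalogued in this review, point-based (paired fiducial) registration is the most widely used. Below is a minimal sketch of the standard SVD-based (Kabsch) solution, together with the usual fiducial registration error (FRE) metric; this illustrates the general technique, not code from the paper:

```python
import numpy as np

def paired_point_registration(src, dst):
    """Rigid transform (R, t) minimizing ||R @ src_i + t - dst_i|| over
    paired fiducials, via the SVD-based (Kabsch) method. src, dst: (N, 3)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

def fiducial_registration_error(src, dst, R, t):
    """Root-mean-square residual after registration (the usual FRE)."""
    res = (np.asarray(src, float) @ R.T + t) - np.asarray(dst, float)
    return float(np.sqrt((res ** 2).sum(axis=1).mean()))
```

In a navigation system, `src` would be fiducial positions in image space and `dst` their tracked positions in patient space; the recovered (R, t) maps preoperative annotations onto the surgical field.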
Affiliation(s)
- Longfei Ma
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Tianqi Huang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Jie Wang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
9
Fan X, Zhu Q, Tu P, Joskowicz L, Chen X. A review of advances in image-guided orthopedic surgery. Phys Med Biol 2023;68. [PMID: 36595258] [DOI: 10.1088/1361-6560/acaae9]
Abstract
Orthopedic surgery remains technically demanding due to the complex anatomical structures and cumbersome surgical procedures. The introduction of image-guided orthopedic surgery (IGOS) has significantly decreased the surgical risk and improved the operation results. This review focuses on the application of recent advances in artificial intelligence (AI), deep learning (DL), augmented reality (AR) and robotics in image-guided spine surgery, joint arthroplasty, fracture reduction and bone tumor resection. For the pre-operative stage, key technologies of AI and DL based medical image segmentation, 3D visualization and surgical planning procedures are systematically reviewed. For the intra-operative stage, the development of novel image registration, surgical tool calibration and real-time navigation are reviewed. Furthermore, the combination of the surgical navigation system with AR and robotic technology is also discussed. Finally, the current issues and prospects of the IGOS system are discussed, with the goal of establishing a reference and providing guidance for surgeons, engineers, and researchers involved in the research and development of this area.
Affiliation(s)
- Xingqi Fan
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Qiyang Zhu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Leo Joskowicz
- School of Computer Science and Engineering, The Hebrew University of Jerusalem, Jerusalem, Israel
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, People's Republic of China
10
Chegini S, Edwards E, McGurk M, Clarkson M, Schilling C. Systematic review of techniques used to validate the registration of augmented-reality images using a head-mounted device to navigate surgery. Br J Oral Maxillofac Surg 2023;61:19-27. [PMID: 36513525] [DOI: 10.1016/j.bjoms.2022.08.007]
Abstract
Augmented-reality (AR) head-mounted devices (HMDs) allow the wearer to have digital images superimposed on their field of vision. They are being used to superimpose annotations onto the surgical field, akin to a navigation system. This review examines published validation studies of HMD-AR systems, their reported protocols, and their outcomes, with the aim of establishing commonalities and an acceptable registration outcome. Multiple databases were systematically searched for relevant articles published between January 2015 and January 2021. Studies that examined the registration of AR content using an HMD to guide surgery were eligible for inclusion. The country of origin, year of publication, medical specialty, HMD device, software, and method of registration were recorded, and a meta-analysis of the mean registration error was conducted. A total of 4784 papers were identified, of which 23 met the inclusion criteria. They included studies using the HoloLens (Microsoft) (n = 22) and the nVisor ST60 (NVIS Inc) (n = 1). Sixty-six per cent of studies were in hard tissue specialties. Eleven studies reported registration errors using pattern markers (mean (SD) 2.6 (1.8) mm), four using surface markers (mean (SD) 3.8 (3.7) mm), and three using manual alignment (mean (SD) 2.2 (1.3) mm). The majority of studies used in-house software with a variety of registration methods and reported errors. The mean registration error calculated in this study can be considered a minimum acceptable standard and should be taken into consideration when procedural applications are selected.
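The pooled mean (SD) registration errors quoted above follow from the standard formula for combining per-study summaries. A sketch, assuming each study reports (n, mean, SD); the study values in the example are hypothetical:

```python
import math

def pooled_mean_sd(groups):
    """Combine per-study (n, mean, sd) summaries into an overall mean and SD,
    using the exact group-combination formula (sums of squared deviations)."""
    n_tot = sum(n for n, _, _ in groups)
    mean = sum(n * m for n, m, _ in groups) / n_tot
    # total sum of squares = within-group part + between-group part
    ss = sum((n - 1) * sd ** 2 + n * (m - mean) ** 2 for n, m, sd in groups)
    return mean, math.sqrt(ss / (n_tot - 1))

# hypothetical per-study registration errors in mm: (n, mean, sd)
studies = [(10, 2.1, 0.8), (15, 3.0, 1.2), (8, 2.6, 0.5)]
pooled_mean, pooled_sd = pooled_mean_sd(studies)
```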
Affiliation(s)
- Soudeh Chegini
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Eddie Edwards
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Mark McGurk
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Matthew Clarkson
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Clare Schilling
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
11
Bagher Zadeh Ansari N, Léger É, Kersten-Oertel M. VentroAR: an augmented reality platform for ventriculostomy using the Microsoft HoloLens. Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization 2022. [DOI: 10.1080/21681163.2022.2156394]
Affiliation(s)
- Étienne Léger
- Department of Computer Science and Software Engineering, Concordia University, Montreal, QC, Canada
- Marta Kersten-Oertel
- Department of Computer Science and Software Engineering, Concordia University, Montreal, QC, Canada
- PERFORM Centre, Concordia University, Montreal, QC, Canada
12
Gibby W, Cvetko S, Gibby A, Gibby C, Sorensen K, Andrews EG, Maroon J, Parr R. The application of augmented reality-based navigation for accurate target acquisition of deep brain sites: advances in neurosurgical guidance. J Neurosurg 2022;137:489-495. [PMID: 34920422] [DOI: 10.3171/2021.9.jns21510]
Abstract
OBJECTIVE The objective of this study is to quantify the navigational accuracy of an advanced augmented reality (AR)-based guidance system for neurological surgery, biopsy, and/or other minimally invasive neurosurgical procedures. METHODS Five burr holes were drilled through a plastic cranium, and 5 optical fiducials (AprilTags) printed with CT-visible ink were placed on the frontal, temporal, and parietal bones of a human skull model. Three 0.5-mm-diameter targets were mounted in the interior of the skull on nylon posts near the level of the tentorium cerebelli and the pituitary fossa. The skull was filled with ballistic gelatin to simulate brain tissue. A CT scan was taken and virtual needle tracts were annotated on the preoperative 3D workstation for the combination of 3 targets and 5 access holes (15 target tracts). The resulting annotated study was uploaded to and launched by VisAR software operating on the HoloLens 2 holographic visor by viewing an encrypted, printed QR code assigned to the study by the preoperative workstation. The DICOM images were converted to 3D holograms and registered to the skull by aligning the holographic fiducials with the AprilTags attached to the skull. Five volunteers familiar with VisAR used the software/visor combination to navigate an 18-gauge needle/trocar through the series of burr holes to the targets, resulting in 70 data points (15 for 4 users and 10 for 1 user). After each attempt the needle was left in the skull, supported by the ballistic gelatin, and a high-resolution CT was taken. Radial error and angle of error were determined using vector coordinates. Summary statistics were calculated individually and collectively. RESULTS The combined angle of error was 2.30° ± 1.28°. The mean radial error for users was 3.62 ± 1.71 mm. The mean target depth was 85.41 mm.
CONCLUSIONS The mean radial error and angle of error, with the associated variance measures, demonstrate that VisAR navigation may have utility for guiding a small needle to neural lesions or targets within an accuracy of 3.62 mm. These values are sufficiently accurate for the navigation of many neurological procedures, such as ventriculostomy.
Affiliation(s)
- Wendell Gibby
- Department of Radiology, University of California, San Diego, California
- Novarad, American Fork, Utah
- Blue Rock Medical, Provo, Utah
- Edward G Andrews
- Department of Neurosurgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania
- Joseph Maroon
- Department of Neurosurgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania
13
Augmented Reality Spine Surgery Navigation: Increasing Pedicle Screw Insertion Accuracy for Both Open and Minimally Invasive Spine Surgeries. Spine (Phila Pa 1976) 2022; 47:865-872. [PMID: 35132049 DOI: 10.1097/brs.0000000000004338] [Citation(s) in RCA: 24] [Impact Index Per Article: 12.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 02/01/2023]
Abstract
STUDY DESIGN Collectively, seven cadavers were instrumented with 124 thoracolumbar pedicle screws using VisAR augmented reality guidance. Sixty-five screws were inserted into four donors using open dissection spine surgery. Fifty-nine screws were positioned in three donors with a minimally invasive spine surgery (MISS) procedure. For both open and MISS, VisAR was used exclusively for pedicle screw navigation. OBJECTIVE The objective of this study was to determine the accuracy of pedicle screw placement using VisAR for open spine and MISS procedures. SUMMARY OF BACKGROUND DATA Pedicle screw placement can be challenging depending on anatomical location and a surgeon's experience. AR may minimize fluoroscopy use and speed screw insertion. METHODS Prior to computed tomography (CT), a series of four image-visible AprilTag optical fiducials were attached to the backs of the donors. The resulting images were used preoperatively to plan virtual pedicle screw pathways, including entry point, trajectory, and depth. The study link was encrypted on a quick response (QR) code, printed, and viewed in the operating room (OR) by the surgeon using VisAR (HoloLens 2 headset). Viewing the code wirelessly uploads and launches the study, converting the DICOM data to holographic images, which register to the fiducials on the donor's back. The annotated pathways for each pedicle were called up by voice command, and the surgeon positioned each screw by aligning it with the virtual guidance hologram. RESULTS Overall, 124 pedicle screws were inserted with VisAR navigation with 96% accuracy (Gertzbein-Robbins grades A and B). The combined angle of error was 2.4° and the distance error was 1.9 mm. CONCLUSION Augmented reality is a highly accurate, emerging technology for navigating both open and minimally invasive spine surgery techniques with off-the-shelf headset hardware. LEVEL OF EVIDENCE N/A.
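The 96% accuracy figure above counts Gertzbein-Robbins grades A and B as clinically acceptable placements. A small sketch of that tally (the grade list below is hypothetical, not the study's screw-level data):

```python
from collections import Counter

def gr_accuracy(grades):
    """Percent of screws graded clinically acceptable on the
    Gertzbein-Robbins scale (grades A and B).

    Grade A: screw fully within the pedicle; B: cortical breach < 2 mm;
    C, D, E: breaches of < 4 mm, < 6 mm, and >= 6 mm, respectively.
    """
    counts = Counter(grades)
    acceptable = counts["A"] + counts["B"]
    return 100.0 * acceptable / len(grades)

# Hypothetical grades for 10 screws
result = gr_accuracy(["A"] * 7 + ["B"] * 2 + ["C"])  # 90.0
```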
14
Liu Y, Lee MG, Kim JS. Spine Surgery Assisted by Augmented Reality: Where Have We Been? Yonsei Med J 2022; 63:305-316. [PMID: 35352881 PMCID: PMC8965436 DOI: 10.3349/ymj.2022.63.4.305] [Citation(s) in RCA: 9] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/19/2021] [Revised: 02/02/2022] [Accepted: 02/09/2022] [Indexed: 11/27/2022] Open
Abstract
This systematic review examines the spine surgery literature on augmented reality (AR) technology and summarizes its current status in spinal surgery. Database searches were performed in PubMed, Web of Science, Cochrane Library, and Embase, from the earliest records to April 1, 2021. Our review briefly examines the history of AR and enumerates different device application workflows in a variety of spinal surgeries. We also summarize the pros and cons of current mainstream AR devices and the latest updates. A total of 45 articles are included in our review. The most prevalent surgical applications included are the augmented reality surgical navigation system and the head-mounted display. The most popular application of AR is pedicle screw instrumentation in spine surgery, and the most commonly treated surgical levels are thoracic and lumbar. AR guidance systems show high potential value in practical clinical applications for the spine. The overall number of cases in AR-related studies is still small compared to traditional surgically assisted techniques, and these studies lack long-term clinical efficacy data and robust surgery-related statistics. Changing healthcare laws as well as the increasing prevalence of spinal surgery are generating critical data that will determine the value of AR technology.
Collapse
Affiliation(s)
- Yanting Liu
- Department of Neurosurgery, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
- Min-Gi Lee
- Department of Neurosurgery, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
- Jin-Sung Kim
- Department of Neurosurgery, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea.
15
Birlo M, Edwards PJE, Clarkson M, Stoyanov D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med Image Anal 2022; 77:102361. [PMID: 35168103 PMCID: PMC10466024 DOI: 10.1016/j.media.2022.102361] [Citation(s) in RCA: 19] [Impact Index Per Article: 9.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/23/2021] [Revised: 11/17/2021] [Accepted: 01/10/2022] [Indexed: 12/11/2022]
Abstract
This article presents a systematic review of optical see-through head mounted display (OST-HMD) usage in augmented reality (AR) surgery applications from 2013 to 2020. Articles were categorised by: OST-HMD device, surgical speciality, surgical application context, visualisation content, experimental design and evaluation, accuracy and human factors of human-computer interaction. 91 articles fulfilled all inclusion criteria. Some clear trends emerge. The Microsoft HoloLens increasingly dominates the field, with orthopaedic surgery being the most popular application (28.6%). By far the most common surgical context is surgical guidance (n=58) and segmented preoperative models dominate visualisation (n=40). Experiments mainly involve phantoms (n=43) or system setup (n=21), with patient case studies ranking third (n=19), reflecting the comparative infancy of the field. Experiments cover issues from registration to perception with very different accuracy results. Human factors emerge as significant to OST-HMD utility. Some factors are addressed by the systems proposed, such as attention shift away from the surgical site and mental mapping of 2D images to 3D patient anatomy. Other persistent human factors remain or are caused by OST-HMD solutions, including ease of use, comfort and spatial perception issues. The significant upward trend in published articles is clear, but such devices are not yet established in the operating room and clinical studies showing benefit are lacking. A focused effort addressing technical registration and perceptual factors in the lab coupled with design that incorporates human factors considerations to solve clear clinical problems should ensure that the significant current research efforts will succeed.
Collapse
Affiliation(s)
- Manuel Birlo
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK.
- P J Eddie Edwards
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
- Matthew Clarkson
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
- Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
16
Sumdani H, Aguilar-Salinas P, Avila MJ, Barber SR, Dumont TM. Utility of Augmented Reality and Virtual Reality in Spine Surgery: A Systematic Review of the Literature. World Neurosurg 2021; 161:e8-e17. [PMID: 34384919 DOI: 10.1016/j.wneu.2021.08.002] [Citation(s) in RCA: 18] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/25/2021] [Revised: 08/01/2021] [Accepted: 08/02/2021] [Indexed: 12/11/2022]
Abstract
BACKGROUND Augmented reality, virtual reality, and mixed reality (AR, VR, MR) are emerging technologies that are starting to be translated into clinical practice. There are limited data about the use of these tools in live spine surgery. The objective of this paper was to systematically collect, analyze, and interpret the existing data regarding AR, VR, and MR use in spine surgery on live persons. METHODS A systematic review was conducted using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. PubMed, PubMed Central, Cochrane Reviews, and Embase databases were searched. Combinations and variations of "augmented reality", "virtual reality", and "spine surgery" in both AND and OR configurations were used to gather relevant articles. References of included articles were also screened for possible inclusion as part of a manual review. Included studies were full-text publications written in English that involved any spine surgery on live persons with the use of virtual or augmented reality. RESULTS A total of 1566 unique articles were found, and fifteen full-text publications met criteria for this study. The total number of patients from all studies was 241, with a weighted average age of 50.37 years. Surgical procedures utilizing AR, VR, and/or MR were diverse and spanned from simple discectomies to intradural spinal tumor resection. All patients experienced improvement in their presenting symptoms. The highest complication rate mentioned in the articles was 6.1%, for suboptimal pedicle screw placement. There were no complications that led to clinical sequelae.
CONCLUSIONS The systematically collected, analyzed, and interpreted data from existing peer-reviewed full-text articles showed favorable metrics regarding surgical efficacy, pedicle screw target accuracy, radiation exposure, clinical outcome, and disability and pain in patients with spinal pathology treated with the help of AR, VR, and/or MR.
Collapse
Affiliation(s)
- Hasan Sumdani
- The University of Arizona College of Medicine, 1501 N Campbell Avenue, Room 4303, Tucson, Arizona, 85724-5070
- Pedro Aguilar-Salinas
- The University of Arizona College of Medicine, 1501 N Campbell Avenue, Room 4303, Tucson, Arizona, 85724-5070
- Mauricio J Avila
- The University of Arizona College of Medicine, 1501 N Campbell Avenue, Room 4303, Tucson, Arizona, 85724-5070
- Samuel R Barber
- The University of Arizona College of Medicine, Department of Otolaryngology, 1501 N Campbell Avenue, Tucson, Arizona, 85724-5070
- Travis M Dumont
- The University of Arizona College of Medicine, 1501 N Campbell Avenue, Room 4303, Tucson, Arizona, 85724-5070.
17
Liu PR, Lu L, Zhang JY, Huo TT, Liu SX, Ye ZW. Application of Artificial Intelligence in Medicine: An Overview. Curr Med Sci 2021; 41:1105-1115. [PMID: 34874486 PMCID: PMC8648557 DOI: 10.1007/s11596-021-2474-3] [Citation(s) in RCA: 62] [Impact Index Per Article: 20.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/04/2020] [Accepted: 12/01/2020] [Indexed: 02/06/2023]
Abstract
Artificial intelligence (AI) is a new technical discipline that uses computer technology to research and develop the theory, methods, techniques, and application systems for simulating, extending, and expanding human intelligence. With the assistance of new AI technology, the traditional medical environment has changed substantially. For example, patient diagnosis based on radiological, pathological, endoscopic, ultrasonographic, and biochemical examinations has improved in accuracy while reducing human workload. Medical treatment during the perioperative period, including preoperative preparation, the surgical period, and the postoperative recovery period, has been significantly enhanced with better surgical outcomes. In addition, AI technology has played a crucial role in medical drug production, medical management, and medical education, taking them in a new direction. The purpose of this review is to introduce the application of AI in medicine and to provide an outlook on future trends.
Collapse
Affiliation(s)
- Peng-ran Liu
- Department of Orthopedics, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, 430022 China
- Lin Lu
- Department of Orthopedics, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, 430022 China
- Jia-yao Zhang
- Department of Orthopedics, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, 430022 China
- Tong-tong Huo
- Department of Orthopedics, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, 430022 China
- Song-xiang Liu
- Department of Orthopedics, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, 430022 China
- Zhe-wei Ye
- Department of Orthopedics, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, 430022 China
18
Andrews CM, Henry AB, Soriano IM, Southworth MK, Silva JR. Registration Techniques for Clinical Applications of Three-Dimensional Augmented Reality Devices. IEEE JOURNAL OF TRANSLATIONAL ENGINEERING IN HEALTH AND MEDICINE 2020; 9:4900214. [PMID: 33489483 PMCID: PMC7819530 DOI: 10.1109/jtehm.2020.3045642] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/28/2020] [Revised: 11/13/2020] [Accepted: 12/03/2020] [Indexed: 12/15/2022]
Abstract
Many clinical procedures would benefit from direct and intuitive real-time visualization of anatomy, surgical plans, or other information crucial to the procedure. Three-dimensional augmented reality (3D-AR) is an emerging technology that has the potential to assist physicians with spatial reasoning during clinical interventions. The most intriguing applications of 3D-AR involve visualizations of anatomy or surgical plans that appear directly on the patient. However, commercially available 3D-AR devices have spatial localization errors that are too large for many clinical procedures. For this reason, a variety of approaches for improving 3D-AR registration accuracy have been explored. The focus of this review is on the methods, accuracy, and clinical applications of registering 3D-AR devices with the clinical environment. The works cited represent a variety of approaches for registering holograms to patients, including manual registration, computer vision-based registration, and registrations that incorporate external tracking systems. Evaluations of user accuracy when performing clinically relevant tasks suggest that accuracies of approximately 2 mm are feasible. 3D-AR device limitations due to the vergence-accommodation conflict or other factors attributable to the headset hardware add on the order of 1.5 mm of error compared to conventional guidance. Continued improvements to 3D-AR hardware will decrease these sources of error.
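The registration approaches surveyed above (aligning holograms to patient fiducials) typically reduce to a least-squares rigid alignment of paired points, evaluated via the fiducial registration error (FRE). A minimal sketch of the standard Kabsch/Umeyama solution (an illustrative implementation, not any specific system's code):

```python
import numpy as np

def rigid_register(source, target):
    """Least-squares rigid registration (Kabsch/Umeyama, no scaling).

    Finds rotation R and translation t minimizing the sum of squared
    distances ||R @ s_i + t - t_i||^2 over paired fiducial points
    (rows of `source` and `target` are corresponding 3D points).
    """
    src = np.asarray(source, float)
    tgt = np.asarray(target, float)
    src_c, tgt_c = src.mean(axis=0), tgt.mean(axis=0)
    H = (src - src_c).T @ (tgt - tgt_c)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

def fre(source, target, R, t):
    """Fiducial registration error: RMS residual after alignment."""
    src = np.asarray(source, float)
    tgt = np.asarray(target, float)
    res = (R @ src.T).T + t - tgt
    return np.sqrt((res ** 2).sum(axis=1).mean())
```

Note that FRE measures residual misfit at the fiducials themselves; the clinically relevant target registration error at the surgical site is generally different.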
Affiliation(s)
- Christopher M. Andrews
- Department of Biomedical Engineering, Washington University in St. Louis, McKelvey School of Engineering, St. Louis, MO 63130, USA
- SentiAR, Inc., St. Louis, MO 63108, USA
- Jonathan R. Silva
- Department of Biomedical Engineering, Washington University in St. Louis, McKelvey School of Engineering, St. Louis, MO 63130, USA
19
Yan C, Wu T, Huang K, He J, Liu H, Hong Y, Wang B. The Application of Virtual Reality in Cervical Spinal Surgery: A Review. World Neurosurg 2020; 145:108-113. [PMID: 32931993 DOI: 10.1016/j.wneu.2020.09.040] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/20/2020] [Revised: 09/08/2020] [Accepted: 09/09/2020] [Indexed: 02/05/2023]
Abstract
In recent years, clinicians have used virtual reality (VR) to simulate real-world environments for medical purposes. The use of VR systems in the field of cervical spine surgery can support effective surgical training programs without causing harm to patients. Moreover, both imaging and VR can be used before surgery to assist preoperative surgical planning. VR devices have a variety of built-in motion sensors, so kinematic data can be recorded while users wear the devices and perform actions, enabling evaluation of cervical spine mobility and exercise capacity. Therapists have also applied VR to cervical spine rehabilitation with good results. At present, the application of VR systems in cervical spine surgery has great potential, but current research is insufficient. Here, we review the latest advancements in VR technology used in cervical spine surgery and discuss potential directions for future work.
Collapse
Affiliation(s)
- Chunyi Yan
- Department of Orthopedics, West China Hospital, Sichuan University, Chengdu, Sichuan, China
- Tingkui Wu
- Department of Orthopedics, West China Hospital, Sichuan University, Chengdu, Sichuan, China
- Kangkang Huang
- Department of Orthopedics, West China Hospital, Sichuan University, Chengdu, Sichuan, China
- Junbo He
- Department of Orthopedics, West China Hospital, Sichuan University, Chengdu, Sichuan, China
- Hao Liu
- Department of Orthopedics, West China Hospital, Sichuan University, Chengdu, Sichuan, China
- Ying Hong
- Department of Anesthesia and Operation Center, West China Hospital, Sichuan University, Chengdu, Sichuan, China; West China School of Nursing, Sichuan University, Chengdu, Sichuan, China
- Beiyu Wang
- Department of Orthopedics, West China Hospital, Sichuan University, Chengdu, Sichuan, China.