1. Bui T, Ruiz-Cardozo MA, Dave HS, Barot K, Kann MR, Joseph K, Lopez-Alviar S, Trevino G, Brehm S, Yahanda AT, Molina CA. Virtual, Augmented, and Mixed Reality Applications for Surgical Rehearsal, Operative Execution, and Patient Education in Spine Surgery: A Scoping Review. Medicina (Kaunas) 2024; 60:332. [PMID: 38399619] [PMCID: PMC10890632] [DOI: 10.3390/medicina60020332]
Abstract
Background and Objectives: Advances in virtual reality (VR), augmented reality (AR), and mixed reality (MR) technologies have resulted in their increased application across many medical specialties. VR's main application has been for teaching and preparatory roles, while AR has been mostly used as a surgical adjunct. The objective of this study is to discuss the various applications and prospects for VR, AR, and MR specifically as they relate to spine surgery. Materials and Methods: A systematic review was conducted to examine the current applications of VR, AR, and MR with a focus on spine surgery. A literature search of two electronic databases (PubMed and Scopus) was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). The study quality was assessed using the MERSQI score for educational research studies, QUACS for cadaveric studies, and the JBI critical appraisal tools for clinical studies. Results: A total of 228 articles were identified in the primary literature review. Following title/abstract screening and full-text review, 46 articles were included in the review. These articles comprised nine studies performed in artificial models, nine cadaveric studies, four clinical case studies, nineteen clinical case series, one clinical case-control study, and four clinical parallel control studies. Teaching applications utilizing holographic overlays are the most intensively studied aspect of AR/VR; the most simulated surgical procedure is pedicle screw placement. Conclusions: VR provides a reproducible and robust medium for surgical training through surgical simulations and for patient education through various platforms. Existing AR/MR platforms enhance the accuracy and precision of spine surgeries and show promise as a surgical adjunct.
Affiliation(s)
- Tim Bui: Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Miguel A. Ruiz-Cardozo: Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Harsh S. Dave: Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Karma Barot: Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Michael Ryan Kann: Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA; University of Pittsburgh School of Medicine, Pittsburgh, PA 15261, USA
- Karan Joseph: Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Sofia Lopez-Alviar: Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Gabriel Trevino: Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Samuel Brehm: Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Alexander T. Yahanda: Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
- Camilo A. Molina: Department of Neurological Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA
2. Liebmann F, von Atzigen M, Stütz D, Wolf J, Zingg L, Suter D, Cavalcanti NA, Leoty L, Esfandiari H, Snedeker JG, Oswald MR, Pollefeys M, Farshad M, Fürnstahl P. Automatic registration with continuous pose updates for marker-less surgical navigation in spine surgery. Med Image Anal 2024; 91:103027. [PMID: 37992494] [DOI: 10.1016/j.media.2023.103027]
Abstract
Established surgical navigation systems for pedicle screw placement have been proven to be accurate, but still reveal limitations in registration or surgical guidance. Registration of preoperative data to the intraoperative anatomy remains a time-consuming, error-prone task that includes exposure to harmful radiation. Surgical guidance through conventional displays has well-known drawbacks, as information cannot be presented in-situ and from the surgeon's perspective. Consequently, radiation-free and more automatic registration methods with subsequent surgeon-centric navigation feedback are desirable. In this work, we present a marker-less approach that automatically solves the registration problem for lumbar spinal fusion surgery in a radiation-free manner. A deep neural network was trained to segment the lumbar spine and simultaneously predict its orientation, yielding an initial pose for preoperative models, which is then refined for each vertebra individually and updated in real time with GPU acceleration while handling surgeon occlusions. Intuitive surgical guidance is provided through integration into an augmented reality based navigation system. The registration method was verified on a public dataset with a median of 100% successful registrations, a median target registration error of 2.7 mm, a median screw trajectory error of 1.6°, and a median screw entry point error of 2.3 mm. Additionally, the whole pipeline was validated in an ex-vivo surgery, yielding 100% screw accuracy and a median target registration error of 1.0 mm. Our results meet clinical demands and emphasize the potential of RGB-D data for fully automatic registration approaches in combination with augmented reality guidance.
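The accuracy figures quoted above (target registration error in mm, screw trajectory error in degrees) follow from comparing planned and executed screw geometry. A minimal sketch of the standard definitions, not the authors' implementation (function names are illustrative):

```python
import numpy as np

def target_registration_error(p_planned, p_actual):
    # Euclidean distance (mm) between a planned landmark and its registered position.
    return float(np.linalg.norm(np.asarray(p_planned, float) - np.asarray(p_actual, float)))

def trajectory_angle_error(d_planned, d_actual):
    # Angle (degrees) between planned and executed screw trajectory directions.
    a = np.asarray(d_planned, float)
    b = np.asarray(d_actual, float)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))
```

The clip guards against floating-point values marginally outside [-1, 1] before the arccos.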
Affiliation(s)
- Florentin Liebmann: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland
- Marco von Atzigen: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland
- Dominik Stütz: Computer Vision and Geometry Group, ETH Zurich, Zurich, Switzerland
- Julian Wolf: Product Development Group, ETH Zurich, Zurich, Switzerland
- Lukas Zingg: Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Daniel Suter: Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Nicola A. Cavalcanti: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Laura Leoty: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Hooman Esfandiari: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Jess G. Snedeker: Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland
- Martin R. Oswald: Computer Vision and Geometry Group, ETH Zurich, Zurich, Switzerland; Computer Vision Lab, University of Amsterdam, Amsterdam, Netherlands
- Marc Pollefeys: Computer Vision and Geometry Group, ETH Zurich, Zurich, Switzerland; Microsoft Mixed Reality and AI Zurich Lab, Zurich, Switzerland
- Mazda Farshad: Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Philipp Fürnstahl: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
3. He F, Qi X, Feng Q, Zhang Q, Pan N, Yang C, Liu S. Research on augmented reality navigation of in vitro fenestration of stent-graft based on deep learning and virtual-real registration. Comput Assist Surg (Abingdon) 2023; 28:2289339. [PMID: 38059572] [DOI: 10.1080/24699322.2023.2289339]
Abstract
OBJECTIVES In vitro fenestration of stent-graft (IVFS) demands high-precision navigation methods to achieve optimal surgical outcomes. This study aims to propose an augmented reality (AR) navigation method for IVFS, which can provide an in situ overlay display to locate fenestration positions. METHODS We propose an AR navigation method to assist doctors in performing IVFS. A deep learning-based aorta segmentation algorithm is used to achieve automatic and rapid aorta segmentation. Vuforia-based virtual-real registration and a marker recognition algorithm are integrated to ensure an accurate in situ AR image. RESULTS The proposed method can provide a three-dimensional in situ AR image, and the fiducial registration error after virtual-real registration is 2.070 mm. The aorta segmentation experiment obtains a Dice similarity coefficient of 91.12% and a Hausdorff distance of 2.59, better than the conventional algorithms before improvement. CONCLUSIONS The proposed method can intuitively and accurately locate fenestration positions and can therefore assist doctors in performing IVFS.
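The two segmentation metrics reported here, Dice similarity coefficient and Hausdorff distance, have standard definitions; a minimal sketch for binary masks and point sets (not the authors' implementation):

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    # Dice = 2|A ∩ B| / (|A| + |B|) for two binary masks of equal shape.
    a = np.asarray(mask_a, bool)
    b = np.asarray(mask_b, bool)
    total = a.sum() + b.sum()
    return 1.0 if total == 0 else 2.0 * np.logical_and(a, b).sum() / total

def hausdorff_distance(pts_a, pts_b):
    # Symmetric Hausdorff distance between two point sets (n_a x d and n_b x d):
    # the largest distance from any point in one set to its nearest point in the other.
    a = np.asarray(pts_a, float)
    b = np.asarray(pts_b, float)
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # pairwise distances
    return float(max(d.min(axis=1).max(), d.min(axis=0).max()))
```

The brute-force pairwise matrix is fine for contours of a few thousand points; large volumes typically use a KD-tree instead.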
Affiliation(s)
- Fengfeng He: Institute of Biomedical Engineering, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Xiaoyu Qi: Department of Vascular Surgery, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Qingmin Feng: Institute of Biomedical Engineering, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Qiang Zhang: Institute of Biomedical Engineering, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Ning Pan: School of Biomedical Engineering, South-Central Minzu University, Wuhan, China
- Chao Yang: Department of Vascular Surgery, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Shenglin Liu: Institute of Biomedical Engineering, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
4. Jiang B, Wang L, Xu K, Hossbach M, Demir A, Rajan P, Taylor RH, Moghekar A, Foroughi P, Kazanzides P, Boctor EM. Wearable Mechatronic Ultrasound-integrated AR Navigation System for Lumbar Puncture Guidance. IEEE Trans Med Robot Bionics 2023; 5:966-977. [PMID: 38779126] [PMCID: PMC11107797] [DOI: 10.1109/tmrb.2023.3319963]
Abstract
As one of the most commonly performed spinal interventions in routine clinical practice, lumbar punctures are usually done with only hand palpation and trial and error. Failures can prolong procedure time and introduce complications such as cerebrospinal fluid leaks and headaches. Therefore, an effective needle insertion guidance method is desired. In this work, we present a complete lumbar puncture guidance system integrating (1) a wearable mechatronic ultrasound imaging device, (2) volume-reconstruction and bone surface estimation algorithms, and (3) two alternative augmented reality user interfaces for needle guidance: a HoloLens-based and a tablet-based solution. We conducted a quantitative evaluation of the end-to-end navigation accuracy, which shows that our system can achieve an overall needle navigation accuracy of 2.83 mm and 2.76 mm for the tablet-based and HoloLens-based solutions, respectively. In addition, we conducted a preliminary user study to qualitatively evaluate the effectiveness and ergonomics of our system on lumbar phantoms. The results show that users were able to successfully reach the target in an average of 1.12 and 1.14 needle insertion attempts for the tablet-based and HoloLens-based systems, respectively, exhibiting the potential to reduce the failure rate of lumbar puncture procedures.
Affiliation(s)
- Baichuan Jiang: Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Liam Wang: Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Keshuai Xu: Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Alican Demir: Clear Guide Medical Inc., Baltimore, MD 21211, USA
- Russell H. Taylor: Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Abhay Moghekar: Department of Neurology, Johns Hopkins Medical Institute, Baltimore, MD 21205, USA
- Peter Kazanzides: Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Emad M. Boctor: Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
5. Cao B, Yuan B, Xu G, Zhao Y, Sun Y, Wang Z, Zhou S, Xu Z, Wang Y, Chen X. A Pilot Human Cadaveric Study on Accuracy of the Augmented Reality Surgical Navigation System for Thoracolumbar Pedicle Screw Insertion Using a New Intraoperative Rapid Registration Method. J Digit Imaging 2023; 36:1919-1929. [PMID: 37131064] [PMCID: PMC10406793] [DOI: 10.1007/s10278-023-00840-x]
Abstract
To evaluate the feasibility and accuracy of AR-assisted pedicle screw placement using a new intraoperative rapid registration method combining preoperative CT scanning and intraoperative C-arm 2D fluoroscopy in cadavers. Five cadavers with intact thoracolumbar spines were employed in this study. Intraoperative registration was performed using anteroposterior and lateral views of preoperative CT scanning and intraoperative 2D fluoroscopic images. Patient-specific targeting guides were used for pedicle screw placement from Th1-L5, totaling 166 screws. Instrumentation for each side was randomized (augmented reality surgical navigation (ARSN) vs. C-arm) with an equal distribution of 83 screws in each group. CT was performed to evaluate the accuracy of both techniques by assessing the screw positions and the deviations between the inserted screws and planned trajectories. Postoperative CT showed that 98.80% (82/83) of screws in the ARSN group and 72.29% (60/83) of screws in the C-arm group were within the 2-mm safe zone (p < 0.001). The mean time for instrumentation per level in the ARSN group was significantly shorter than that in the C-arm group (56.17 ± 3.33 s vs. 99.22 ± 9.03 s, p < 0.001). The overall intraoperative registration time was 17.2 ± 3.5 s per segment. AR-based navigation technology can provide surgeons with accurate guidance for pedicle screw insertion and save operation time by using this intraoperative rapid registration method combining preoperative CT scanning and intraoperative C-arm 2D fluoroscopy.
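The "2-mm safe zone" criterion above corresponds to the clinically acceptable grades of the widely used Gertzbein-Robbins classification of pedicle screw cortical breach. A small illustrative sketch; the 2-mm grade bands are the conventional thresholds, assumed here rather than taken from this paper:

```python
def gertzbein_robbins_grade(breach_mm):
    # Conventional Gertzbein-Robbins bands: grade A (screw fully within the
    # pedicle), then successive 2-mm steps of cortical breach.
    if breach_mm <= 0:
        return "A"
    if breach_mm < 2:
        return "B"
    if breach_mm < 4:
        return "C"
    if breach_mm < 6:
        return "D"
    return "E"

def within_2mm_safe_zone(breach_mm):
    # Grades A and B (breach < 2 mm) are typically counted as clinically acceptable.
    return gertzbein_robbins_grade(breach_mm) in ("A", "B")
```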
Affiliation(s)
- Bing Cao: Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Bo Yuan: Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Guofeng Xu: Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Yin Zhao: Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Yanqing Sun: Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Zhiwei Wang: Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Shengyuan Zhou: Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Zheng Xu: Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Yao Wang: Linyan Medical Technology Company Limited, 528 Ruiqing Road, Pudong New District, Shanghai, China
- Xiongsheng Chen: Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
6. Taghian A, Abo-Zahhad M, Sayed MS, Abd El-Malek AH. Virtual and augmented reality in biomedical engineering. Biomed Eng Online 2023; 22:76. [PMID: 37525193] [PMCID: PMC10391968] [DOI: 10.1186/s12938-023-01138-3]
Abstract
BACKGROUND In the future, extended reality technology will be widely used. People will be led to utilize virtual reality (VR) and augmented reality (AR) technologies in their daily lives, hobbies, numerous types of entertainment, and employment. Medical augmented reality has evolved with applications ranging from medical education to image-guided surgery. Moreover, the bulk of research is focused on clinical applications, with the majority devoted to surgery or intervention, followed by rehabilitation and treatment applications. Numerous studies have also looked into the use of augmented reality in medical education and training. METHODS Using the databases Semantic Scholar, Web of Science, Scopus, IEEE Xplore, and ScienceDirect, a scoping review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) criteria. To find other articles, a manual search was also carried out in Google Scholar. This study presents studies carried out over the previous 14 years (from 2009 to 2023) in detail. We classify this area of study into the following categories: (1) AR and VR in surgery (subsection A: MR in neurosurgery; subsection B: spine surgery; subsection C: oral and maxillofacial surgery; subsection D: AR-enhanced human-robot interaction); (2) AR and VR in medical education (subsection A: medical training; subsection B: schools and curriculum; subsection C: XR in biomedicine); (3) AR and VR for rehabilitation (subsection A: stroke rehabilitation during COVID-19; subsection B: cancer and VR); and (4) millimeter-wave and MIMO systems for AR and VR. RESULTS In total, 77 publications were selected based on the inclusion criteria. Four distinct AR and/or VR application groups could be differentiated: AR and VR in surgery (N = 21), VR and AR in medical education (N = 30), AR and VR for rehabilitation (N = 15), and millimeter-wave and MIMO systems for AR and VR (N = 7), where N is the number of cited studies. We found that the majority of research is devoted to medical training and education, with surgical or interventional applications coming in second. The research is mostly focused on rehabilitation, therapy, and clinical applications. Moreover, the application of XR in MIMO systems has been the subject of numerous studies. CONCLUSION Examples of these diverse fields of application are presented in this review as follows: (1) augmented reality and virtual reality in surgery; (2) augmented reality and virtual reality in medical education; (3) augmented reality and virtual reality for rehabilitation; and (4) millimeter-wave and MIMO systems for augmented reality and virtual reality.
Affiliation(s)
- Aya Taghian: Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt
- Mohammed Abo-Zahhad: Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt; Department of Electrical Engineering, Assiut University, Assiut, Egypt
- Mohammed S. Sayed: Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt; Department of Electronics and Communications Engineering, Zagazig University, Zagazig, Ash Sharqia, Egypt
- Ahmed H. Abd El-Malek: Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt
7. Seetohul J, Shafiee M, Sirlantzis K. Augmented Reality (AR) for Surgical Robotic and Autonomous Systems: State of the Art, Challenges, and Solutions. Sensors (Basel) 2023; 23:6202. [PMID: 37448050] [DOI: 10.3390/s23136202]
Abstract
Despite the substantial progress achieved in the development and integration of augmented reality (AR) in surgical robotic and autonomous systems (RAS), the center of focus in most devices remains on improving end-effector dexterity and precision, as well as on improving access to minimally invasive surgeries. This paper aims to provide a systematic review of different types of state-of-the-art surgical robotic platforms while identifying areas for technological improvement. We associate specific control features, such as haptic feedback, sensory stimuli, and human-robot collaboration, with AR technology to perform complex surgical interventions for increased user perception of the augmented world. Researchers in the field have long faced innumerable issues with low accuracy in tool placement around complex trajectories, pose estimation, and difficulty in depth perception during two-dimensional medical imaging. A number of robots described in this review, such as Novarad and SpineAssist, are analyzed in terms of their hardware features, computer vision systems (such as deep learning algorithms), and the clinical relevance of the literature. We attempt to outline the shortcomings of current optimization algorithms for surgical robots (such as YOLO and LSTM) whilst providing mitigating solutions for internal tool-to-organ collision detection and image reconstruction. The accuracy of results in robot end-effector collisions and reduced occlusion remains promising within the scope of our research, validating the case made for the surgical clearance of ever-expanding AR technology in the future.
Affiliation(s)
- Jenna Seetohul: Mechanical Engineering Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
- Mahmood Shafiee: Mechanical Engineering Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK; School of Mechanical Engineering Sciences, University of Surrey, Guildford GU2 7XH, UK
- Konstantinos Sirlantzis: School of Engineering, Technology and Design, Canterbury Christ Church University, Canterbury CT1 1QU, UK; Intelligent Interactions Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
8. McCloskey K, Turlip R, Ahmad HS, Ghenbot YG, Chauhan D, Yoon JW. Virtual and Augmented Reality in Spine Surgery: A Systematic Review. World Neurosurg 2023; 173:96-107. [PMID: 36812986] [DOI: 10.1016/j.wneu.2023.02.068]
Abstract
BACKGROUND Augmented reality (AR) and virtual reality (VR) implementation in spinal surgery has expanded rapidly over the past decade. This systematic review summarizes the use of AR/VR technology in surgical education, preoperative planning, and intraoperative guidance. METHODS A search query for AR/VR technology in spine surgery was conducted through PubMed, Embase, and Scopus. After exclusions, 48 studies were included and grouped into relevant subsections: 12 on surgical training, 5 on preoperative planning, 24 on intraoperative usage, and 10 on radiation exposure. RESULTS VR-assisted training significantly reduced penetration rates or increased accuracy rates compared to lecture-based groups in 5 studies. Preoperative VR planning significantly influenced surgical recommendations and reduced radiation exposure, operating time, and estimated blood loss. In 3 patient studies, AR-assisted pedicle screw placement accuracy ranged from 95.77% to 100% on the Gertzbein grading scale. The head-mounted display was the most common interface used intraoperatively, followed by the AR microscope and projector. AR/VR also had applications in tumor resection, vertebroplasty, bone biopsy, and rod bending. Four studies reported significantly reduced radiation exposure in the AR group compared to the fluoroscopy group. CONCLUSIONS AR/VR technologies have the potential to usher in a paradigm shift in spine surgery. However, the current evidence indicates there is still a need for (1) defined quality and technical requirements for AR/VR devices, (2) more intraoperative studies that explore usage outside of pedicle screw placement, and (3) technological advancements to overcome registration errors via the development of an automatic registration method.
Affiliation(s)
- Kyle McCloskey: Department of Neurosurgery, Drexel University College of Medicine, Philadelphia, Pennsylvania, USA
- Ryan Turlip: Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Hasan S. Ahmad: Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Yohannes G. Ghenbot: Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Daksh Chauhan: Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Jang W. Yoon: Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA
9. Matinfar S, Salehi M, Suter D, Seibold M, Dehghani S, Navab N, Wanivenhaus F, Fürnstahl P, Farshad M, Navab N. Sonification as a reliable alternative to conventional visual surgical navigation. Sci Rep 2023; 13:5930. [PMID: 37045878] [PMCID: PMC10097653] [DOI: 10.1038/s41598-023-32778-z]
Abstract
Despite the undeniable advantages of image-guided surgical assistance systems in terms of accuracy, such systems have not yet fully met surgeons' needs or expectations regarding usability, time efficiency, and integration into the surgical workflow. On the other hand, perceptual studies have shown that presenting independent but causally correlated information via multimodal feedback involving different sensory modalities can improve task performance. This article investigates an alternative method for computer-assisted surgical navigation, introduces a novel four-DOF sonification methodology for navigated pedicle screw placement, and discusses advanced solutions based on multisensory feedback. The proposed method comprises a sonification solution for alignment tasks in four degrees of freedom, based on frequency modulation synthesis. We compared the resulting accuracy and execution time of the proposed sonification method with visual navigation, which is currently considered the state of the art. We conducted a phantom study in which 17 surgeons executed the pedicle screw placement task in the lumbar spine, guided by either the proposed sonification-based method or the traditional visual navigation method. The results demonstrated that the proposed method is as accurate as the state of the art while decreasing the surgeon's need to focus on visual navigation displays, allowing the natural focus to remain on surgical tools and targeted anatomy during task execution.
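Frequency modulation synthesis, the basis of the sonification described above, generates a tone of the form sin(2πf_c·t + I·sin(2πf_m·t)), where the modulation index I controls timbre. A hypothetical sketch of how one degree of freedom of alignment error might drive the modulation index; the mapping constants are illustrative, not the authors' parameters:

```python
import math

def fm_sample(t, carrier_hz, mod_hz, mod_index):
    # One sample of an FM tone: sin(2*pi*f_c*t + I*sin(2*pi*f_m*t)).
    return math.sin(2 * math.pi * carrier_hz * t
                    + mod_index * math.sin(2 * math.pi * mod_hz * t))

def map_error_to_mod_index(error_mm, max_error_mm=10.0, max_index=8.0):
    # Hypothetical mapping: larger alignment error -> larger modulation index
    # -> audibly rougher timbre; error is clamped to the mapped range.
    e = min(abs(error_mm), max_error_mm) / max_error_mm
    return max_index * e
```

A full four-DOF scheme would map each degree of freedom to an independent audio parameter (e.g. carrier pitch, modulator ratio, index, amplitude).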
Affiliation(s)
- Sasan Matinfar: Computer Aided Medical Procedures (CAMP), Technical University of Munich, 85748, Munich, Germany; Nuklearmedizin rechts der Isar, Technical University of Munich, 81675, Munich, Germany
- Mehrdad Salehi: Computer Aided Medical Procedures (CAMP), Technical University of Munich, 85748, Munich, Germany
- Daniel Suter: Department of Orthopaedics, Balgrist University Hospital, 8008, Zurich, Switzerland
- Matthias Seibold: Computer Aided Medical Procedures (CAMP), Technical University of Munich, 85748, Munich, Germany; Research in Orthopedic Computer Science (ROCS), Balgrist University Hospital, University of Zurich, Balgrist Campus, 8008, Zurich, Switzerland
- Shervin Dehghani: Computer Aided Medical Procedures (CAMP), Technical University of Munich, 85748, Munich, Germany; Nuklearmedizin rechts der Isar, Technical University of Munich, 81675, Munich, Germany
- Navid Navab: Topological Media Lab, Concordia University, Montreal, H3G 2W1, Canada
- Florian Wanivenhaus: Department of Orthopaedics, Balgrist University Hospital, 8008, Zurich, Switzerland
- Philipp Fürnstahl: Research in Orthopedic Computer Science (ROCS), Balgrist University Hospital, University of Zurich, Balgrist Campus, 8008, Zurich, Switzerland
- Mazda Farshad: Department of Orthopaedics, Balgrist University Hospital, 8008, Zurich, Switzerland
- Nassir Navab: Computer Aided Medical Procedures (CAMP), Technical University of Munich, 85748, Munich, Germany
10. Zhang J, Lu V, Khanduja V. The impact of extended reality on surgery: a scoping review. Int Orthop 2023; 47:611-621. [PMID: 36645474] [PMCID: PMC9841146] [DOI: 10.1007/s00264-022-05663-z]
Abstract
PURPOSE Extended reality (XR) is defined as a spectrum of technologies that range from purely virtual environments to enhanced real-world environments. In the past two decades, XR-assisted surgery has seen an increase in its use and also in research and development. This scoping review aims to map out the historical trends in these technologies and their future prospects, with an emphasis on the reported outcomes and ethical considerations on the use of these technologies. METHODS A systematic search of PubMed, Scopus, and Embase for literature related to XR-assisted surgery and telesurgery was performed using Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for scoping reviews (PRISMA-ScR) guidelines. Primary studies, peer-reviewed articles that described procedures performed by surgeons on human subjects and cadavers, as well as studies describing general surgical education, were included. Non-surgical procedures, bedside procedures, veterinary procedures, procedures performed by medical students, and review articles were excluded. Studies were classified into the following categories: impact on surgery (pre-operative planning and intra-operative navigation/guidance), impact on the patient (pain and anxiety), and impact on the surgeon (surgical training and surgeon confidence). RESULTS One hundred and sixty-eight studies were included for analysis. Thirty-one studies investigating the use of XR for pre-operative planning concluded that virtual reality (VR) enhanced the surgeon's spatial awareness of important anatomical landmarks, leading to shorter operating sessions and decreased surgical insult. Forty-nine studies explored the use of XR for intra-operative guidance. They noted that augmented reality (AR) headsets highlight key landmarks, as well as important structures to avoid, which lowers the chance of accidental surgical trauma.
Eleven studies investigated patients' pain and noted that VR is able to generate a meditative state. This is beneficial for patients, as it reduces the need for analgesics. Ten studies commented on patient anxiety, suggesting that VR is unsuccessful at altering patients' physiological parameters such as mean arterial blood pressure or cortisol levels. Sixty studies investigated surgical training whilst seven studies suggested that the use of XR-assisted technology increased surgeon confidence. CONCLUSION The growth of XR-assisted surgery is driven by advances in hardware and software. Whilst augmented virtuality and mixed reality are underexplored, the use of VR is growing especially in the fields of surgical training and pre-operative planning. Real-time intra-operative guidance is key for surgical precision, which is being supplemented with AR technology. XR-assisted surgery is likely to undertake a greater role in the near future, given the effect of COVID-19 limiting physical presence and the increasing complexity of surgical procedures.
Affiliation(s)
- James Zhang: School of Clinical Medicine, University of Cambridge, Cambridge CB2 0SP, UK
- Victor Lu: School of Clinical Medicine, University of Cambridge, Cambridge CB2 0SP, UK
- Vikas Khanduja: Young Adult Hip Service, Department of Trauma and Orthopaedics, Addenbrooke's Hospital, Cambridge University Hospital, Hills Road, Cambridge CB2 0QQ, UK
11
Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023; 68. [PMID: 36580681 DOI: 10.1088/1361-6560/acaf23] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/12/2022] [Accepted: 12/29/2022] [Indexed: 12/31/2022]
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these AR systems in different surgical fields. AR visualization is divided into two categories, in situ visualization and non-in situ visualization, and a variety of contents can be rendered. The registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. The tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in surgical fields. However, most AR applications have been evaluated through model and animal experiments, with relatively few clinical experiments, indicating that current AR navigation methods are still at an early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as future development trends. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon demonstrate its clinical utility.
Affiliation(s)
- Longfei Ma, Tianqi Huang, Jie Wang, Hongen Liao: Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing 100084, People's Republic of China
12
A novel motionless calibration method for augmented reality surgery navigation system based on optical tracker. Heliyon 2022; 8:e12115. [PMID: 36590529 PMCID: PMC9801086 DOI: 10.1016/j.heliyon.2022.e12115] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/05/2022] [Revised: 09/25/2022] [Accepted: 11/28/2022] [Indexed: 12/13/2022] Open
Abstract
Augmented reality (AR) surgery navigation systems display the preoperatively planned virtual model at the correct position in the real surgical scene to assist the operation. Accurate calibration of the mapping between the virtual coordinate system and the real world is key to the virtual-real fusion effect. Previous calibration methods require the user to perform complex manual procedures before use. This paper introduces a novel motionless virtual-real calibration method, which only requires taking a mixed reality image containing both virtual and real marker balls with the built-in forward camera of the AR glasses. The mapping between the virtual and real spaces is calculated using the camera coordinate system as a transformation medium. The composition and working process of the AR navigation system are introduced, and the mathematical principle of the calibration is presented. The feasibility of the proposed calibration scheme is verified experimentally; its average registration accuracy is around 5.80 mm, on par with previously reported methods. The proposed method is convenient and rapid to implement, and its calibration accuracy does not depend on user experience. Further, it can potentially enable real-time updates of the registration transformation matrix, which can improve AR fusion accuracy when the AR glasses move. This motionless calibration method has great potential for future clinical navigation research.
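Using the camera frame as a transformation medium amounts to composing two rigid transforms: if the camera's pose is known both with respect to the real markers and with respect to the virtual markers, the virtual-to-real mapping falls out by matrix composition. A minimal sketch with homogeneous 4×4 matrices (function names and the exact frame conventions are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 rigid transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def calibrate_virtual_to_real(T_cam_from_real, T_cam_from_virtual):
    """Camera coordinate system as the shared medium:
    T_real_from_virtual = inv(T_cam_from_real) @ T_cam_from_virtual."""
    return np.linalg.inv(T_cam_from_real) @ T_cam_from_virtual
```

A virtual point p_v is then mapped into real-world coordinates as `(T_real_from_virtual @ [*p_v, 1.0])[:3]`; re-estimating the two camera poses each frame is what would enable the real-time registration update mentioned in the abstract.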
13
Augmented reality (AR) and fracture mapping model on middle-aged femoral neck fracture: A proof-of-concept towards interactive visualization. MEDICINE IN NOVEL TECHNOLOGY AND DEVICES 2022. [DOI: 10.1016/j.medntd.2022.100190] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022] Open
14
Gupta A, Ambade R. From Diagnosis to Therapy: The Role of Virtual and Augmented Reality in Orthopaedic Trauma Surgery. Cureus 2022; 14:e29099. [PMID: 36249662 PMCID: PMC9557249 DOI: 10.7759/cureus.29099] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/09/2022] [Accepted: 09/12/2022] [Indexed: 11/28/2022] Open
Abstract
By reducing procedure-related problems, advancements in computer-assisted surgery (CAS) and surgical training aim to boost operative precision and enhance patient safety. Orthopaedic training and practice have started to change as a result of the incorporation of reality technologies such as virtual reality (VR), augmented reality (AR), and mixed reality (MR) into CAS. Today's trainees can engage in realistic and highly involved operative simulations without supervision. With the coronavirus disease 2019 (COVID-19) pandemic, there is a greater need for breakthrough technology adoption. VR is an interactive technology that enables personalised care and could support successful patient-centered rehabilitation. It is a valid and trustworthy evaluation method for determining joint range of motion, function, and balance in physical rehabilitation, and it may make it possible to customise care, encourage patients, boost compliance, and track their progress. AR supplementation in orthopaedic surgery has shown promising results in pre-clinical settings, with improvements in surgical accuracy and reproducibility, decreased operating times, and less radiation exposure. Commercially available systems often also support home-based therapy; because little direct patient observation is needed, this may lessen the workload clinicians must bear. The objectives of this review are to evaluate the available technology, comprehend the evidence regarding its benefits, and consider implementation problems in clinical practice. The use of this technology, its practical and ethical ramifications, and how it will affect orthopaedic doctors and their patients are also covered. This review offers a current and thorough analysis of reality technologies and their uses in orthopaedic surgery.
15
Zhou S, Zhou F, Sun Y, Chen X, Diao Y, Zhao Y, Huang H, Fan X, Zhang G, Li X. The application of artificial intelligence in spine surgery. Front Surg 2022; 9:885599. [PMID: 36034349 PMCID: PMC9403075 DOI: 10.3389/fsurg.2022.885599] [Citation(s) in RCA: 12] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/28/2022] [Accepted: 07/25/2022] [Indexed: 11/13/2022] Open
Abstract
Due to its obvious advantages in processing big data and image information, the combination of artificial intelligence and medical care may profoundly change medical practice and promote the gradual transition from traditional clinical care to a precision medicine model. In this article, we reviewed the relevant literature and found that artificial intelligence is widely used in spine surgery. The application scenarios include the etiology, diagnosis, treatment, and postoperative prognosis of spinal diseases, as well as decision support systems. The shift toward artificial intelligence models in medicine is steadily improving doctors' diagnosis and treatment and advancing the development of orthopedics.
Affiliation(s)
- Shuai Zhou, Feifei Zhou (corresponding author), Yu Sun, Xin Chen, Yinze Diao, Yanbin Zhao, Haoge Huang, Xiao Fan, Gangqiang Zhang, Xinhang Li: Department of Orthopaedics, Peking University Third Hospital; Engineering Research Center of Bone and Joint Precision Medicine, Peking University Third Hospital; Beijing Key Laboratory of Spinal Disease Research, Beijing, China
16
Gueziri HE, Georgiopoulos M, Santaguida C, Collins DL. Ultrasound-based navigated pedicle screw insertion without intraoperative radiation: feasibility study on porcine cadavers. Spine J 2022; 22:1408-1417. [PMID: 35523390 DOI: 10.1016/j.spinee.2022.04.014] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 12/02/2021] [Revised: 04/15/2022] [Accepted: 04/26/2022] [Indexed: 02/03/2023]
Abstract
BACKGROUND Navigation systems for spinal fusion surgery rely on intraoperative computed tomography (CT) or fluoroscopy imaging. Both expose patient, surgeons and operating room staff to significant amounts of radiation. Alternative methods involving intraoperative ultrasound (iUS) imaging have recently shown promise for image-to-patient registration. Yet, the feasibility and safety of iUS navigation in spinal fusion have not been demonstrated. PURPOSE To evaluate the accuracy of pedicle screw insertion in lumbar and thoracolumbar spinal fusion using a fully automated iUS navigation system. STUDY DESIGN Prospective porcine cadaver study. METHODS Five porcine cadavers were used to instrument the lumbar and thoracolumbar spine using posterior open surgery. During the procedure, iUS images were acquired and used to establish automatic registration between the anatomy and preoperative CT images. Navigation was performed with the preoperative CT using tracked instruments. The accuracy of the system was measured as the distance of manually collected points to the preoperative CT vertebral surface and compared against fiducial-based registration. A postoperative CT was acquired, and screw placements were manually verified. We report breach rates, as well as axial and sagittal screw deviations. RESULTS A total of 56 screws were inserted (5.50 mm diameter n=50, and 6.50 mm diameter n=6). Fifty-two screws were inserted safely without breach. Four screws (7.14%) presented a medial breach with an average deviation of 1.35±0.37 mm (all <2 mm). Two breaches were caused by 6.50 mm diameter screws, and two by 5.50 mm screws. For vertebrae instrumented with 5.50 mm screws, the average axial diameter of the pedicle was 9.29 mm leaving a 1.89 mm margin in the left and right pedicle. For vertebrae instrumented with 6.50 mm screws, the average axial diameter of the pedicle was 8.99 mm leaving a 1.24 mm error margin in the left and right pedicle. 
The average distance to the vertebral surface was 0.96 mm using iUS registration and 0.97 mm using fiducial-based registration. CONCLUSIONS We successfully implanted all pedicle screws in the thoracolumbar spine using the ultrasound-based navigation system. All breaches recorded were minor (<2 mm) and the breach rate (7.14%) was comparable to existing literature. More investigation is needed to evaluate consistency, reproducibility, and performance in surgical context. CLINICAL SIGNIFICANCE Intraoperative US-based navigation is feasible and practical for pedicle screw insertion in a porcine model. It might be used as a low-cost and radiation-free alternative to intraoperative CT and fluoroscopy in the future.
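The error margins reported above follow from simple geometry: when a screw is centered in the pedicle, the radial safety margin on each side is half the difference between the pedicle and screw diameters. A quick check of the abstract's numbers (the helper name is ours):

```python
def pedicle_margin(pedicle_diameter_mm, screw_diameter_mm):
    """Radial safety margin on each side of a screw centered in the pedicle."""
    return (pedicle_diameter_mm - screw_diameter_mm) / 2.0

# Values from the abstract:
margin_small = pedicle_margin(9.29, 5.50)  # ~1.89 mm for 5.50 mm screws
margin_large = pedicle_margin(8.99, 6.50)  # ~1.24 mm for 6.50 mm screws
```

The tighter ~1.24 mm margin for the 6.50 mm screws is consistent with the observation that two of the four breaches came from only six screws of that diameter.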
Affiliation(s)
- Houssem-Eddine Gueziri: McConnell Brain Imaging Center, Montreal Neurological Institute and Hospital, McGill University, 3801 University St, Montreal, Quebec, Canada
- Miltiadis Georgiopoulos: Department of Neurology and Neurosurgery, Montreal Neurological Institute and Hospital, McGill University, Montreal, Quebec, Canada
- Carlo Santaguida: Department of Neurology and Neurosurgery, Montreal Neurological Institute and Hospital, McGill University, Montreal, Quebec, Canada
- D Louis Collins: McConnell Brain Imaging Center and Department of Neurology and Neurosurgery, Montreal Neurological Institute and Hospital, McGill University, Montreal, Quebec, Canada
17
Wu B, Liu P, Xiong C, Li C, Zhang F, Shen S, Shao P, Yao P, Niu C, Xu R. Stereotactic co-axial projection imaging for augmented reality neuronavigation: a proof-of-concept study. Quant Imaging Med Surg 2022; 12:3792-3802. [PMID: 35782260 PMCID: PMC9246757 DOI: 10.21037/qims-21-1144] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/26/2021] [Accepted: 04/27/2022] [Indexed: 11/07/2023]
Abstract
BACKGROUND Lack of intuitiveness and poor hand-eye coordination present a major technical challenge in neurosurgical navigation. METHODS We developed an integrated dexterous stereotactic co-axial projection imaging (sCPI) system featuring orthotopic image projection for augmented reality (AR) neurosurgical navigation. The performance characteristics of the sCPI system, including projection resolution and navigation accuracy, were quantitatively verified. The resolution of the sCPI was tested with a USAF 1951 resolution test chart. The stereotactic navigation accuracy of the sCPI was measured using a calibration panel with a 7×7 circle array pattern. In benchtop validation, the navigation accuracy of the sCPI and the BrainLab Kick Navigation Station was compared using a skull phantom with 8 intracranial targets. Finally, we demonstrated the potential clinical application of sCPI through a clinical trial. RESULTS The resolution test showed that the resolution of the sCPI was 1.3 mm. In the stereotactic navigation accuracy test, the maximum and minimum errors of the sCPI were 2.9 and 0.3 mm, and the mean error was 1.5 mm. This test also showed that the navigation error of the sCPI increases with the pitch and yaw angle, but with no obvious difference between yaw directions, indicating that the navigation error is unbiased across directions. The benchtop validation showed that the average navigation errors for the sCPI system and the Kick Navigation Station were 1.4±0.8 and 1.8±0.7 mm, the medians were 1.3 and 1.9 mm, and the average preparation times were 3 min 24 s and 6 min 8 s, respectively. The clinical feasibility of sCPI-assisted neurosurgical navigation was demonstrated in a clinical study. In comparison with the BrainLab device, the sCPI system required less time for preoperative preparation and enhanced the clinician experience in intraoperative visualization and navigation.
CONCLUSIONS The sCPI technique can be potentially used in many surgical applications for intuitive visualization of medical information and intraoperative guidance of surgical trajectories.
Affiliation(s)
- Bingxuan Wu, Chenmeng Li, Fan Zhang, Pengfei Shao, Peng Yao: Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China, Hefei, China
- Peng Liu, Shuwei Shen, Ronald Xu: Suzhou Institute for Advanced Research, University of Science and Technology of China, Suzhou, China
- Chi Xiong, Chaoshi Niu: Department of Neurosurgery, The First Affiliated Hospital of USTC, Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei, China
18
Ma L, Liang H, Han B, Yang S, Zhang X, Liao H. Augmented reality navigation with ultrasound-assisted point cloud registration for percutaneous ablation of liver tumors. Int J Comput Assist Radiol Surg 2022; 17:1543-1552. [PMID: 35704238 DOI: 10.1007/s11548-022-02671-7] [Citation(s) in RCA: 7] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/12/2022] [Accepted: 05/02/2022] [Indexed: 11/28/2022]
Abstract
PURPOSE We present a novel augmented reality (AR) surgical navigation method with ultrasound-assisted point cloud registration for percutaneous ablation of liver tumors. A preliminary study is carried out to verify its feasibility. METHODS Two three-dimensional (3D) point clouds of the liver surface are derived from the preoperative images and intraoperative tracked US images, respectively. To compensate for the soft tissue deformation, the point cloud registration between the preoperative images and the liver is performed using the non-rigid iterative closest point (ICP) algorithm. A 3D AR device based on integral videography technology is designed to accurately display naked-eye 3D images for surgical navigation. Based on the above registration, naked-eye 3D images of the liver surface, planning path, entry points, and tumor can be overlaid in situ through our 3D AR device. Finally, the AR-guided targeting accuracy is evaluated through entry point positioning. RESULTS Experiments on both the liver phantom and in vitro pork liver were conducted. Several entry points on the liver surface were used to evaluate the targeting accuracy. The preliminary validation on the liver phantom showed average entry-point errors (EPEs) of 2.34 ± 0.45 mm, 2.25 ± 0.72 mm, 2.71 ± 0.82 mm, and 2.50 ± 1.11 mm at distinct US point cloud coverage rates of 100%, 75%, 50%, and 25%, respectively. The average EPEs of the deformed pork liver were 4.49 ± 1.88 mm and 5.02 ± 2.03 mm at the coverage rates of 100% and 75%, and the average covered-entry-point errors (CEPEs) were 4.96 ± 2.05 mm and 2.97 ± 1.37 mm at 50% and 25%, respectively. CONCLUSION Experimental outcomes demonstrate that the proposed AR navigation method based on US-assisted point cloud registration has achieved an acceptable targeting accuracy on the liver surface even in the case of liver deformation.
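The paper uses non-rigid ICP to compensate for soft tissue deformation; the core of every ICP iteration, however, is the closed-form least-squares rigid alignment of two corresponded point sets (the Kabsch/SVD solution). A minimal sketch of that rigid core only (this is not the authors' non-rigid algorithm):

```python
import numpy as np

def kabsch(src, dst):
    """Least-squares rigid transform (R, t) aligning src onto dst, given
    known point correspondences (the inner step of an ICP iteration).
    src, dst: (N, 3) arrays of matched points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

In a full ICP loop this solve alternates with nearest-neighbor matching between the intraoperative ultrasound-derived surface points and the preoperative model; the non-rigid variant additionally relaxes the single rigid transform into a deformation field.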
Affiliation(s)
- Longfei Ma, Hanying Liang, Boxuan Han, Xinran Zhang, Hongen Liao: Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing 100084, China
- Shizhong Yang: Hepato-Pancreato-Biliary Center, Beijing Tsinghua Changgung Hospital, School of Clinical Medicine, Tsinghua University, Beijing 102218, China
19
Liu Y, Lee MG, Kim JS. Spine Surgery Assisted by Augmented Reality: Where Have We Been? Yonsei Med J 2022; 63:305-316. [PMID: 35352881 PMCID: PMC8965436 DOI: 10.3349/ymj.2022.63.4.305] [Citation(s) in RCA: 9] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/19/2021] [Revised: 02/02/2022] [Accepted: 02/09/2022] [Indexed: 11/27/2022] Open
Abstract
This systematic review examines the spine surgery literature on augmented reality (AR) technology and summarizes its current status in spinal surgery. PubMed, Web of Science, Cochrane Library, and Embase were searched from the earliest records to April 1, 2021. Our review briefly examines the history of AR and enumerates different device application workflows in a variety of spinal surgeries. We also weigh the pros and cons of current mainstream AR devices and the latest updates. A total of 45 articles are included in our review. The most prevalent applications are augmented reality surgical navigation systems and head-mounted displays. The most popular application of AR is pedicle screw instrumentation in spine surgery, and the spinal levels most often addressed are thoracic and lumbar. AR guidance systems show high potential value in practical clinical applications for the spine. The overall number of cases in AR-related studies is still small compared to traditional surgically assisted techniques, and long-term clinical efficacy data and robust surgery-related statistics are still lacking. Changing healthcare laws as well as the increasing prevalence of spinal surgery are generating critical data that will determine the value of AR technology.
Affiliation(s)
- Yanting Liu, Min-Gi Lee, Jin-Sung Kim: Department of Neurosurgery, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
20
Patel MR, Jacob KC, Parsons AW, Chavez FA, Ribot MA, Munim MA, Vanjani NN, Pawlowski H, Prabhu MC, Singh K. Systematic Review: Applications of Intraoperative Ultrasound in Spinal Surgery. World Neurosurg 2022; 164:e45-e58. [PMID: 35259500 DOI: 10.1016/j.wneu.2022.02.130] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/05/2022] [Accepted: 02/28/2022] [Indexed: 10/18/2022]
Abstract
BACKGROUND Owing to increased practicality and decreased costs and radiation, interest in intraoperative ultrasound (iUS) for spinal surgery applications has risen; however, few studies have provided a robust overview of its use in spinal surgery. We synthesize the findings of the existing literature on the use of iUS in navigation, pedicle screw placement, and identification of anatomy during spinal interventions. METHODS Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were utilized in this systematic review. Studies were identified through the PubMed, Scopus, and Google Scholar databases using the search string. Abstracts mentioning iUS in spine applications were included. Upon full-text review, exclusion criteria were implemented, excluding outdated studies and those with weak topic relevance or statistical power. Upon elimination of duplicates, multi-reviewer screening for eligibility, and citation search, 44 manuscripts were analyzed. RESULTS Navigation using iUS is safe, effective, and economical. iUS registration accuracy and success are within clinically acceptable limits for image-guided navigation (Table 2). Pedicle screw instrumentation with iUS is precise, with a favorable safety profile (Table 2). Anatomical landmarks are reliably identified with iUS, and surgeons are overwhelmingly successful in identifying neural or vascular tissue with iUS modalities including standard B-mode, Doppler, and contrast-enhanced ultrasound (CE-US) (Table 3). iUS use in traumatic reduction of fractures properly identifies anatomical structures, intervertebral disc space, and vasculature (Table 3). CONCLUSION iUS eliminates radiation, decreases costs, and provides sufficient accuracy and reliability in the identification of anatomical and neurovascular structures in various spinal surgery settings.
Affiliation(s)
- Madhav R Patel, Kevin C Jacob, Alexander W Parsons, Frank A Chavez, Max A Ribot, Mohammed A Munim, Nisheka N Vanjani, Hanna Pawlowski, Michael C Prabhu, Kern Singh
- Department of Orthopaedic Surgery, Rush University Medical Center, 1611 W. Harrison St. Suite #300, Chicago, IL 60612 (all authors)
21
Tu P, Qin C, Guo Y, Li D, Lungu AJ, Wang H, Chen X. Ultrasound image guided and mixed reality-based surgical system with real-time soft tissue deformation computing for robotic cervical pedicle screw placement. IEEE Trans Biomed Eng 2022; 69:2593-2603. [PMID: 35157575 DOI: 10.1109/tbme.2022.3150952] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
Cervical pedicle screw (CPS) placement surgery remains technically demanding due to the complicated anatomy with neurovascular structures. State-of-the-art surgical navigation and robotic systems still suffer from problems of hand-eye coordination and soft tissue deformation. In this study, we aim to track intraoperative soft tissue deformation, construct a virtual-physical fusion surgical scene, and integrate both into a robotic system for CPS placement surgery. First, we propose a real-time deformation computation method based on a prior shape model and intraoperative partial information acquired from ultrasound images; according to the generated posterior shape, the structural representation of the deformed target tissue is updated continuously. Second, a hand-tremor compensation method is proposed to improve the accuracy and robustness of the virtual-physical calibration procedure, and a mixed reality-based surgical scene is constructed for CPS placement surgery. Third, we integrate the soft tissue deformation method and the virtual-physical fusion method into our previously proposed surgical robotic system and introduce the surgical workflow for CPS placement surgery. We conducted phantom and animal experiments to evaluate the feasibility and accuracy of the proposed system. The system yielded a mean surface distance error of 1.52 ± 0.43 mm for soft tissue deformation computation and an average distance deviation of 1.04 ± 0.27 mm for CPS placement. These results demonstrate substantial potential for clinical application, and the proposed system improves the efficiency and safety of CPS placement surgery.
22
Farshad M, Spirig JM, Suter D, Hoch A, Burkhard MD, Liebmann F, Farshad-Amacker NA, Fürnstahl P. Operator independent reliability of direct augmented reality navigated pedicle screw placement and rod bending. NORTH AMERICAN SPINE SOCIETY JOURNAL 2022; 8:100084. [PMID: 35141649 PMCID: PMC8819958 DOI: 10.1016/j.xnsj.2021.100084] [Citation(s) in RCA: 7] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/04/2021] [Revised: 09/21/2021] [Accepted: 10/02/2021] [Indexed: 12/17/2022]
Abstract
Background AR-based navigation of spine surgery may provide not only accurate surgical execution but also operator independence by compensating for potential skill deficits. "Direct" AR navigation, namely superimposing trajectories directly onto the anatomy, has not been investigated regarding its accuracy and operator dependence. The purpose of this study was to assess the operator-independent reliability and accuracy of both AR-assisted pedicle screw navigation and AR-assisted rod bending in a cadaver setting. Methods Two experienced spine surgeons and two biomedical engineers (laymen) independently performed pedicle screw instrumentation from L1 to L5 in a total of eight lumbar cadaver specimens (20 screws per operator) using a fluoroscopy-free AR-based navigation method. Screw-fitting rods from L1 to S2-ala-ilium were bent bilaterally using an AR-based rod bending navigation method (4 rods per operator). Outcome measures were pedicle perforations, accuracy compared with the preoperative plan, registration time, navigation time, total rod bending time, and operator satisfaction. Results 97.5% of all screws were safely placed (<2 mm perforation). Overall mean deviation from the planned trajectory was 6.8±3.9°, deviation from the planned entry point was 4±2.7 mm, registration time per vertebra was 2:25 min (00:56 to 10:00 min), navigation time per screw was 1:07 min (00:15 to 12:43 min), and rod bending time per rod was 4:22 min (02:07 to 10:39 min). Operator satisfaction with AR-based screw and rod navigation was 5.38±0.67 (scale 1 to 6, 6 being the best). Comparison of surgeons and laymen revealed a significant difference in navigation time (1:01 min; 00:15 to 3:00 min vs. 01:37 min; 00:23 to 12:43 min; p = 0.004) but not in pedicle perforation rate. Conclusions Direct AR-based screw and rod navigation using a surface digitization registration technique is reliable and independent of surgical experience. The accuracy of pedicle screw insertion in the lumbar spine is comparable with current standard techniques.
Affiliation(s)
- Mazda Farshad, José Miguel Spirig, Marco D Burkhard, Florentin Liebmann: University Spine Center Zürich, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
- Daniel Suter: University Spine Center Zürich and ROCS: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
- Armando Hoch, Philipp Fürnstahl: ROCS: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
- Nadja A Farshad-Amacker: Radiology, Balgrist University Hospital, University of Zürich, Forchstrasse 340, 8008 Zürich, Switzerland
23
Yang R, Li C, Tu P, Ahmed A, Ji T, Chen X. Development and Application of Digital Maxillofacial Surgery System Based on Mixed Reality Technology. Front Surg 2022; 8:719985. [PMID: 35174201 PMCID: PMC8841731 DOI: 10.3389/fsurg.2021.719985] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/03/2021] [Accepted: 12/16/2021] [Indexed: 11/23/2022] Open
Abstract
Objective To realize three-dimensional visual output of surgical navigation information by studying the cross-linking of a mixed reality display device and a high-precision optical navigator. Methods A quaternion-based point alignment algorithm was applied to realize the positioning configuration of the mixed reality display device and the high-precision optical navigator, together with real-time patient tracking and calibration; based on open-source SDKs and development tools, a mixed reality surgical navigation system based on visual positioning and tracking was developed. In this study, four patients were selected for mixed reality-assisted tumor resection and reconstruction and were re-examined 1 month after the operation. Postoperative CT scans were reconstructed, error distribution maps were generated with 3DMeshMetric, and error analysis and quality control were completed. Results The cross-linking of the mixed reality display device and the high-precision optical navigator was realized, a digital maxillofacial surgery system based on mixed reality technology was developed, and mixed reality-assisted tumor resection and reconstruction was successfully performed in 4 cases. Conclusions The digital maxillofacial surgery system based on mixed reality technology can superimpose and display three-dimensional navigation information in the surgeon's field of vision. Moreover, it solves the visual conversion and spatial conversion problems of existing navigation systems. It improves the efficiency of digitally assisted surgery, effectively reduces the surgeon's dependence on spatial experience and imagination, and protects important anatomical structures during surgery. It has significant clinical application value and potential.
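The abstract does not detail its "quaternion-based point alignment" step. A standard closed-form choice for rigidly registering paired fiducial points between two coordinate systems (e.g., optical tracker and display) is Horn's quaternion-based absolute-orientation method; the NumPy sketch below is an illustrative implementation under that assumption (the function name and test data are hypothetical, not taken from the paper):

```python
import numpy as np

def horn_quaternion_align(src, dst):
    """Estimate rigid transform (R, t) such that dst ~ R @ src + t,
    using Horn's quaternion-based absolute-orientation method.
    src, dst: (N, 3) arrays of paired corresponding points."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    p_bar, q_bar = src.mean(axis=0), dst.mean(axis=0)
    P, Q = src - p_bar, dst - q_bar            # centered point sets
    S = P.T @ Q                                # 3x3 cross-covariance matrix
    Sxx, Sxy, Sxz = S[0]
    Syx, Syy, Syz = S[1]
    Szx, Szy, Szz = S[2]
    # Horn's symmetric 4x4 matrix; its top eigenvector is the optimal quaternion
    N = np.array([
        [Sxx + Syy + Szz, Syz - Szy,        Szx - Sxz,        Sxy - Syx],
        [Syz - Szy,       Sxx - Syy - Szz,  Sxy + Syx,        Szx + Sxz],
        [Szx - Sxz,       Sxy + Syx,       -Sxx + Syy - Szz,  Syz + Szy],
        [Sxy - Syx,       Szx + Sxz,        Syz + Szy,       -Sxx - Syy + Szz]])
    w, v = np.linalg.eigh(N)                   # symmetric matrix -> eigh
    qw, qx, qy, qz = v[:, np.argmax(w)]       # unit quaternion (sign-ambiguous, harmless)
    # Convert unit quaternion to rotation matrix
    R = np.array([
        [1 - 2*(qy*qy + qz*qz), 2*(qx*qy - qz*qw),     2*(qx*qz + qy*qw)],
        [2*(qx*qy + qz*qw),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - qx*qw)],
        [2*(qx*qz - qy*qw),     2*(qy*qz + qx*qw),     1 - 2*(qx*qx + qy*qy)]])
    t = q_bar - R @ p_bar                      # translation from centroids
    return R, t
```

The method assumes known point correspondences, which in a navigation setting the fiducial markers provide; with noise-free paired points it recovers the transform exactly, and with noisy points it gives the least-squares optimal rotation.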
Affiliation(s)
- Rong Yang, Chenyao Li, Tong Ji: Shanghai Key Laboratory of Stomatology/Shanghai Institute of Stomatology, Department of Oral and Maxillofacial Head and Neck Oncology, National Clinical Research Center for Oral Diseases, School of Medicine, The Ninth People's Hospital, Shanghai Jiao Tong University, Shanghai, China
- Puxun Tu, Xiaojun Chen: School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Abdelrehem Ahmed: Department of Craniomaxillofacial and Plastic Surgery, Faculty of Dentistry, Alexandria University, Alexandria, Egypt
- Correspondence: Tong Ji
24
XR (Extended Reality: Virtual Reality, Augmented Reality, Mixed Reality) Technology in Spine Medicine: Status Quo and Quo Vadis. J Clin Med 2022; 11:jcm11020470. [PMID: 35054164 PMCID: PMC8779726 DOI: 10.3390/jcm11020470] [Citation(s) in RCA: 28] [Impact Index Per Article: 14.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/13/2021] [Revised: 01/01/2022] [Accepted: 01/11/2022] [Indexed: 02/06/2023] Open
Abstract
In recent years, with the rapid advancement and consumerization of virtual reality (VR), augmented reality (AR), mixed reality (MR), and extended reality (XR) technology, the use of XR technology in spine medicine has become increasingly popular. The rising use of XR technology in spine medicine has been accelerated by the recent wave of digital transformation (i.e., case-specific three-dimensional medical images and holograms, wearable sensors, video cameras, fifth-generation (5G) networks, artificial intelligence, and head-mounted displays), and further accelerated by the COVID-19 pandemic and the increase in minimally invasive spine surgery. The COVID-19 pandemic has had a negative impact on society, but positive effects can also be expected, including the continued spread and adoption of telemedicine services (i.e., tele-education, tele-surgery, tele-rehabilitation) that promote digital transformation. The purpose of this narrative review is to describe the accelerators of XR (VR, AR, MR) technology in spine medicine and then to provide a comprehensive review of the use of XR technology in spine medicine, including surgery, consultation, education, and rehabilitation, as well as to identify its limitations and future perspectives (status quo and quo vadis).
25
Augmented Reality (AR) in Orthopedics: Current Applications and Future Directions. Curr Rev Musculoskelet Med 2021; 14:397-405. [PMID: 34751894 DOI: 10.1007/s12178-021-09728-1] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Accepted: 09/27/2021] [Indexed: 01/05/2023]
Abstract
PURPOSE OF REVIEW Imaging technologies (X-ray, CT, MRI, and ultrasound) have revolutionized orthopedic surgery, allowing for more efficient diagnosis, monitoring, and treatment of musculoskeletal ailments. The current review investigates recent literature on the impact of augmented reality (AR) imaging technologies on orthopedic surgery. In particular, it examines the impact that AR technologies may have on provider cognitive burden, operative times, occupational radiation exposure, and surgical precision and outcomes. RECENT FINDINGS Many AR technologies have been shown to lower provider cognitive burden and reduce operative time and radiation exposure while improving surgical precision in pre-clinical cadaveric and sawbones models. So far, only a few platforms, focusing on pedicle screw placement, have been approved by the FDA. These technologies have been implemented clinically with mixed results when compared with traditional free-hand approaches. It remains to be seen whether current AR technologies can deliver upon their multitude of promises, and the ability to do so seems contingent upon continued technological progress. Additionally, the impact of these platforms will likely be highly conditional on clinical indication and provider type. It remains unclear whether AR will be broadly accepted and utilized or reserved for niche indications where it adds significant value. One thing is clear: orthopedics' high utilization of pre- and intra-operative imaging, combined with the relative ease of tracking rigid structures such as bone compared with soft tissues, has made it the clear beachhead market for AR technologies in medicine.
26
Zhang R, Shen J, Liu Q, Qi Y, Wu X, Cai S, Guo J, Xiong X. Augmented reality navigation framework for total hip arthroplasty surgery. J Mech Med Biol 2021. [DOI: 10.1142/s0219519421500676] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
In a total hip arthroplasty surgery, correctly implanting the artificial acetabulum and the femoral head is essential for a successful treatment. An augmented reality (AR) navigation framework is proposed in this paper to provide accurate surgical guidance in a total hip arthroplasty procedure. The AR framework consists of three parts: (1) preoperative surgical planning to generate virtual information for AR; (2) computer vision-based tracking for the real-time localization of both acetabular cup positioner and bony landmarks during surgery; (3) registration of a virtual object onto a real-world operative field to properly overlay the preoperative surgical planning data onto a three-dimensional (3D)-printed pelvis model. The cost-effective framework is designed with our clinical partner based on real surgical conditions. The bony landmarks are automatically detected and used for the registration between virtual and real objects. The AR performance is evaluated with a pelvis model, and it presents mean errors of 2.2 mm and 0.8° in position and orientation, respectively, between real and virtual spaces. These small errors are within the tolerance of positive surgical outcomes.
Affiliation(s)
- Riwei Zhang, Jun Shen, Shuting Cai, Jing Guo, Xiaoming Xiong: School of Automation, Guangdong University of Technology, Guangzhou 510000, P. R. China
- Quanquan Liu: Department of Neurology, The First Affiliated Hospital of Shenzhen University, Shenzhen 518000, P. R. China; Shenzhen Institute of Geriatrics, Shenzhen 518000, P. R. China
- Yong Qi: Department of Joint Orthopedics, Guangdong Second Provincial General Hospital, Guangzhou 510000, P. R. China; The Second School of Clinical Medicine, Southern Medical University, Guangzhou 510000, P. R. China
- Xiaodong Wu: Department of Joint Orthopedics, Guangdong Second Provincial General Hospital, Guangzhou 510000, P. R. China; Department of Neurology, The First Affiliated Hospital of Shenzhen University, Shenzhen 518000, P. R. China
27
Pan J, Yu D, Li R, Huang X, Wang X, Zheng W, Zhu B, Liu X. Multi-Modality guidance based surgical navigation for percutaneous endoscopic transforaminal discectomy. COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE 2021; 212:106460. [PMID: 34736173 DOI: 10.1016/j.cmpb.2021.106460] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/18/2021] [Accepted: 10/06/2021] [Indexed: 06/13/2023]
Abstract
OBJECTIVE Fluoroscopic guidance is a critical step for the puncture procedure in percutaneous endoscopic transforaminal discectomy (PETD). However, two-dimensional observations of the three-dimensional anatomic structure suffer from the effects of projective simplification. To accurately assess the spatial relations between the patient's vertebral tissues and the puncture needle, a considerable number of fluoroscopic images from different orientations must be acquired by the surgeons. This process significantly increases the radiation risk for both the patient and the surgeons. METHODS In this paper, we propose an augmented reality (AR) surgical navigation system for PETD based on multi-modality information comprising fluoroscopy, optical tracking, and a depth camera. To register the fluoroscopic image with the intraoperative video, we design a lightweight non-invasive fiducial with markers and detect the markers with a deep learning method. The system can display the intraoperative video fused with the registered fluoroscopic images. We also present a self-adaptive calibration and transformation method between a 6-DOF optical tracking device and a depth camera, which are in different coordinate systems. RESULTS With a substantially reduced frequency of fluoroscopy imaging, the system can accurately track and superimpose the virtual puncture needle on fluoroscopy images in real time. In in vivo animal experiments in the operating theatre, the system's average positioning accuracy reached 1.98 mm and its orientation accuracy reached 1.19°. In clinical validation, the system significantly lowered the frequency of fluoroscopy imaging (by 42.7%) and reduced the radiation risk for both the patient and the surgeons. CONCLUSION Coupled with the user study, both the quantitative and qualitative results indicate that our navigation system has the potential to be highly useful in clinical practice. Compared with existing navigation systems, which are usually equipped with a variety of large, high-cost medical equipment such as O-arms, cone-beam CT, and robots, our system does not need special equipment and can be implemented with common operating-room equipment, such as a C-arm and a desktop computer, even in small hospitals.
Affiliation(s)
- Junjun Pan, Ranyang Li: State Key Laboratory of Virtual Reality Technology and Systems, Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100191, China; PENG CHENG Laboratory, Shenzhen 518000, China
- Dongfang Yu, Xinliang Wang, Wenhao Zheng: State Key Laboratory of Virtual Reality Technology and Systems, Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100191, China
- Xin Huang, Bin Zhu, Xiaoguang Liu: The Pain Medicine Center, Peking University Third Hospital, Beijing, China
28
Augmented and virtual reality in spine surgery, current applications and future potentials. Spine J 2021; 21:1617-1625. [PMID: 33774210 DOI: 10.1016/j.spinee.2021.03.018] [Citation(s) in RCA: 67] [Impact Index Per Article: 22.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/15/2021] [Accepted: 03/17/2021] [Indexed: 02/03/2023]
Abstract
BACKGROUND CONTEXT The field of artificial intelligence (AI) is rapidly advancing, especially with recent improvements in deep learning (DL) techniques. Augmented reality (AR) and virtual reality (VR) are finding their place in healthcare, and spine surgery is no exception. The unique capabilities and advantages of AR and VR devices include their low cost, flexible integration with other technologies, user-friendly features, and application in navigation systems, which make them beneficial across different aspects of spine surgery. Despite the use of AR for pedicle screw placement, targeted cervical foraminotomy, bone biopsy, osteotomy planning, and percutaneous intervention, the current applications of AR and VR in spine surgery remain limited. PURPOSE The primary goal of this study was to provide spine surgeons and clinical researchers with general information about the current applications, future potential, and accessibility of AR and VR systems in spine surgery. STUDY DESIGN/SETTING We reviewed the titles of more than 250 journal papers from Google Scholar and PubMed with the search words augmented reality, virtual reality, spine surgery, and orthopaedic, out of which 89 related papers were selected for abstract review. Finally, the full text of 67 papers was analyzed and reviewed. METHODS The papers were divided into four groups: technological papers, applications in surgery, applications in spine education and training, and general applications in orthopaedics. A team of two reviewers performed paper reviews and a thorough web search to ensure that the most updated state of the art in each of the four groups was captured in the review. RESULTS In this review we discuss the current state of the art in AR and VR hardware and their preoperative and surgical applications in spine surgery. Finally, we discuss the future potential of AR and VR and their integration with AI, robotic surgery, gaming, and wearables. CONCLUSIONS AR and VR are promising technologies that will soon become part of the standard of care in spine surgery.
29
Augmented reality-navigated pedicle screw placement: a cadaveric pilot study. Eur Spine J 2021; 30:3731-3737. [PMID: 34350487 DOI: 10.1007/s00586-021-06950-w] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/08/2020] [Revised: 11/04/2020] [Accepted: 07/25/2021] [Indexed: 10/20/2022]
Abstract
PURPOSE Augmented reality (AR) is an emerging technology with great potential for surgical navigation through its ability to provide 3D holographic projection of otherwise hidden anatomical information. This pilot cadaver study investigated the feasibility and accuracy of one of the first holographic navigation techniques for lumbar pedicle screw placement. METHODS Lumbar computed tomography (CT) scans of two cadaver specimens and their reconstructed 3D models were used for pedicle screw trajectory planning. Planned trajectories and 3D models were subsequently uploaded to an AR head-mounted device. K-wires were randomly placed into either the left or the right pedicle of each vertebra (L1-5), with or without AR navigation (by holographic projection of the planned trajectory). CT scans were subsequently performed to assess the accuracy of both techniques. RESULTS A total of 18 k-wires were placed (8 navigated, 10 freehand) by two experienced spine surgeons. In two vertebrae, AR navigation was aborted because registration of the preoperative plan with the intraoperative anatomy was imprecise due to a technical failure. The average difference of the screw entry points between planning and execution was 4.74 ± 2.37 mm for the freehand technique and 5.99 ± 3.60 mm for the AR-navigated technique (p = 0.39). The average deviation from the planned trajectories was 11.21° ± 7.64° for the freehand technique and 5.88° ± 3.69° for the AR-navigated technique (p = 0.09). CONCLUSION This pilot study demonstrates improved angular precision in one of the first AR-navigated pedicle screw placement studies worldwide. Technical shortcomings need to be eliminated before potential clinical application.
30
Yahanda AT, Moore E, Ray WZ, Pennicooke B, Jennings JW, Molina CA. First in-human report of the clinical accuracy of thoracolumbar percutaneous pedicle screw placement using augmented reality guidance. Neurosurg Focus 2021; 51:E10. [PMID: 34333484 DOI: 10.3171/2021.5.focus21217] [Citation(s) in RCA: 25] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/01/2021] [Indexed: 11/06/2022]
Abstract
OBJECTIVE Augmented reality (AR) is an emerging technology that has great potential for guiding the safe and accurate placement of spinal hardware, including percutaneous pedicle screws. The goal of this study was to assess the accuracy of 63 percutaneous pedicle screws placed at a single institution using an AR head-mounted display (ARHMD) system. METHODS Retrospective analyses were performed for 9 patients who underwent thoracic and/or lumbar percutaneous pedicle screw placement guided by ARHMD technology. Clinical accuracy was assessed via the Gertzbein-Robbins scale by the authors and by an independent musculoskeletal radiologist. Thoracic pedicle subanalysis was also performed to assess screw accuracy based on pedicle morphology. RESULTS Nine patients received thoracic or lumbar AR-guided percutaneous pedicle screws. The mean age at the time of surgery was 71.9 ± 11.5 years and the mean number of screws per patient was 7. Indications for surgery were spinal tumors (n = 4, 44.4%), degenerative disease (n = 3, 33.3%), spinal deformity (n = 1, 11.1%), and a combination of deformity and infection (n = 1, 11.1%). Presenting symptoms were most commonly low-back pain (n = 7, 77.8%) and lower-extremity weakness (n = 5, 55.6%), followed by radicular lower-extremity pain, loss of lower-extremity sensation, or incontinence/urinary retention (n = 3 each, 33.3%). In all, 63 screws were placed (32 thoracic, 31 lumbar). The accuracy for these screws was 100% overall; all screws were Gertzbein-Robbins grade A or B (96.8% grade A, 3.2% grade B). This accuracy was achieved in the thoracic spine regardless of pedicle cancellous bone morphology. CONCLUSIONS AR-guided surgery demonstrated a 100% accuracy rate for the insertion of 63 percutaneous pedicle screws in 9 patients (100% rate of Gertzbein-Robbins grade A or B screw placement). 
Using an ARHMD system for the placement of percutaneous pedicle screws showed promise, but further validation using a larger cohort of patients across multiple surgeons and institutions will help to determine the true accuracy enabled by this technology.
Affiliation(s)
- Emelia Moore: Wayne State University School of Medicine, Detroit, Michigan
- Jack W Jennings: Radiology, Washington University School of Medicine in St. Louis, Missouri
31
Farshad M, Fürnstahl P, Spirig JM. First in man in-situ augmented reality pedicle screw navigation. NORTH AMERICAN SPINE SOCIETY JOURNAL 2021; 6:100065. [PMID: 35141630 PMCID: PMC8819976 DOI: 10.1016/j.xnsj.2021.100065] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/05/2021] [Revised: 04/17/2021] [Accepted: 04/20/2021] [Indexed: 12/29/2022]
Abstract
Background Augmented reality (AR) is a rising technology gaining increasing utility in medicine. By superimposing computer-generated information onto the surgical site in the operator's visual field, it has the potential to enhance the cognitive skills of surgeons. This is the report of the first-in-man case with "direct holographic navigation" as part of a randomized controlled trial. Case description A pointing instrument was equipped with a sterile fiducial marker, which was used to obtain a digital representation of the intraoperative bony anatomy of the lumbar spine. Subsequently, a previously validated registration method was applied to superimpose the surgical plan onto the intraoperative anatomy. The registration result was shown in situ as a 3D AR hologram of the preoperative 3D vertebra model with the planned screw trajectory and entry point, for validation and approval by the surgeon. After achieving alignment with the surgical plan, a borehole was drilled and the pedicle screw placed. Postoperative computed tomography was used to measure the accuracy of this novel method for surgical navigation. Outcome Correct screw positions entirely within bone were documented on the postoperative CT, with an accuracy similar to current standard-of-care methods for surgical navigation. The patient was mobilized uneventfully on the first postoperative day with little pain medication and was discharged on the fourth postoperative day. Conclusion This first-in-man report of direct AR navigation demonstrates feasibility in vivo. The continuation of this randomized controlled study will evaluate the value of this novel technology.
Affiliation(s)
- Mazda Farshad (corresponding author), José Miguel Spirig: Spine Division, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
- Philipp Fürnstahl: ROCS: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
32
Augmented reality in the operating room: a clinical feasibility study. BMC Musculoskelet Disord 2021; 22:451. [PMID: 34006234 PMCID: PMC8132365 DOI: 10.1186/s12891-021-04339-w] [Citation(s) in RCA: 27] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 11/05/2020] [Accepted: 05/06/2021] [Indexed: 11/20/2022] Open
Abstract
Background Augmented reality (AR) is a rapidly emerging technology finding growing acceptance and application in different fields of surgery. Various studies have evaluated the precision and accuracy of AR-guided navigation. This study investigates the feasibility of a commercially available AR head-mounted device during orthopedic surgery. Methods Thirteen orthopedic surgeons from a Swiss university clinic performed 25 orthopedic surgical procedures wearing a holographic AR headset (HoloLens, Microsoft, Redmond, WA, USA) providing complementary three-dimensional, patient-specific anatomic information. The surgeons' experience of using the device during surgery was recorded using a standardized 58-item questionnaire grading different aspects on a 100-point scale with anchor statements. Results Surgeons were generally satisfied with image quality (85 ± 17 points) and the accuracy of the virtual objects (84 ± 19 points). Wearing the AR device was rated as fairly comfortable (79 ± 13 points). Functionality of voice commands (68 ± 20 points) and gestures (66 ± 20 points) was rated less favorably. The greatest potential of the AR device was seen in surgical correction of deformities (87 ± 15 points). Overall, surgeons were satisfied with the application of this novel technology (78 ± 20 points) and demanded future access to it (75 ± 22 points). Conclusion AR is a rapidly evolving technology with large potential in different surgical settings, offering a compact, low-cost alternative requiring a minimum of infrastructure compared with conventional navigation systems. While surgeons were generally satisfied with the image quality of the head-mounted AR device tested here, some technical and ergonomic shortcomings were pointed out. This study serves as a proof of concept for the use of an AR head-mounted device in a real-world sterile setting in orthopedic surgery.
Supplementary Information The online version contains supplementary material available at 10.1186/s12891-021-04339-w.
|
33
|
Pojskić M, Bopp M, Saß B, Kirschbaum A, Nimsky C, Carl B. Intraoperative Computed Tomography-Based Navigation with Augmented Reality for Lateral Approaches to the Spine. Brain Sci 2021; 11:646. [PMID: 34063546 PMCID: PMC8156391 DOI: 10.3390/brainsci11050646]
Abstract
Background. Lateral approaches to the spine have gained popularity because they enable minimally invasive access to the spine with less blood loss, shorter operative time, and less postoperative pain. The objective of this study was to analyze the use of intraoperative computed tomography (iCT) with navigation and the implementation of augmented reality (AR) in facilitating a lateral approach to the spine. Methods. We prospectively analyzed all patients who underwent surgery via a lateral approach to the spine from September 2016 to January 2021 using a 32-slice movable iCT scanner, which was used for automatic navigation registration. Sixteen patients, with a median age of 64.3 years, were operated on via a lateral approach to the thoracic and lumbar spine using iCT-based navigation. Indications included herniated disc (six patients), tumors (seven), instability following fracture of a thoracic or lumbar vertebra (two), and spondylodiscitis (one). Results. Automatic registration using iCT resulted in high accuracy (target registration error: 0.84 ± 0.10 mm). The effective dose (ED) of the registration CT scans was 6.16 ± 3.91 mSv. In seven patients, a control iCT scan was performed for resection and implant control, with an ED of 4.51 ± 2.48 mSv. AR was used to support surgery in 11 cases by visualizing the tumor outline, pedicle screws, herniated discs, and surrounding structures. Corpectomy with implantation of an expandable cage was performed in six of the 16 patients, and one patient underwent discectomy using the XLIF technique. One patient experienced perioperative complications, and one patient died in the early postoperative course due to severe cardiorespiratory failure. Ten patients had improved and five had unchanged neurological status at the 3-month follow-up. Conclusions. Intraoperative computed tomography with navigation facilitates lateral approaches to the spine for a variety of indications, including fusion procedures, tumor resection, and herniated disc surgery.
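The target registration error reported in this entry is, in essence, the residual distance between corresponding landmark positions after registration. As an editorial illustration only (the landmark coordinates below are made up, not data from the study), such an error could be computed as:

```python
import numpy as np

def target_registration_error(registered_pts, reference_pts):
    """Mean and SD of Euclidean distances between corresponding
    landmark positions after navigation registration (in mm)."""
    d = np.linalg.norm(np.asarray(registered_pts, float)
                       - np.asarray(reference_pts, float), axis=1)
    return d.mean(), d.std()

# Illustrative landmark coordinates in mm (hypothetical values).
nav = [[10.1, 20.0, 30.2], [15.0, 25.1, 35.0], [20.2, 30.0, 40.1]]
ct  = [[10.0, 20.0, 30.0], [15.0, 25.0, 35.0], [20.0, 30.0, 40.0]]
mean_err, sd_err = target_registration_error(nav, ct)
```

Reporting the mean ± SD of these per-landmark distances is what produces figures of the form "0.84 ± 0.10 mm".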
Affiliation(s)
- Mirza Pojskić
  - Department of Neurosurgery, University of Marburg, Baldingerstraße, 35043 Marburg, Germany
  - Correspondence: ; Tel.: +49-64215869848
- Miriam Bopp
  - Department of Neurosurgery, University of Marburg, Baldingerstraße, 35043 Marburg, Germany
  - Marburg Center for Mind, Brain and Behavior (MCMBB), 35043 Marburg, Germany
- Benjamin Saß
  - Department of Neurosurgery, University of Marburg, Baldingerstraße, 35043 Marburg, Germany
- Andreas Kirschbaum
  - Department of Visceral, Thoracic and Vascular Surgery, University of Marburg, 35043 Marburg, Germany
- Christopher Nimsky
  - Department of Neurosurgery, University of Marburg, Baldingerstraße, 35043 Marburg, Germany
  - Marburg Center for Mind, Brain and Behavior (MCMBB), 35043 Marburg, Germany
- Barbara Carl
  - Department of Neurosurgery, University of Marburg, Baldingerstraße, 35043 Marburg, Germany
  - Department of Neurosurgery, Helios Dr. Horst Schmidt Kliniken, 65199 Wiesbaden, Germany
|
34
|
Augmented Reality, Virtual Reality and Artificial Intelligence in Orthopedic Surgery: A Systematic Review. Appl Sci (Basel) 2021. [DOI: 10.3390/app11073253]
Abstract
Background: The application of virtual and augmented reality technologies to orthopaedic surgery training and practice aims to increase the safety and accuracy of procedures and to reduce complications and costs. The purpose of this systematic review is to summarise the present literature on this topic while providing a detailed analysis of its current flaws and benefits. Methods: A comprehensive search of the PubMed, Cochrane, CINAHL, and Embase databases was conducted from inception to February 2021. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were used to improve the reporting of the review. The Cochrane Risk of Bias Tool and the Methodological Index for Non-Randomized Studies (MINORS) were used to assess the quality and potential bias of the included randomized and non-randomized controlled trials, respectively. Results: Virtual reality has proven revolutionary for both resident training and preoperative planning. Thanks to augmented reality, orthopaedic surgeons can carry out procedures faster and more accurately, improving overall safety. Artificial intelligence (AI) is a promising technology with great potential, but its use in orthopaedic surgery is currently limited to preoperative diagnosis. Conclusions: Extended reality technologies have the potential to reform orthopaedic training and practice, providing an opportunity for growth towards a patient-centred approach.
|
35
|
Chen F, Cui X, Han B, Liu J, Zhang X, Liao H. Augmented reality navigation for minimally invasive knee surgery using enhanced arthroscopy. Comput Methods Programs Biomed 2021; 201:105952. [PMID: 33561710 DOI: 10.1016/j.cmpb.2021.105952]
Abstract
PURPOSE During minimally invasive knee surgery, surgeons insert surgical instruments and an arthroscope through small incisions and carry out treatment guided by 2D arthroscopic images. However, this 2D arthroscopic navigation faces several problems. First, the guidance information is displayed on a screen away from the surgical area, which makes hand-eye coordination difficult. Second, the small incision limits the surgeons to viewing the internal knee structures only through the arthroscopic camera. In addition, arthroscopic images often suffer from obscured vision. METHODS To solve these problems, we proposed a novel in-situ augmented reality navigation system with enhanced arthroscopic information. First, intraoperative anatomical locations were obtained using arthroscopic images and arthroscopy calibration. Second, a tissue-properties-based model deformation method was proposed to update the 3D preoperative knee model with the anatomical location information. The updated model was then rendered with a glasses-free real 3D display to achieve a global in-situ augmented reality view. In addition, virtual arthroscopic images were generated from the updated preoperative model to provide anatomical information about the operation area. RESULTS Experimental results demonstrated that the virtual arthroscopic images reflected the correct structural information with a mean error of 0.32 mm. Compared with 2D arthroscopic navigation, the proposed augmented reality navigation reduced targeting errors by 2.10 mm and 2.70 mm in experiments on a knee phantom and an in-vitro swine knee, respectively. CONCLUSION Our navigation method is helpful for minimally invasive knee surgery because it provides both global in-situ information and detailed anatomical information.
Affiliation(s)
- Fang Chen
  - Department of Computer Science and Engineering, Nanjing University of Aeronautics and Astronautics, MIIT Key Laboratory of Pattern Analysis and Machine Intelligence, Nanjing, China
- Xiwen Cui
  - Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Boxuan Han
  - Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Jia Liu
  - Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Xinran Zhang
  - Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Hongen Liao
  - Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
|
36
|
Burström G, Persson O, Edström E, Elmi-Terander A. Augmented reality navigation in spine surgery: a systematic review. Acta Neurochir (Wien) 2021; 163:843-852. [PMID: 33506289 PMCID: PMC7886712 DOI: 10.1007/s00701-021-04708-3]
Abstract
BACKGROUND Conventional spinal navigation solutions have been criticized for their negative impact on operating room time and workflow. Augmented reality (AR) navigation could potentially alleviate some of these concerns while retaining the benefits of navigated spine surgery. The objective of this study was to summarize the current evidence for using AR navigation in spine surgery. METHODS We performed a systematic review to explore the current evidence for using AR navigation in spine surgery. PubMed and Web of Science were searched from database inception to November 27, 2020, for data on AR navigation solutions; the reported efficacy of the systems; and their impact on workflow, radiation, and cost-benefit relationships. RESULTS Twenty-eight studies were included in the final analysis. The main findings were superior workflow and non-inferior accuracy when comparing AR to free-hand (FH) or conventional surgical navigation techniques. A limited number of studies indicated decreased radiation exposure. No studies reported mortality, morbidity, or cost-benefit relationships. CONCLUSIONS AR provides a meaningful addition to FH surgery and traditional navigation methods for spine surgery. However, the current evidence base is limited, and prospective studies on clinical outcomes and cost-benefit relationships are needed.
|
37
|
Yuk FJ, Maragkos GA, Sato K, Steinberger J. Current innovation in virtual and augmented reality in spine surgery. Ann Transl Med 2021; 9:94. [PMID: 33553387 PMCID: PMC7859743 DOI: 10.21037/atm-20-1132]
Abstract
In spinal surgery, outcomes are directly related to patient and procedure selection as well as to the accuracy and precision of the instrumentation placed. Poorly placed instrumentation can lead to spinal cord, nerve root, or vascular injury. Traditionally, spine surgery was performed by open methods, with instrumentation placed under direct visualization. However, minimally invasive surgery (MIS) has seen substantial advances in the spine, with an ever-increasing range of indications and procedures. For these reasons, novel methods to visualize anatomy and precisely guide surgery, such as intraoperative navigation, are extremely useful in this field. In this review, we present recent advances and innovations utilizing simulation methods in spine surgery. The application of these techniques is still relatively new, but they are quickly being integrated both inside and outside the operating room. These include virtual reality (VR), where the entire simulation is virtual; mixed reality (MR), a combination of virtual and physical components; and augmented reality (AR), the superimposition of a virtual component onto physical reality. VR and MR have primarily found applications in teaching and preparatory roles, while AR is mainly applied in hands-on surgical settings. The present review provides an overview of the latest advances and applications of these methods in the neurosurgical spine setting.
Affiliation(s)
- Frank J Yuk
  - Department of Neurosurgery, Mount Sinai Hospital, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Georgios A Maragkos
  - Department of Neurosurgery, Mount Sinai Hospital, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Kosuke Sato
  - Hospital for Special Surgery, New York, NY, USA
- Jeremy Steinberger
  - Department of Neurosurgery, Mount Sinai Hospital, Icahn School of Medicine at Mount Sinai, New York, NY, USA
|
38
|
Lungu AJ, Swinkels W, Claesen L, Tu P, Egger J, Chen X. A review on the applications of virtual reality, augmented reality and mixed reality in surgical simulation: an extension to different kinds of surgery. Expert Rev Med Devices 2020; 18:47-62. [PMID: 33283563 DOI: 10.1080/17434440.2021.1860750]
Abstract
Background: Research indicates that the apprenticeship model, the gold standard for training surgical residents, is obsolete. For that reason, there is a continuing effort toward the development of high-fidelity surgical simulators to replace it. Applying virtual reality (VR), augmented reality (AR), and mixed reality (MR) in surgical simulators increases their fidelity, level of immersion, and overall experience. Areas covered: The objective of this review is to provide a comprehensive overview of the application of VR, AR, and MR in distinct surgical disciplines, including maxillofacial surgery and neurosurgery. Current developments in these areas, as well as potential future directions, are discussed. Expert opinion: The key components for incorporating VR into surgical simulators are visual and haptic rendering. These components ensure that the user is completely immersed in the virtual environment and can interact with it as in the physical world. The key components for applying AR and MR in surgical simulators are the tracking system and visual rendering. The advantages of these surgical simulators are the ability to perform user evaluations and to increase the training frequency of surgical residents.
Affiliation(s)
- Abel J Lungu
  - Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Wout Swinkels
  - Computational Sensing Systems, Department of Engineering Technology, Hasselt University, Diepenbeek, Belgium
- Luc Claesen
  - Computational Sensing Systems, Department of Engineering Technology, Hasselt University, Diepenbeek, Belgium
- Puxun Tu
  - Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Jan Egger
  - Institute of Computer Graphics and Vision, Graz University of Technology, Graz, Austria
  - Department of Oral & Maxillofacial Surgery, Medical University of Graz, Graz, Austria
  - Laboratory of Computer Algorithms for Medicine, Medical University of Graz, Graz, Austria
- Xiaojun Chen
  - Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
|
39
|
Salmas M, Fiska A, Vassiou A, Demesticha T, Paraskevas G, Protogerou V, Chytas D. Letter to the Editor Regarding "Enhancing Reality: A Systematic Review of Augmented Reality in Neuronavigation and Education". World Neurosurg 2020; 140:430-431. [PMID: 32797957 DOI: 10.1016/j.wneu.2020.04.213]
Affiliation(s)
- Marios Salmas
  - Department of Anatomy, School of Medicine, National and Kapodistrian University of Athens, Athens, Greece
- Aliki Fiska
  - Department of Anatomy, School of Medicine, Democritus University of Thrace, Alexandroupolis, Greece
- Aikaterini Vassiou
  - Department of Anatomy, Faculty of Medicine, University of Thessaly, Larissa, Greece
- Theano Demesticha
  - Department of Anatomy, School of Medicine, National and Kapodistrian University of Athens, Athens, Greece
- Georgios Paraskevas
  - Department of Anatomy and Surgical Anatomy, Faculty of Medicine, Aristotle University of Thessaloniki, Thessaloniki, Greece
- Vassilios Protogerou
  - Department of Anatomy, School of Medicine, National and Kapodistrian University of Athens, Athens, Greece
- Dimitrios Chytas
  - Department of Anatomy, School of Medicine, European University of Cyprus, Engomi, Nicosia, Cyprus
  - 2nd Orthopedic Department, School of Medicine, National and Kapodistrian University of Athens, Nea Ionia, Greece
|
40
|
Gueziri HE, Santaguida C, Collins DL. The state-of-the-art in ultrasound-guided spine interventions. Med Image Anal 2020; 65:101769. [PMID: 32668375 DOI: 10.1016/j.media.2020.101769]
Abstract
During the last two decades, intra-operative ultrasound (iUS) imaging has been employed for various surgical procedures of the spine, including spinal fusion and needle injections. Accurate and efficient registration of pre-operative computed tomography or magnetic resonance images with iUS images is a key element in the success of iUS-based spine navigation. While widely investigated in research, iUS-based spine navigation has not yet been established in the clinic. This is due to several factors, including the lack of a standard methodology for assessing the accuracy, robustness, reliability, and usability of the registration method. To address these issues, we present a systematic review of state-of-the-art techniques for iUS-guided registration in spinal image-guided surgery (IGS). The review follows a new taxonomy based on the four steps of the surgical workflow: pre-processing, registration initialization, estimation of the required patient-to-image transformation, and visualization. We provide a detailed analysis of the measures of accuracy, robustness, reliability, and usability that need to be met during the evaluation of a spinal IGS framework. Although this review focuses on spinal navigation, we expect similar evaluation criteria to be relevant for other IGS applications.
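The "patient to image transformation" step in the taxonomy above is, in the rigid case, a rotation plus translation estimated from paired points. As an editorial illustration only (not code from any reviewed system; the fiducial coordinates are invented), the classic Kabsch/Procrustes least-squares solution can be sketched as:

```python
import numpy as np

def rigid_registration(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst,
    via the Kabsch/Procrustes SVD solution."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Illustrative fiducials: dst is src rotated 90 degrees about z, then shifted.
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
dst = src @ Rz.T + np.array([5.0, 2.0, 1.0])
R, t = rigid_registration(src, dst)
```

In practice the point pairs would come from segmented or tracked landmarks in the iUS and pre-operative images, and deformable methods extend this rigid initialization.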
Affiliation(s)
- Houssem-Eddine Gueziri
  - McConnell Brain Imaging Center, Montreal Neurological Institute and Hospital, Montreal (QC), Canada
  - McGill University, Montreal (QC), Canada
- Carlo Santaguida
  - Department of Neurology and Neurosurgery, McGill University Health Center, Montreal (QC), Canada
- D Louis Collins
  - McConnell Brain Imaging Center, Montreal Neurological Institute and Hospital, Montreal (QC), Canada
  - McGill University, Montreal (QC), Canada
|
41
|
Dennler C, Jaberg L, Spirig J, Agten C, Götschi T, Fürnstahl P, Farshad M. Augmented reality-based navigation increases precision of pedicle screw insertion. J Orthop Surg Res 2020; 15:174. [PMID: 32410636 PMCID: PMC7227090 DOI: 10.1186/s13018-020-01690-x]
Abstract
Background Precise insertion of pedicle screws is important to avoid injury to closely adjacent neurovascular structures. The standard method for pedicle screw insertion is based on anatomical landmarks (the free-hand technique). Head-mounted augmented reality (AR) devices can be used to guide instrumentation and implant placement in spinal surgery. This study evaluates the feasibility and precision of AR technology for pedicle screw insertion compared with the current standard technique. Methods Two board-certified orthopedic surgeons specialized in spine surgery and two novice surgeons were each instructed to drill pilot holes for 40 pedicle screws in eighty lumbar vertebra sawbones models embedded in an agar-based gel. One hundred and sixty pedicles were randomized into two groups: the standard free-hand (FH) technique and the augmented reality (AR) technique. A 3D model of the vertebral body was superimposed in the AR headset. Half of the pedicles were drilled using the FH method and the other half using the AR method. Results The average minimal distance of the drill axis to the pedicle wall (MAPW) was similar in both groups for expert surgeons (FH 4.8 ± 1.0 mm vs. AR 5.0 ± 1.4 mm, p = 0.389) but differed for novice surgeons (FH 3.4 ± 1.8 mm vs. AR 4.2 ± 1.8 mm, p = 0.044). Expert surgeons had no primary drill pedicle perforations (PDPP) in either the FH or the AR group. Novices had 3 (7.5%) PDPP in the FH group and one perforation (2.5%) in the AR group (p > 0.05). Experts showed no statistically significant difference in average secondary screw pedicle perforations (SSPP) between the AR and FH groups for 6-, 7-, and 8-mm screws (p > 0.05). Novices showed significant differences in SSPP between most groups: 6-mm screws, 18 (45%) vs. 7 (17.5%), p = 0.006; 7-mm screws, 20 (50%) vs. 10 (25%), p = 0.013; and 8-mm screws, 22 (55%) vs. 15 (37.5%), p = 0.053, in the FH and AR groups, respectively. In novices, the average optimal medio-lateral convergent angle (oMLCA) was 3.23° (SD 4.90) and 0.62° (SD 4.56) for the FH and AR set screws, respectively (p = 0.017). Novices also drilled with higher precision with respect to the cranio-caudal inclination angle (CCIA) category with AR (p = 0.04). Conclusion In this study, the additional anatomical information provided by the AR headset, superimposed onto the real-world anatomy, improved the precision of drilling pilot holes for pedicle screws in a laboratory setting and decreased the effect of surgeon experience. Further technical development and validation studies are being performed to investigate the potential clinical benefits of the AR-based navigation approach described here.
Affiliation(s)
- Cyrill Dennler
  - Spine Division, University Hospital Balgrist, University of Zürich, Forchstrasse 340, 8008, Zurich, Switzerland
- Laurenz Jaberg
  - Spine Division, University Hospital Balgrist, University of Zürich, Forchstrasse 340, 8008, Zurich, Switzerland
- José Spirig
  - Spine Division, University Hospital Balgrist, University of Zürich, Forchstrasse 340, 8008, Zurich, Switzerland
- Christoph Agten
  - Department of Radiology, University Hospital Balgrist, University of Zürich, Zurich, Switzerland
- Tobias Götschi
  - Laboratory for Biomechanics, University Hospital Balgrist, University of Zürich, Zurich, Switzerland
- Philipp Fürnstahl
  - Computer Assisted Research and Development Group, University Hospital Balgrist, University of Zürich, Zurich, Switzerland
- Mazda Farshad
  - Spine Division, University Hospital Balgrist, University of Zürich, Forchstrasse 340, 8008, Zurich, Switzerland
|
42
|
Abstract
STUDY DESIGN A prospective, case-based, observational study. OBJECTIVES To investigate how microscope-based augmented reality (AR) support can be utilized in various types of spine surgery. METHODS In 42 spinal procedures (12 intradural and 8 extradural tumors, 7 other intradural lesions, 11 degenerative cases, 2 infections, and 2 deformities), AR was implemented using operating microscope head-up displays (HUDs). Intraoperative low-dose computed tomography was used for automatic registration. Nonlinear image registration was applied to integrate multimodal preoperative images. Target and risk structures displayed by AR were defined in preoperative images by automatic anatomical mapping and additional manual segmentation. RESULTS AR was successfully applied in all 42 cases. Low-dose protocols ensured low radiation exposure for registration scanning (effective dose: cervical 0.29 ± 0.17 mSv, thoracic 3.40 ± 2.38 mSv, lumbar 3.05 ± 0.89 mSv). A low registration error (0.87 ± 0.28 mm) resulted in a reliable AR representation, with close matching of visualized objects and reality, distinctly supporting anatomical orientation in the surgical field. Flexible AR visualization applying either the microscope HUD or video superimposition, including the ability to selectively activate objects of interest as well as different display modes, allowed smooth integration into the surgical workflow without disturbing the actual procedure. On average, 7.1 ± 4.6 objects were displayed, reliably visualizing target and risk structures. CONCLUSIONS Microscope-based AR can be applied successfully to various kinds of spinal procedures. AR improves anatomical orientation in the surgical field, supports the surgeon, and offers a potential tool for education.
Affiliation(s)
- Barbara Carl
  - Department of Neurosurgery, University Marburg, Marburg, Germany
- Miriam Bopp
  - Department of Neurosurgery, University Marburg, Marburg, Germany
  - Marburg Center for Mind, Brain and Behavior (MCMBB), Marburg, Germany
- Benjamin Saß
  - Department of Neurosurgery, University Marburg, Marburg, Germany
- Mirza Pojskic
  - Department of Neurosurgery, University Marburg, Marburg, Germany
- Christopher Nimsky
  - Department of Neurosurgery, University Marburg, Marburg, Germany
  - Marburg Center for Mind, Brain and Behavior (MCMBB), Marburg, Germany
|
43
|
Müller F, Roner S, Liebmann F, Spirig JM, Fürnstahl P, Farshad M. Augmented reality navigation for spinal pedicle screw instrumentation using intraoperative 3D imaging. Spine J 2020; 20:621-628. [PMID: 31669611 DOI: 10.1016/j.spinee.2019.10.012]
Abstract
BACKGROUND CONTEXT Owing to recent developments in augmented reality with head-mounted devices, holograms of a surgical plan can be displayed directly in the surgeon's field of view. To the best of our knowledge, three-dimensional (3D) intraoperative fluoroscopy has not been explored for use with holographic navigation by head-mounted devices in spine surgery. PURPOSE To evaluate the surgical accuracy of holographic pedicle screw navigation by head-mounted device using 3D intraoperative fluoroscopy. STUDY DESIGN In this experimental cadaver study, the accuracy of surgical navigation using a head-mounted device was compared with navigation using a state-of-the-art pose-tracking system. METHODS Three lumbar cadaver spines were embedded in nontransparent agar gel, leaving only commonly visible anatomy in sight. Intraoperative registration of the preoperative plan was achieved by 3D fluoroscopy and fiducial markers attached to the lumbar vertebrae. Trackable custom-made drill sleeve guides enabled real-time navigation. In total, 20 K-wires were navigated into lumbar pedicles using AR navigation and 10 K-wires using the state-of-the-art pose-tracking system. 3D models obtained from postexperimental CT scans were used to measure surgical accuracy. RESULTS No significant difference in accuracy was measured between AR-navigated drillings and the gold standard with the pose-tracking system, with mean translational errors between entry points (3D vector distance; p=.85) of 3.4±1.6 mm compared with 3.2±2.0 mm, and mean angular errors between trajectories (3D angle; p=.30) of 4.3°±2.3° compared with 3.5°±1.4°. CONCLUSIONS Holographic navigation using a head-mounted device achieves accuracy comparable to the gold standard of high-end pose-tracking systems. CLINICAL SIGNIFICANCE These promising results could lead to a new form of surgical navigation with minimal infrastructural requirements but must now be confirmed in clinical studies. DISCLOSURES MF is the founder and a shareholder of Incremed AG, a Balgrist University Hospital start-up focusing on the development of innovative techniques for surgical executions. The other authors declare no conflict of interest concerning the contents of this study. No external funding was received for this study.
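The two accuracy metrics in this entry (translational error as the 3D distance between planned and achieved entry points, angular error as the 3D angle between trajectories) can be computed directly from the drill axes. A minimal sketch, with invented coordinates rather than measurements from the study:

```python
import numpy as np

def trajectory_errors(entry_plan, dir_plan, entry_real, dir_real):
    """Translational error: 3D vector distance between entry points (mm).
    Angular error: 3D angle between trajectory directions (degrees)."""
    trans = float(np.linalg.norm(np.asarray(entry_real, float)
                                 - np.asarray(entry_plan, float)))
    u = np.asarray(dir_plan, float)
    u = u / np.linalg.norm(u)
    v = np.asarray(dir_real, float)
    v = v / np.linalg.norm(v)
    # clip guards against rounding just outside [-1, 1]
    ang = float(np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))))
    return trans, ang

# Illustrative planned vs. achieved trajectory (hypothetical values, mm).
trans, ang = trajectory_errors([0, 0, 0], [0, 0, 1], [3, 0, 0], [0, 1, 10])
```

Averaging these per-screw values over all navigated K-wires yields summary figures of the form "3.4±1.6 mm" and "4.3°±2.3°".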
Affiliation(s)
- Fabio Müller
  - Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
- Simon Roner
  - Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
- Florentin Liebmann
  - Computer Assisted Research and Development Group, Balgrist University Hospital, University of Zurich, Lengghalde 5, 8008 Zurich, Switzerland
  - Laboratory for Orthopedic Biomechanics, ETH Zurich, Forchstrasse 328, 8008 Zurich, Switzerland
- José M Spirig
  - Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
- Philipp Fürnstahl
  - Computer Assisted Research and Development Group, Balgrist University Hospital, University of Zurich, Lengghalde 5, 8008 Zurich, Switzerland
- Mazda Farshad
  - Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
|
44
|
Vadalà G, De Salvatore S, Ambrosio L, Russo F, Papalia R, Denaro V. Robotic Spine Surgery and Augmented Reality Systems: A State of the Art. Neurospine 2020; 17:88-100. [PMID: 32252158 PMCID: PMC7136092 DOI: 10.14245/ns.2040060.030]
Abstract
Instrumented spine procedures have been performed for decades to treat a wide variety of spinal disorders. New technologies have been employed to obtain a high degree of precision, to minimize the risk of damage to neurovascular structures, and to diminish harmful exposure of patients and the operative team to ionizing radiation. Robotic spine surgery comprises three major categories: telesurgical robotic systems, robotic-assisted navigation (RAN), and virtual augmented reality (AR) systems, including AR and virtual reality. Telesurgical systems encompass devices that can be operated from a remote command station, allowing surgery to be performed via instruments manipulated by the robot. RAN technologies, on the other hand, are characterized by robotic guidance of surgeon-operated instruments based on real-time imaging. Virtual AR systems show images directly on dedicated visors and screens, allowing the surgeon to visualize information about the patient and the procedure (i.e., anatomical landmarks, screw direction and inclination, distance from neurological and vascular structures, etc.). The aim of this review is to describe the current state of the art of robotics and AR in spine surgery and the perspectives of these emerging technologies, which hold promise for future applications.
Affiliation(s)
- Gianluca Vadalà
- Department of Orthopaedic and Trauma Surgery, Campus Bio-Medico University of Rome, Rome, Italy
- Sergio De Salvatore
- Department of Orthopaedic and Trauma Surgery, Campus Bio-Medico University of Rome, Rome, Italy
- Luca Ambrosio
- Department of Orthopaedic and Trauma Surgery, Campus Bio-Medico University of Rome, Rome, Italy
- Fabrizio Russo
- Department of Orthopaedic and Trauma Surgery, Campus Bio-Medico University of Rome, Rome, Italy
- Rocco Papalia
- Department of Orthopaedic and Trauma Surgery, Campus Bio-Medico University of Rome, Rome, Italy
- Vincenzo Denaro
- Department of Orthopaedic and Trauma Surgery, Campus Bio-Medico University of Rome, Rome, Italy
45
Chen F, Cui X, Liu J, Han B, Zhang X, Zhang D, Liao H. Tissue Structure Updating for In Situ Augmented Reality Navigation Using Calibrated Ultrasound and Two-Level Surface Warping. IEEE Trans Biomed Eng 2020; 67:3211-3222. [PMID: 32175853 DOI: 10.1109/tbme.2020.2979535] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022]
Abstract
OBJECTIVE In minimally invasive surgery (MIS), in situ augmented reality (AR) navigation systems are usually implemented using a glasses-free 3D display to represent the preoperative tissue structure and can provide intuitive see-through guidance information. However, because the tissue changes intraoperatively, the preoperative tissue structure no longer corresponds exactly to reality, which limits the precision of in situ AR navigation. To solve this problem, we propose a method to update the tissue structure for in situ AR navigation in such a way as to reflect intraoperative tissue changes. METHODS The proposed tissue-structure update is based on calibrated ultrasound and two-level surface warping. First, particle filter-based calibration is performed to calibrate the ultrasound probe and obtain the intraoperative positions of anatomical points. Second, these intraoperative positions are input into the two-level surface warping method to update the preoperative tissue structure. Finally, a glasses-free real 3D display of the updated tissue structure is rendered and superimposed onto the patient by a translucent mirror for in situ AR navigation. RESULTS We validated the proposed method by simulating liver tissue intervention and achieved a tissue-updating accuracy of 92.86%. Furthermore, the targeting error of AR navigation based on the proposed method was evaluated in minimally invasive liver surgery, and the mean targeting error was 1.92 mm. CONCLUSION The results demonstrate that the proposed AR navigation method is effective. SIGNIFICANCE The proposed method can facilitate MIS, as it provides accurate 3D navigation.
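A mean targeting error such as the 1.92 mm reported above is conventionally computed as the average Euclidean distance between navigated target positions and their ground-truth locations. A minimal sketch (the coordinates are made up for illustration, not data from the study):

```python
import math

def mean_targeting_error(navigated, ground_truth):
    """Mean Euclidean distance (mm) between navigated target points
    and their ground-truth positions, the usual way an AR-navigation
    targeting error is summarized."""
    assert len(navigated) == len(ground_truth)
    dists = [
        math.dist(p, q)  # Euclidean distance between two 3D points
        for p, q in zip(navigated, ground_truth)
    ]
    return sum(dists) / len(dists)

# Hypothetical fiducial/target coordinates in mm (illustrative only).
nav = [(10.0, 20.0, 30.0), (11.0, 21.0, 29.0)]
gt  = [(10.0, 20.0, 32.0), (11.0, 23.0, 29.0)]
print(round(mean_targeting_error(nav, gt), 2))  # -> 2.0
```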
46
Jud L, Fotouhi J, Andronic O, Aichmair A, Osgood G, Navab N, Farshad M. Applicability of augmented reality in orthopedic surgery - A systematic review. BMC Musculoskelet Disord 2020; 21:103. [PMID: 32061248 PMCID: PMC7023780 DOI: 10.1186/s12891-020-3110-2] [Citation(s) in RCA: 64] [Impact Index Per Article: 16.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/12/2019] [Accepted: 02/03/2020] [Indexed: 02/07/2023] Open
Abstract
BACKGROUND Computer-assisted solutions are continuously changing surgical practice. One of the most disruptive technologies among computer-integrated surgical techniques is augmented reality (AR). While AR is increasingly used in several medical specialties, its potential benefit in orthopedic surgery is not yet clear. The purpose of this article is to provide a systematic review of the current state of knowledge and the applicability of AR in orthopedic surgery. METHODS A systematic search of three databases ("PubMed", "Cochrane Library", and "Web of Science") was performed. The review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, and the protocol was registered in the international prospective register of systematic reviews (PROSPERO). RESULTS Thirty-one studies and reports were included and classified into the following categories: instrument/implant placement, osteotomies, tumor surgery, trauma, and surgical training and education. Quality assessment could be performed for 18 studies. Among the clinical studies, there were six case series with an average score of 90% and one case report scoring 81% according to the Joanna Briggs Institute Critical Appraisal Checklist (JBI CAC). The 11 cadaveric studies scored 81% according to the QUACS scale (Quality Appraisal for Cadaveric Studies). CONCLUSION This manuscript provides 1) a summary of the current state of knowledge and research on AR in orthopedic surgery presented in the literature, and 2) a discussion by the authors of the key remarks required for seamless integration of AR into future surgical practice. TRIAL REGISTRATION PROSPERO registration number: CRD42019128569.
Affiliation(s)
- Lukas Jud
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zürich, Switzerland
- Javad Fotouhi
- Computer Aided Medical Procedure, Johns Hopkins University, 3400 N Charles Street, Baltimore, 21210 USA
- Octavian Andronic
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zürich, Switzerland
- Alexander Aichmair
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zürich, Switzerland
- Greg Osgood
- Johns Hopkins Hospital, Department of Orthopedics Surgery, 1800 Orleans Street, Baltimore, 21287 USA
- Nassir Navab
- Computer Aided Medical Procedure, Johns Hopkins University, 3400 N Charles Street, Baltimore, 21210 USA
- Computer Aided Medical Procedure, Technical University of Munich, Boltzmannstrasse 3, 85748 Munich, Germany
- Mazda Farshad
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zürich, Switzerland
47
Chen L, Zhang F, Zhan W, Gan M, Sun L. Optimization of virtual and real registration technology based on augmented reality in a surgical navigation system. Biomed Eng Online 2020; 19:1. [PMID: 31915014 PMCID: PMC6950982 DOI: 10.1186/s12938-019-0745-z] [Citation(s) in RCA: 33] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/30/2019] [Accepted: 12/30/2019] [Indexed: 12/19/2022] Open
Abstract
Background Traditional navigation interfaces are intended only for two-dimensional observation by doctors and thus do not display the full spatial information of the lesion area. Surgical navigation systems have become essential tools that enable doctors to perform complex operations accurately and safely. However, the image navigation interface is separated from the operating area, and the doctor needs to switch the field of vision between the screen and the patient's lesion area. In this paper, augmented reality (AR) technology was applied to spinal surgery to provide more intuitive information to surgeons. The accuracy of virtual and real registration was improved through research on AR technology. During the operation, the doctor could observe the AR image and the true shape of the internal spine through the skin. Methods To improve the accuracy of virtual and real registration, a registration technique based on an improved identification method and a robot-assisted method was proposed. The experimental method was optimized using the improved identification method, and X-ray images were used to verify the effectiveness of punctures performed by the robot. Results The average accuracy of virtual and real registration based on the general identification method was 9.73 ± 0.46 mm (range 8.90–10.23 mm), while that based on the improved identification method was 3.54 ± 0.13 mm (range 3.36–3.73 mm), an improvement of approximately 65%. The highest accuracy of virtual and real registration based on the robot-assisted method was 2.39 mm, an improvement of approximately 28.5% over the improved identification method. Conclusion The experimental results show that the two optimized methods are highly effective. The proposed AR navigation system has high accuracy and stability and may have value in future spinal surgeries.
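The relative improvements quoted in the abstract follow from the reported mean errors. A quick check (note the abstract's ~65% and ~28.5% figures are approximate and depend on which baseline value is used):

```python
def percent_improvement(baseline_mm, improved_mm):
    """Relative reduction in mean registration error, in percent."""
    return 100.0 * (baseline_mm - improved_mm) / baseline_mm

# General vs. improved identification method (means from the abstract).
print(round(percent_improvement(9.73, 3.54), 1))  # -> 63.6 (quoted as ~65%)

# Improved identification method vs. best robot-assisted result.
print(round(percent_improvement(3.54, 2.39), 1))  # -> 32.5 (quoted as ~28.5%)
```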
Affiliation(s)
- Long Chen
- School of Mechanical and Electrical Engineering, Soochow University, Suzhou, 215006, China
- Fengfeng Zhang
- School of Mechanical and Electrical Engineering, Soochow University, Suzhou, 215006, China
- Collaborative Innovation Center of Suzhou Nano Science and Technology, Soochow University, Suzhou, 215123, China
- Wei Zhan
- Department of Radiation Oncology, The First Affiliated Hospital of Soochow University, Suzhou, China
- Minfeng Gan
- Department of Radiation Oncology, The First Affiliated Hospital of Soochow University, Suzhou, China
- Lining Sun
- School of Mechanical and Electrical Engineering, Soochow University, Suzhou, 215006, China
- Collaborative Innovation Center of Suzhou Nano Science and Technology, Soochow University, Suzhou, 215123, China
48
Lohre R, Wang JC, Lewandrowski KU, Goel DP. Virtual reality in spinal endoscopy: a paradigm shift in education to support spine surgeons. JOURNAL OF SPINE SURGERY 2020; 6:S208-S223. [PMID: 32195429 DOI: 10.21037/jss.2019.11.16] [Citation(s) in RCA: 22] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Key Words] [Subscribe] [Scholar Register] [Indexed: 12/17/2022]
Abstract
Background Minimally invasive spine surgery (MISS) and endoscopic spine surgery have continually evolving indications in the cervical, thoracic, and lumbar spine. Endoscopic spine surgery entails treatment of disc disease, stenosis, spondylolisthesis, radiculopathy, and deformity. MISS involves complex motor skills in regions of variable anatomy. Simulator use has been proposed to aid in training and skill retention, preoperative planning, and intraoperative use. Methods A systematic review of five databases was performed for publications pertaining to the use of virtual (VR), augmented (AR), and mixed (MR) reality in MISS and spinal endoscopic surgery. Qualitative data analysis was undertaken with a focus on study design, quality, and reported outcomes. Study quality was assessed using the Medical Education Research Quality Instrument (MERSQI) score, and level of evidence (LoE) was graded using a modified Oxford Centre for Evidence-Based Medicine (OCEBM) level for simulation in medicine. Results Thirty-eight studies were retained for data collection, comprising intervention-control, clinical application, and pilot or cross-sectional designs. The identified articles illustrated the use of VR, AR, and MR across all study designs. Procedures included pedicle cannulation and screw insertion, vertebroplasty, kyphoplasty, percutaneous transforaminal endoscopic discectomy (PTED), lumbar puncture and facet injection, transvertebral anterior cervical foraminotomy (TVACF), and posterior cervical laminoforaminotomy. The overall MERSQI score was low-to-medium [M = 9.71 (SD = 2.60); range, 4.5–13.5], and the LoE was predominantly low given the number of purely descriptive articles and low-quality randomized studies. Conclusions The current scope of VR, AR, and MR surgical simulators in MISS and spinal endoscopic surgery was described. Studies demonstrate improvement in technical skill and patient outcomes at short-term follow-up. Despite this, overall study quality and levels of evidence remain low. Cohesive study design and reporting, with a focus on transfer validity in training scenarios and patient-derived outcome measures in clinical studies, are required to further advance the field.
Affiliation(s)
- Ryan Lohre
- Department of Orthopaedics, University of British Columbia, Vancouver, BC, Canada
- Jeffrey C Wang
- USC Spine Center, Keck School of Medicine at University of Southern California, Los Angeles, USA
- Kai-Uwe Lewandrowski
- Center for Advanced Spine Care of Southern Arizona and Surgical Institute of Tucson, Tucson, AZ, USA
- Department of Neurosurgery, UNIRIO, Rio de Janeiro, Brazil
- Danny P Goel
- Department of Orthopaedics, University of British Columbia, Vancouver, BC, Canada
49
Pérez-Pachón L, Poyade M, Lowe T, Gröning F. Image Overlay Surgery Based on Augmented Reality: A Systematic Review. ADVANCES IN EXPERIMENTAL MEDICINE AND BIOLOGY 2020; 1260:175-195. [PMID: 33211313 DOI: 10.1007/978-3-030-47483-6_10] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/18/2022]
Abstract
Augmented Reality (AR) applied to surgical guidance is gaining relevance in clinical practice. AR-based image overlay surgery (i.e. the accurate overlay of patient-specific virtual images onto the body surface) helps surgeons transfer image data produced during surgical planning (e.g. the correct resection margins of tissue flaps) to the operating room, thus increasing accuracy and reducing surgery times. We systematically reviewed 76 studies published between 2004 and August 2018 to explore which existing tracking and registration methods and technologies allow healthcare professionals and researchers to develop and implement these systems in-house. Most studies used non-invasive markers to automatically track a patient's position, as well as customised algorithms, tracking libraries, or software development kits (SDKs) to compute the registration between patient-specific 3D models and the patient's body surface. Few studies combined holographic headsets, SDKs, and user-friendly game engines, or described portable and wearable systems that combine tracking, registration, hands-free navigation, and direct visibility of the surgical site. Most accuracy tests included a small number of subjects and/or measurements and did not normally explore how these systems affect surgery times and success rates. We highlight the need for more procedure-specific experiments with a sufficient number of subjects and measurements, including data about surgical outcomes and patients' recovery. Validation of systems combining holographic headsets, SDKs, and game engines is especially interesting, as this approach facilitates the easy development of mobile AR applications and thus the implementation of AR-based image overlay surgery in clinical practice.
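The marker-based registration step surveyed here (aligning a patient-specific 3D model with tracked fiducials on the body surface) commonly reduces to least-squares rigid alignment of corresponding point sets. A minimal Kabsch-style sketch in Python/NumPy, illustrative only and not the implementation of any reviewed system:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid registration (Kabsch, no scaling): find the
    rotation R and translation t minimizing ||R @ src_i + t - dst_i||^2
    over corresponding (N, 3) point sets."""
    src_c = src - src.mean(axis=0)         # center both point clouds
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                    # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Hypothetical fiducial coordinates (not from any cited system): rotate
# a model point set by 30 degrees and translate it, then recover the pose.
model = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0,              0,             1]])
patient = model @ R_true.T + np.array([5.0, -2.0, 1.0])
R, t = rigid_register(model, patient)
err = np.linalg.norm(model @ R.T + t - patient, axis=1).max()
print(err < 1e-9)  # -> True
```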
Affiliation(s)
- Laura Pérez-Pachón
- School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK
- Matthieu Poyade
- School of Simulation and Visualisation, Glasgow School of Art, Glasgow, UK
- Terry Lowe
- School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK
- Head and Neck Oncology Unit, Aberdeen Royal Infirmary (NHS Grampian), Aberdeen, UK
- Flora Gröning
- School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK
50
Carl B, Bopp M, Saß B, Pojskic M, Gjorgjevski M, Voellger B, Nimsky C. Reliable navigation registration in cranial and spine surgery based on intraoperative computed tomography. Neurosurg Focus 2019; 47:E11. [DOI: 10.3171/2019.8.focus19621] [Citation(s) in RCA: 27] [Impact Index Per Article: 5.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/28/2019] [Accepted: 08/26/2019] [Indexed: 11/06/2022]
Abstract
OBJECTIVE Low registration errors are an important prerequisite for reliable navigation, independent of its use in cranial or spinal surgery. Regardless of whether navigation is used for trajectory alignment in biopsy or implant procedures, or for sophisticated augmented reality applications, all depend on a correct registration of patient space and image space. In contrast to fiducial-, landmark-, or surface matching-based registration, the application of intraoperative imaging allows user-independent automatic patient registration, which is less error prone. The authors' aim in this paper was to give an overview of their experience using intraoperative CT (iCT) scanning for automatic registration, with a focus on registration accuracy and radiation exposure. METHODS A total of 645 patients underwent iCT scanning with a 32-slice movable CT scanner in combination with navigation for trajectory alignment in biopsy and implantation procedures (n = 222) and for augmented reality (n = 437) in cranial and spine procedures (347 craniotomies and 42 transsphenoidal, 56 frameless stereotactic, 59 frame-based stereotactic, and 141 spinal procedures). The target registration error was measured using skin fiducials that were not part of the registration procedure. The effective dose was calculated by multiplying the dose-length product by conversion factors. RESULTS Among all 1281 iCT scans obtained, 1172 were used for automatic patient registration (645 initial registration scans and 527 repeat iCT scans). The overall mean target registration error was 0.86 ± 0.38 mm (± SD) (craniotomy, 0.88 ± 0.39 mm; transsphenoidal, 0.92 ± 0.39 mm; frameless, 0.74 ± 0.39 mm; frame-based, 0.84 ± 0.34 mm; and spinal, 0.80 ± 0.28 mm). Compared with standard diagnostic scans, a distinct reduction of the effective dose could be achieved using low-dose protocols for the initial registration scan, with mean effective doses of 0.06 ± 0.04 mSv for cranial, 0.50 ± 0.09 mSv for cervical, 4.12 ± 2.13 mSv for thoracic, and 3.37 ± 0.93 mSv for lumbar scans, without impeding registration accuracy. CONCLUSIONS Reliable automatic patient registration can be achieved using iCT scanning. Low-dose protocols ensured a low radiation exposure for the patient, and low-dose scanning had no negative effect on navigation accuracy.
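The effective-dose calculation described in the METHODS (dose-length product multiplied by a region-specific conversion factor) can be sketched as follows; the conversion factors below are illustrative assumptions for this sketch, not the values used in the study:

```python
# Conversion factors k in mSv per (mGy*cm), keyed by body region.
# These are illustrative assumptions only; published tables should be
# consulted for actual dosimetry.
K_FACTORS = {
    "head": 0.0021,
    "cervical": 0.0059,
    "thoracic": 0.014,
    "lumbar": 0.015,
}

def effective_dose_msv(dlp_mgy_cm, region):
    """Effective dose = dose-length product x region-specific
    conversion factor, as described in the abstract's METHODS."""
    return dlp_mgy_cm * K_FACTORS[region]

# A hypothetical low-dose cranial scan with DLP = 30 mGy*cm:
print(round(effective_dose_msv(30.0, "head"), 3))  # -> 0.063
```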
Affiliation(s)
- Barbara Carl
- 1Department of Neurosurgery, University of Marburg; and
| | - Miriam Bopp
- 1Department of Neurosurgery, University of Marburg; and
- 2Marburg Center for Mind, Brain and Behavior (MCMBB), Marburg, Germany
| | - Benjamin Saß
- 1Department of Neurosurgery, University of Marburg; and
| | - Mirza Pojskic
- 1Department of Neurosurgery, University of Marburg; and
| | | | | | - Christopher Nimsky
- 1Department of Neurosurgery, University of Marburg; and
- 2Marburg Center for Mind, Brain and Behavior (MCMBB), Marburg, Germany
| |
Collapse
|