1. Johnson PB, Bradley J, Lampotang S, Jackson A, Lizdas D, Johnson W, Brooks E, Vega RBM, Mendenhall N. First-in-human trial using mixed-reality visualization for patient setup during breast or chest wall radiotherapy. Radiat Oncol 2024;19:163. PMID: 39558366; PMCID: PMC11574990; DOI: 10.1186/s13014-024-02552-0.
Abstract
BACKGROUND: The purpose of this study was to assess the feasibility of mixed-reality (MixR) visualization for patient setup in breast and chest wall radiotherapy (RT) through a first-in-human clinical trial comparing MixR with 3-point alignment.

METHODS: IRB approval was granted for a study incorporating MixR during the setup process for patients undergoing proton (n = 10) or photon (n = 8) RT to the breast or chest wall. For each patient, MixR was used for five fractions and compared against another five fractions using 3-point alignment. During MixR fractions, the patient was aligned by at least one therapist wearing a HoloLens 2 device, who guided the process by simultaneously and directly viewing the patient and a hologram of the patient's surface derived from the simulation CT scan. Alignment accuracy was quantified with cone-beam CT (CBCT) for photon treatments and CBCT plus kV/kV imaging for proton treatments. Registration time was tracked throughout the setup process, as was the amount of image guidance (IGRT) used for final alignment.

RESULTS: In the proton cohort, the mean 3D shift was 0.96 cm with 3-point alignment and 1.18 cm with MixR; an equivalence test indicated that the difference in registration accuracy between the two techniques was less than 0.5 cm. In the photon cohort, the mean 3D shift was 1.18 cm with 3-point alignment and 1.00 cm with MixR; an equivalence test indicated that the difference in registration accuracy was less than 0.3 cm. Minor differences were seen in registration time and the amount of IGRT utilization.

CONCLUSIONS: Patient setup with MixR for breast cancer RT is feasible at the level of accuracy and efficiency provided by 3-point alignment. Further developments in marker tracking, feedback, and a better understanding of the perceptual challenges of MixR are needed to reach the accuracy of modern surface-guided radiotherapy (SGRT) systems.
TRIAL REGISTRATION: ClinicalTrials.gov, UFHPTI 2015-BR05: Improving Breast Radiotherapy Setup and Delivery Using Mixed-Reality Visualization, NCT05178927.
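The equivalence claims above rest on two one-sided tests against a pre-set margin (0.5 cm for protons, 0.3 cm for photons). A minimal sketch of how such a paired TOST comparison of per-fraction 3D shifts might look; the data, function name, and normal approximation are illustrative assumptions, not the paper's published procedure or numbers:

```python
import math
import statistics

def tost_paired(a, b, margin, alpha=0.05):
    """Equivalence of paired means via two one-sided tests (TOST).

    Declares |mean difference| < margin when both one-sided tests reject
    at level alpha. Normal approximation for brevity; a t reference
    should be used for small samples.
    """
    def norm_cdf(z):
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = statistics.fmean(d)
    se = statistics.stdev(d) / math.sqrt(n)
    p_low = 1.0 - norm_cdf((mean + margin) / se)   # H0: true diff <= -margin
    p_high = norm_cdf((mean - margin) / se)        # H0: true diff >= +margin
    p = max(p_low, p_high)
    return p, p < alpha

# Hypothetical paired 3D shifts (cm) for the same patients under two setups.
shifts_3pt = [1.01, 1.00, 0.99, 1.02, 0.98] * 8
shifts_mixr = [1.00, 1.02, 0.98, 1.01, 0.99] * 8
p_value, equivalent = tost_paired(shifts_mixr, shifts_3pt, margin=0.5)
```

With tightly matched hypothetical shifts and a 0.5 cm margin, both one-sided tests reject and equivalence is declared.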
Affiliation(s)
- Perry B Johnson: University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA; University of Florida College of Medicine, Gainesville, FL, USA
- Julie Bradley: University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA; University of Florida College of Medicine, Gainesville, FL, USA
- Samsun Lampotang: University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA
- Amanda Jackson: University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA
- David Lizdas: University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA
- William Johnson: University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA
- Eric Brooks: University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA; University of Florida College of Medicine, Gainesville, FL, USA
- Raymond B Mailhot Vega: University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA; University of Florida College of Medicine, Gainesville, FL, USA
- Nancy Mendenhall: University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA; University of Florida College of Medicine, Gainesville, FL, USA
2. Xu Y, Chen X, Wang L, You M, Deng Q, Liu D, Lin Y, Liu W, Li PC, Li J. Efficacy of a computer vision-based system for exercise management in patients with knee osteoarthritis: a study protocol for a randomised controlled pilot trial. BMJ Open 2024;14:e077455. PMID: 39500602; PMCID: PMC11552600; DOI: 10.1136/bmjopen-2023-077455.
Abstract
INTRODUCTION: This study aims to evaluate the efficacy of a computer vision system in guiding exercise management for patients with knee osteoarthritis (OA) by comparing functional improvement between a tele-rehabilitation and an outpatient intervention programme.

METHODS AND ANALYSIS: This is a prospective, single-blind, randomised controlled trial of 60 patients with knee OA who will be randomly assigned to exercise therapy (n=30) or control (n=30). Both groups will receive treatment twice per week for 12 weeks. The primary outcome will be assessed using the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC). The Knee Injury and Osteoarthritis Outcome Score will also be assessed, along with the visual analogue scale, quality of life score, and physical fitness score. All observations will be collected at baseline and at weeks 4, 8, and 12 during the intervention period, as well as at weeks 4, 8, 12, and 24 during follow-up visits after the end of the intervention.

ETHICS AND DISSEMINATION: This evaluator-blinded, prospective, randomised controlled study was approved by the Biomedical Ethics Review Committee of West China Hospital of Sichuan University.

TRIAL REGISTRATION NUMBER: ChiCTR2300070319.
Affiliation(s)
- Yang Xu: Sports Medicine Center, Sichuan University West China Hospital, Chengdu, Sichuan, China; Department of Orthopedics and Orthopedic Research Institute, West China Hospital of Sichuan University, Chengdu, Sichuan, China
- Xi Chen: Sports Medicine Center, Sichuan University West China Hospital, Chengdu, Sichuan, China; Department of Orthopedics and Orthopedic Research Institute, West China Hospital of Sichuan University, Chengdu, Sichuan, China
- Li Wang: Sports Medicine Center, Sichuan University West China Hospital, Chengdu, Sichuan, China; Department of Orthopedics and Orthopedic Research Institute, West China Hospital of Sichuan University, Chengdu, Sichuan, China
- Mingke You: Sports Medicine Center, Sichuan University West China Hospital, Chengdu, Sichuan, China; Department of Orthopedics and Orthopedic Research Institute, West China Hospital of Sichuan University, Chengdu, Sichuan, China
- Qian Deng: Sichuan University West China Hospital, Chengdu, Sichuan, China
- Di Liu: Jiakang Zhongzhi Technology Company, Chengdu, China
- Ye Lin: Department of Medicine, University of Chicago, Chicago, Illinois, USA
- Weizhi Liu: Sichuan University West China Hospital, Chengdu, Sichuan, China
- Peng-Cheng Li: Sports Medicine Center, Sichuan University West China Hospital, Chengdu, Sichuan, China; Department of Orthopedics and Orthopedic Research Institute, West China Hospital of Sichuan University, Chengdu, Sichuan, China
- Jian Li: Sports Medicine Center, Sichuan University West China Hospital, Chengdu, Sichuan, China; Department of Orthopedics and Orthopedic Research Institute, West China Hospital of Sichuan University, Chengdu, Sichuan, China
3. Dean MC, Oeding JF, Diniz P, Seil R, Samuelsson K. Leveraging digital twins for improved orthopaedic evaluation and treatment. J Exp Orthop 2024;11:e70084. PMID: 39530111; PMCID: PMC11551062; DOI: 10.1002/jeo2.70084.
Abstract
Purpose: The purpose of this article is to explore the potential of digital twin technologies in orthopaedics and to evaluate how their integration with artificial intelligence (AI) and deep learning (DL) can improve orthopaedic evaluation and treatment. This review addresses key applications of digital twins, including surgical planning, patient-specific outcome prediction, augmented reality-assisted surgery, and simulation-based surgical training.

Methods: Existing studies on digital twins across domains, including engineering, biomedicine, and orthopaedics, are reviewed, along with advancements in AI and DL relevant to digital twins. We focused on identifying key benefits, challenges, and future directions for the implementation of digital twins in orthopaedic practice.

Results: The review highlights that digital twins offer significant potential to revolutionise orthopaedic care by enabling precise surgical planning, real-time outcome prediction, and enhanced training. Digital twins can model patient-specific anatomy using advanced imaging techniques and can be updated dynamically with real-time data, providing valuable insights during surgery and postoperative care. However, challenges such as the need for large-scale data sets, technological limitations, and integration issues must be addressed to fully realise these benefits.

Conclusion: Digital twins represent a promising frontier in orthopaedic research and practice, with the potential to improve patient outcomes and enhance surgical precision. To enable widespread adoption, future research must focus on overcoming current challenges and further refining the integration of digital twins with AI and DL technologies.

Level of Evidence: Level V.
Affiliation(s)
- Michael C. Dean: Mayo Clinic Alix School of Medicine, Rochester, Minnesota, USA
- Jacob F. Oeding: Department of Orthopaedics, Institute of Clinical Sciences, The Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Pedro Diniz: Department of Orthopaedic Surgery, Centre Hospitalier de Luxembourg - Clinique d'Eich, Luxembourg, Luxembourg
- Romain Seil: Department of Orthopaedic Surgery, Centre Hospitalier de Luxembourg - Clinique d'Eich, Luxembourg, Luxembourg
- Kristian Samuelsson: Department of Orthopaedics, Institute of Clinical Sciences, The Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
4. Huddleston HP, Credille K, Alzein MM, Cregar WM, Hevesi M, Inoue N, Yanke AB. iPhone-Based Cartilage Topography Scanning Yields Similar Results to Computed Tomography Scanning. Arthrosc Sports Med Rehabil 2024;6:100936. PMID: 39421352; PMCID: PMC11480792; DOI: 10.1016/j.asmr.2024.100936.
Abstract
Purpose: To investigate the feasibility and accuracy of 3-dimensional (3D) iPhone scans using commercially available applications, compared with computed tomography (CT), for mapping chondral surface topography of the knee.

Methods: Ten cadaveric dysplastic trochleae, 16 patellae, and 24 distal femoral condyles (DFCs) underwent CT scans and 3D scans using 3 separate optical scanning applications on an iPhone X. The 3D surface models were compared by measuring the surface-to-surface least distance distribution of overlapped models using a validated 3D-3D registration volume merge method. The absolute least mean square distances of the iPhone-generated models from each scanning application were calculated relative to the CT models using a point-to-surface distance algorithm, allowing regional "inside/outside" measurement of the absolute distance between models.

Results: Only 1 of the 3 scanning applications created models usable for quantitative analysis. Overall, the median absolute least mean square distance between the usable models and the CT-generated models was 0.18 mm. The trochlea group had a significantly lower median absolute least mean square distance than the DFC group (0.14 mm [interquartile range, 0.13-0.17] vs 0.19 mm [0.17-0.25], P = .002). iPhone models were smaller than CT models (negative signed distances) for all trochleae, 83% of DFCs, and 69% of patellae.

Conclusions: In this study, we found minimal differences between a 3D iPhone scanning application and conventional CT scanning when analyzing surface topography.

Clinical Relevance: Emerging 3D iPhone scanning technology can create accurate, inexpensive, real-time 3D models of the intended target. Surface topography evaluation may be useful in graft selection during surgical procedures such as osteochondral allograft transplantation.
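The comparison above reduces to computing least distances between overlapped 3D surface models. A simplified sketch of the idea, treating each surface as a dense point cloud and using a brute-force nearest-neighbour search; the study itself used a validated 3D-3D registration volume merge method with a point-to-surface algorithm, and the grids below are purely hypothetical:

```python
import numpy as np

def least_distances(points, surface):
    """Absolute least distance from each point to a surface sampled as a
    dense point cloud (brute force; a KD-tree or a true point-to-mesh
    distance would be used at realistic scan resolutions)."""
    diff = points[:, None, :] - surface[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1)).min(axis=1)

# Hypothetical grids: a "scan" offset 0.2 mm along z from its CT reference.
ref = np.array([[x, y, 0.0] for x in range(5) for y in range(5)], dtype=float)
scan = ref + np.array([0.0, 0.0, 0.2])
median_least_distance = float(np.median(least_distances(scan, ref)))
```

For the offset grid the nearest reference point lies directly below each scan point, so every least distance, and hence the median, is the 0.2 mm offset.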
Affiliation(s)
- Mohamad M. Alzein: Department of Orthopedics, Rush University Medical Center, Chicago, Illinois, USA
- Nozomu Inoue: Hospital for Special Surgery, New York, New York, USA
- Adam B. Yanke: Hospital for Special Surgery, New York, New York, USA
5. Necker FN, Cholok DJ, Fischer MJ, Shaheen MS, Gifford K, Januszyk M, Leuze CW, Scholz M, Daniel BL, Momeni A. HoloDIEP: Faster and More Accurate Intraoperative DIEA Perforator Mapping Using a Novel Mixed Reality Tool. J Reconstr Microsurg 2024. PMID: 39038461; DOI: 10.1055/s-0044-1788548.
Abstract
BACKGROUND: Microsurgical breast reconstruction using abdominal tissue is a complex procedure, in part because of variable vascular/perforator anatomy. Preoperative computed tomography angiography (CTA) has mitigated this challenge to some degree, yet certain challenges remain. The ability to map perforators with Mixed Reality has been demonstrated in case studies, but its intraoperative accuracy has not been studied. Here, we compare the accuracy of "HoloDIEP" in identifying perforator location (vs. Doppler ultrasound) using holographic 3D models derived from preoperative CTA.

METHODS: Using a custom application on HoloLens, the deep inferior epigastric artery vascular tree was traced in 15 patients who underwent microsurgical breast reconstruction. Perforator markings were compared against the 3D model in a coordinate system centered on the umbilicus. Holographic- and Doppler-identified markings were compared against the 3D model using a perspective-corrected photo technique, and the duration of perforator mapping was measured for each technique.

RESULTS: Vascular points in HoloDIEP skin markings were -0.97 ± 6.2 mm (perforators: -0.62 ± 6.13 mm) away from the 3D-model ground truth in radial length from the umbilicus, at a true distance of 10.81 ± 6.14 mm (perforators: 11.40 ± 6.15 mm). The absolute difference in radial distance was twice as high for Doppler markings as for Holo-markings (9.71 ± 6.16 mm and 4.02 ± 3.20 mm, respectively). In only half of all cases (7/14) were more than 50% of the Doppler-identified points reasonably close (<30 mm) to the 3D-model ground truth. HoloDIEP was twice as fast as Doppler ultrasound (76.9 s vs. 150.4 s per abdomen).

CONCLUSION: HoloDIEP allows for faster and more accurate intraoperative perforator mapping than Doppler ultrasound.
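The accuracy metric above is the signed difference in radial distance from the umbilicus between a marked point and its ground-truth counterpart. A minimal sketch under that definition; the 2D skin-plane coordinates, point values, and function name are hypothetical, not the paper's data or code:

```python
import numpy as np

def radial_errors(marks, truth, umbilicus):
    """Signed difference in radial distance from the umbilicus between
    skin markings and model ground truth (negative: marked too close)."""
    r_marked = np.linalg.norm(marks - umbilicus, axis=1)
    r_true = np.linalg.norm(truth - umbilicus, axis=1)
    return r_marked - r_true

# Hypothetical points (mm) in a skin-plane system centred on the umbilicus.
umb = np.array([0.0, 0.0])
truth = np.array([[10.0, 0.0], [0.0, 20.0]])   # model-derived perforators
marks = np.array([[12.0, 0.0], [0.0, 18.0]])   # intraoperative markings
errors = radial_errors(marks, truth, umb)
```

Here the first marking overshoots its true radial distance by 2 mm and the second undershoots by 2 mm, giving signed errors of +2 and -2 mm.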
Affiliation(s)
- Fabian N Necker: Department of Radiology, Stanford IMMERS (Incubator for Medical Mixed and Extended Reality at Stanford), Stanford University School of Medicine, Palo Alto, California; Digital Anatomy Lab, Faculty of Medicine, Institute of Functional and Clinical Anatomy, Friedrich-Alexander Universität Erlangen-Nürnberg (FAU), Erlangen, Germany; Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
- David J Cholok: Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
- Marc J Fischer: Department of Radiology, Stanford IMMERS, Stanford University School of Medicine, Palo Alto, California
- Mohammed S Shaheen: Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
- Kyle Gifford: Department of Radiology, 3D and Quantitative Imaging, Stanford University School of Medicine, Stanford, California
- Michael Januszyk: Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
- Christoph W Leuze: Department of Radiology, Stanford IMMERS, Stanford University School of Medicine, Palo Alto, California
- Michael Scholz: Digital Anatomy Lab, Faculty of Medicine, Institute of Functional and Clinical Anatomy, Friedrich-Alexander Universität Erlangen-Nürnberg (FAU), Erlangen, Germany
- Bruce L Daniel: Department of Radiology, Stanford IMMERS, Stanford University School of Medicine, Palo Alto, California
- Arash Momeni: Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
6. Polt M, Viehöfer AF, Casari FA, Imhoff FB, Wirth SH, Zimmermann SM. Conventional vs Augmented Reality-Guided Lateral Calcaneal Lengthening Simulated in a Foot Bone Model. Foot Ankle Int 2024;45:773-783. PMID: 38501722; PMCID: PMC11290017; DOI: 10.1177/10711007241237532.
Abstract
BACKGROUND: Acquired adult flatfoot deformity (AAFD) results in loss of the medial longitudinal arch of the foot and dysfunction of the posteromedial soft tissues. Hintermann osteotomy (H-O) is often used to treat stage II AAFD. The procedure is challenging because of variations in the subtalar facets and limited intraoperative visibility. We aimed to assess the impact of augmented reality (AR) guidance on surgical accuracy and the facet violation rate.

METHODS: Sixty AR-guided and 60 conventional osteotomies were performed on foot bone models. For the AR osteotomies, the ideal osteotomy plane was uploaded to a Microsoft HoloLens 1 headset and the cut was carried out in strict accordance with the superimposed holographic plane. The conventional osteotomies were performed relying solely on the anatomy of the calcaneal lateral column. The rate and severity of facet joint violation were measured, as well as the accuracy of entry and exit points. The results were compared between AR-guided and conventional osteotomies, and between experienced and inexperienced surgeons.

RESULTS: Experienced surgeons showed significantly greater accuracy for the osteotomy entry point using AR, with a mean deviation of 1.6 ± 0.9 mm (95% CI 1.26, 1.93) compared with 2.3 ± 1.3 mm (95% CI 1.87, 2.79) for the conventional method (P = .035). Inexperienced surgeons also showed improved accuracy, although not statistically significant (P = .064), with a mean deviation of 2.0 ± 1.5 mm (95% CI 1.47, 2.55) using AR compared with 2.7 ± 1.6 mm (95% CI 2.18, 3.32) for the conventional method. AR helped the experienced surgeons avoid full violation of the posterior facet (P = .011). Inexperienced surgeons had a higher rate of middle and posterior facet injury with both methods (P = .005 and .021).

CONCLUSION: AR guidance during H-O was associated with improved accuracy for experienced surgeons, demonstrated by better accuracy of the osteotomy entry point. More crucially, AR guidance prevented full violation of the posterior facet in the experienced group. Further research is needed to address limitations and test this technology on cadaver feet. Ultimately, the use of AR in surgery has the potential to improve patient and surgeon safety while minimizing radiation exposure.

CLINICAL RELEVANCE: Subtalar facet injury during lateral column lengthening osteotomy is a real problem in clinical orthopaedic practice. Because of limited intraoperative visibility and variable anatomy, this issue is hard to resolve with conventional means. This study suggests the potential of augmented reality to improve osteotomy accuracy.
Affiliation(s)
- Maksym Polt: Department of Orthopaedics, Balgrist University Hospital, University of Zürich, Zürich, Switzerland
- Arnd F. Viehöfer: Department of Orthopaedics, Balgrist University Hospital, University of Zürich, Zürich, Switzerland
- Fabio A. Casari: Department of Orthopaedics, Balgrist University Hospital, University of Zürich, Zürich, Switzerland
- Florian B. Imhoff: Department of Orthopaedics, Balgrist University Hospital, University of Zürich, Zürich, Switzerland
- Stephan H. Wirth: Department of Orthopaedics, Balgrist University Hospital, University of Zürich, Zürich, Switzerland
- Stefan M. Zimmermann: Department of Orthopaedics, Balgrist University Hospital, University of Zürich, Zürich, Switzerland
7. Martin-Gomez A, Li H, Song T, Yang S, Wang G, Ding H, Navab N, Zhao Z, Armand M. STTAR: Surgical Tool Tracking Using Off-the-Shelf Augmented Reality Head-Mounted Displays. IEEE Trans Vis Comput Graph 2024;30:3578-3593. PMID: 37021885; PMCID: PMC10959446; DOI: 10.1109/tvcg.2023.3238309.
Abstract
The use of Augmented Reality (AR) for navigation has been shown to be beneficial in assisting physicians during surgical procedures. These applications commonly require knowing the pose of surgical tools and patients to provide the visual information surgeons use while performing the task. Existing medical-grade tracking systems use infrared cameras placed inside the Operating Room (OR) to identify retro-reflective markers attached to objects of interest and compute their pose. Some commercially available AR Head-Mounted Displays (HMDs) use similar cameras for self-localization, hand tracking, and estimating objects' depth. This work presents a framework that uses the built-in cameras of AR HMDs to enable accurate tracking of retro-reflective markers without integrating any additional electronics into the HMD. The proposed framework can track multiple tools simultaneously without prior knowledge of their geometry and requires only a local network between the headset and a workstation. Our results show that the markers can be detected and tracked with an accuracy of 0.09 ± 0.06 mm in lateral translation, 0.42 ± 0.32 mm in longitudinal translation, and 0.80 ± 0.39° for rotations around the vertical axis. Furthermore, to showcase the relevance of the proposed framework, we evaluated the system's performance in the context of surgical procedures, in a use case designed to replicate k-wire insertions in orthopedic procedures. For this evaluation, seven surgeons were provided with visual navigation and asked to perform 24 injections using the proposed framework; a second study with ten participants investigated the framework's capabilities in more general scenarios. Results from these studies showed accuracy comparable to that reported in the literature for AR-based navigation procedures.
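Recovering a tool's pose from detected retro-reflective marker positions is, at its core, a least-squares rigid registration between an observed point set and a reference constellation. A generic sketch of the standard Kabsch/SVD solution to that subproblem, not the paper's specific pipeline (which notably tracks markers without prior knowledge of their geometry); the marker coordinates and motion below are invented for illustration:

```python
import numpy as np

def rigid_pose(src, dst):
    """Least-squares rigid transform (Kabsch/SVD): find R, t minimizing
    sum ||R @ src_i + t - dst_i||^2, the core of marker-based pose
    estimation from matched 3D point sets."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps R a proper rotation (det = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t

# Hypothetical 4-marker constellation (mm) observed after a known motion.
markers = np.array([[0.0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 50]])
R_true = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])   # 90 deg about z
t_true = np.array([10.0, 20, 30])
observed = markers @ R_true.T + t_true
R_est, t_est = rigid_pose(markers, observed)
```

Given noise-free correspondences the estimator recovers the applied rotation and translation exactly, up to floating-point precision.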
Affiliation(s)
- Alejandro Martin-Gomez: Laboratory for Computational Sensing and Robotics, Whiting School of Engineering, Johns Hopkins University, USA
- Haowei Li: Department of Biomedical Engineering, Tsinghua University, China
- Tianyu Song: Chair for Computer Aided Medical Procedures and Augmented Reality, Department of Informatics, Technical University of Munich, Germany
- Sheng Yang: Department of Biomedical Engineering, Tsinghua University, China
- Guangzhi Wang: Department of Biomedical Engineering, Tsinghua University, China
- Hui Ding: Department of Biomedical Engineering, Tsinghua University, China
- Nassir Navab: Laboratory for Computational Sensing and Robotics, Whiting School of Engineering, Johns Hopkins University, USA; Chair for Computer Aided Medical Procedures and Augmented Reality, Department of Informatics, Technical University of Munich, Germany
- Zhe Zhao: Department of Orthopaedics, Beijing Tsinghua Changgung Hospital, School of Clinical Medicine, Tsinghua University, China
- Mehran Armand: Laboratory for Computational Sensing and Robotics, Whiting School of Engineering, Johns Hopkins University, USA; Department of Orthopaedic Surgery, Johns Hopkins University School of Medicine, USA
8. Welch N, Montgomery BK, Ross K, Mota F, Mo M, Grigoriou E, Tarchala M, Roaten J, Miller P, Hedequist D, Birch CM. Using Immersive Virtual Reality to Classify Pediatric Thoracolumbar Spine Injuries. Cureus 2024;16:e64851. PMID: 39156384; PMCID: PMC11330310; DOI: 10.7759/cureus.64851.
Abstract
Objective: This study aimed to assess the reliability and reproducibility of the AO Spine Thoracolumbar Injury Classification System when used with virtual reality (VR). We hypothesized that VR is a highly reliable and reproducible method for classifying traumatic spine injuries.

Methods: VR 3D models were created from CT scans of 26 pediatric patients with thoracolumbar spine injuries. Seven orthopedic trainees were trained on the VR platform and the AO Spine Thoracolumbar Injury Classification System. Classifications were summarized by primary class and subclass for two rater readings performed two weeks apart, with image order randomized. Intra-observer reproducibility was quantified by Fleiss's kappa (kF) for primary classifications and Krippendorff's alpha (aK) for subclassifications, with 95% confidence intervals (CIs) for each rater and across all raters. Inter-observer reliability was quantified by kF for primary classifications and aK for subclassifications, with 95% CIs across all raters for the first read, the second read, and all reads combined. Agreement was interpreted as follows: 0-0.2, slight; 0.2-0.4, fair; 0.4-0.6, moderate; 0.6-0.8, substantial; >0.8, almost perfect.

Results: A total of 364 classifications were submitted by seven raters. Intra-observer reproducibility ranged from moderate (kF=0.55) to almost perfect (kF=0.94) for primary classifications and from substantial (aK=0.68) to almost perfect (aK=0.91) for subclassifications. Reproducibility across all raters was substantial for the primary class (kF=0.71; 95% CI=0.61-0.82) and subclass (aK=0.79; 95% CI=0.69-0.86). For primary classifications, inter-observer reliability was substantial (kF=0.63; 95% CI=0.57-0.69) for the first read, moderate (kF=0.58; 95% CI=0.52-0.64) for the second read, and substantial (kF=0.61; 95% CI=0.56-0.65) for all reads. For subclassifications, inter-observer reliability was substantial for the first read (aK=0.74; 95% CI=0.58-0.83), the second read (aK=0.70; 95% CI=0.53-0.80), and all reads (aK=0.72; 95% CI=0.60-0.79).

Conclusions: Based on our findings, VR is a reliable and reproducible method for the classification of pediatric spine trauma, in addition to its potential as an educational tool for trainees. Further research is needed to evaluate its application to other spine conditions.
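Fleiss's kappa, the agreement statistic used for primary classifications, compares observed inter-rater agreement with the agreement expected by chance. A small sketch of the computation from a subjects-by-categories count matrix (counts here are hypothetical; in practice a library implementation such as statsmodels' would normally be used):

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss's kappa from an (n_subjects x n_categories) matrix of rating
    counts, assuming the same number of raters for every subject."""
    counts = np.asarray(counts, dtype=float)
    n_raters = counts.sum(axis=1)[0]
    p_j = counts.sum(axis=0) / counts.sum()      # category prevalence
    # Per-subject agreement: pairs of raters that agree / possible pairs.
    P_i = ((counts ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_bar, P_e = P_i.mean(), (p_j ** 2).sum()    # observed vs chance
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical counts: 3 injuries, 2 classes, 3 raters in full agreement.
kappa_perfect = fleiss_kappa([[3, 0], [0, 3], [3, 0]])
```

Full agreement yields kappa = 1; mixed ratings such as [[2, 1], [1, 2]] drop below zero because observed agreement falls under the chance level.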
Affiliation(s)
- Nicole Welch: Department of Orthopedic Surgery, Boston Children's Hospital, Boston, USA
- Blake K Montgomery: Department of Orthopedic Surgery, Boston Children's Hospital, Boston, USA
- Kirsten Ross: Department of Orthopedic Surgery, Boston Children's Hospital, Boston, USA
- Frank Mota: Department of Orthopedic Surgery, Boston Children's Hospital, Boston, USA
- Michelle Mo: Department of Orthopedic Surgery, Boston Children's Hospital, Boston, USA
- Magdalena Tarchala: Department of Orthopedic Surgery, Boston Children's Hospital, Boston, USA
- John Roaten: Department of Orthopedic Surgery, Boston Children's Hospital, Boston, USA
- Patricia Miller: Department of Orthopedic Surgery, Boston Children's Hospital, Boston, USA
- Daniel Hedequist: Department of Orthopedic Surgery, Boston Children's Hospital, Boston, USA
- Craig M Birch: Department of Orthopedic Surgery, Boston Children's Hospital, Boston, USA
9. Longo UG, Lalli A, Gobbato B, Nazarian A. Metaverse, virtual reality and augmented reality in total shoulder arthroplasty: a systematic review. BMC Musculoskelet Disord 2024;25:396. PMID: 38773483; PMCID: PMC11106997; DOI: 10.1186/s12891-024-07436-8.
Abstract
PURPOSE: This systematic review aims to provide an overview of current knowledge on the role of the metaverse, augmented reality, and virtual reality in reverse shoulder arthroplasty.

METHODS: A systematic review was performed following the PRISMA guidelines, comprehensively covering applications of the metaverse, augmented reality, and virtual reality in in-vivo intraoperative navigation, in the training of orthopedic residents, and in the latest innovations proposed in ex-vivo studies.

RESULTS: A total of 22 articles were included in the review. Data on navigated shoulder arthroplasty were extracted from 14 articles, covering 793 patients treated with intraoperatively navigated rTSA or aTSA. Three randomized controlled trials (RCTs) reporting outcomes for a total of 53 orthopedic surgical residents and doctors receiving VR-based training for rTSA were also included, along with three studies reporting the latest VR- and AR-based rTSA applications and two proof-of-concept studies.

CONCLUSIONS: The metaverse, augmented reality, and virtual reality present immense potential for the future of orthopedic surgery. As these technologies advance, it is crucial to conduct additional research, foster development, and seamlessly integrate them into surgical education to fully harness their capabilities and transform the field. This evolution promises enhanced accuracy, expanded training opportunities, and improved surgical planning capabilities.
Affiliation(s)
- Umile Giuseppe Longo: Fondazione Policlinico Universitario Campus Bio-Medico, Via Alvaro del Portillo, 200, Roma, 00128, Italy; Research Unit of Orthopaedic and Trauma Surgery, Department of Medicine and Surgery, Università Campus Bio-Medico di Roma, Via Alvaro del Portillo, 21, Roma, 00128, Italy
- Alberto Lalli: Fondazione Policlinico Universitario Campus Bio-Medico, Via Alvaro del Portillo, 200, Roma, 00128, Italy; Research Unit of Orthopaedic and Trauma Surgery, Department of Medicine and Surgery, Università Campus Bio-Medico di Roma, Via Alvaro del Portillo, 21, Roma, 00128, Italy
- Bruno Gobbato: Department of Orthopaedic Surgery, Hospital Sao Jose Jaraguá do Sul, Jaraguá, SC, 89251-830, Brazil
- Ara Nazarian: Musculoskeletal Translational Innovation Initiative, Carl J. Shapiro Department of Orthopaedic Surgery, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA, USA
10
Canton SP, Austin CN, Steuer F, Dadi S, Sharma N, Kass NM, Fogg D, Clayton E, Cunningham O, Scott D, LaBaze D, Andrews EG, Biehl JT, Hogan MV. Feasibility and Usability of Augmented Reality Technology in the Orthopaedic Operating Room. Curr Rev Musculoskelet Med 2024; 17:117-128. [PMID: 38607522] [PMCID: PMC11068703] [DOI: 10.1007/s12178-024-09888-w]
Abstract
PURPOSE OF REVIEW Augmented reality (AR) has gained popularity in various sectors, including gaming, entertainment, and healthcare. The desire for improved surgical navigation within orthopaedic surgery has led to evaluations of the feasibility and usability of AR in the operating room (OR). However, the safe and effective use of AR technology in the OR requires a proper understanding of its capabilities and limitations. This review aims to describe the fundamental elements of AR, highlight limitations for use within the field of orthopaedic surgery, and discuss potential areas for development. RECENT FINDINGS To date, studies have provided evidence that AR technology can enhance navigation and performance in orthopaedic procedures. General hardware and software limitations of the technology include the registration process, ergonomics, and battery life. Other limitations relate to human response factors such as inattentional blindness, which may cause complications within the surgical field to go unseen. Furthermore, prolonged use of AR can cause eye strain and headache due to phenomena such as the vergence-accommodation conflict. AR technology may prove to be a better alternative to current orthopaedic surgery navigation systems; however, the current limitations must be mitigated to further improve the feasibility and usability of AR in the OR setting. It is important for both non-clinicians and clinicians to work in conjunction to guide the development of future iterations of AR technology and its implementation into the OR workflow.
Affiliation(s)
- Stephen P Canton
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
- Fritz Steuer
- University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
- Srujan Dadi
- Rowan-Virtua School of Osteopathic Medicine, Stratford, NJ, USA
- Nikhil Sharma
- University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
- Nicolás M Kass
- University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
- David Fogg
- Texas Tech University Health Sciences Center El Paso, El Paso, TX, USA
- Elizabeth Clayton
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
- Onaje Cunningham
- University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
- Devon Scott
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
- Dukens LaBaze
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
- Edward G Andrews
- Department of Neurological Surgery, University of Pittsburgh Medical Center, Pittsburgh, PA, USA
- Jacob T Biehl
- School of Computing and Information, University of Pittsburgh, Pittsburgh, PA, USA
- MaCalus V Hogan
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
11
Hong HT, Koh YG, Cho BW, Kwon HM, Park KK, Kang KT. An Image-Based Augmented Reality System for Achieving Accurate Bone Resection in Total Knee Arthroplasty. Cureus 2024; 16:e58281. [PMID: 38752081] [PMCID: PMC11094513] [DOI: 10.7759/cureus.58281]
Abstract
Background and objective With the steady advancement of computer-assisted surgical techniques, the importance of assessing and researching technology related to total knee arthroplasty (TKA) has increased. Augmented reality (AR), a recently proposed next-generation technology, is expected to enhance the precision of orthopedic surgery by providing a more efficient and cost-effective approach. However, the accuracy of image-based AR in TKA surgery has not been established. Therefore, this study aimed to determine whether accurate bone resection can be achieved in TKA surgery using image-based AR. Methods In this study, we replaced traditional CT imaging and reconstruction for creating a 3D bone model with direct 3D scanning of the femur and tibia. Preoperative planning involved identifying anatomical landmarks and determining the surgical details. During surgery, markers were employed to create a local coordinate system for the AR-assisted surgical system using a Polaris camera. This approach helped minimize discrepancies between the 3D model and the actual positioning, ensuring accurate alignment. Results The AR-assisted surgery using the image-based method resulted in small errors between the bone resection depth of the preoperative surgical plan and the bone-model test results [average error: 0.32 mm; standard deviation (SD): 0.143 mm]. Conclusions Our findings demonstrate the accuracy of bone resection using image-based AR-assisted navigation for TKA surgery. Image-based AR-assisted navigation in TKA surgery is a valuable tool not only for enhancing accuracy through smart glasses and sensors but also for improving the efficiency of the procedure. We therefore anticipate that image-based AR-assisted navigation will gain wide acceptance in TKA practice.
Affiliation(s)
- Yong-Gon Koh
- Joint Reconstruction Center, Department of Orthopedic Surgery, Yonsei Sarang Hospital, Seoul, KOR
- Byung Woo Cho
- Department of Orthopedic Surgery, Severance Hospital, Yonsei University College of Medicine, Seoul, KOR
- Hyuck Min Kwon
- Department of Orthopedic Surgery, Yonsei University College of Medicine, Seoul, KOR
- Kwan Kyu Park
- Department of Orthopedic Surgery, Yonsei University College of Medicine, Seoul, KOR
- Kyoung-Tak Kang
- Skyve R&D LAB, Skyve Co. LTD., Seoul, KOR
- Mechanical Engineering, Yonsei University, Seoul, KOR
12
Cai Y, Zhu M, He B, Zhang J. Distributed visual positioning for surgical instrument tracking. Phys Eng Sci Med 2024; 47:273-286. [PMID: 38194180] [DOI: 10.1007/s13246-023-01363-z]
Abstract
In clinical operations, it is crucial for surgeons to know the location of the surgical instrument. Traditional positioning systems have difficulty dealing with camera occlusion, marker occlusion, and environmental interference. To address these issues, we propose a distributed visual positioning system for surgical instrument tracking in surgery. First, we design a marker pattern with a black-and-white triangular grid and dots that can be adapted to various instrument surfaces and improves the localization accuracy of the features. The features are the cross-points in the marker, and each feature has a unique ID. We further propose a detection and identification method for this position-sensing marker to achieve accurate localization and identification of the features. Second, we introduce a multi Perspective-n-Point (mPnP) method, which fuses feature coordinates from all cameras and deduces the final result directly from the intrinsic and extrinsic parameters. This method provides a reliable initial value for the Bundle Adjustment algorithm. During instrument tracking, we assess the motion state of the instrument and select either dynamic or static Kalman filtering to mitigate jitter in the instrument's movement. A comparison experiment on the core algorithms indicates that our positioning algorithm has a lower reprojection error than mainstream algorithms. A series of quantitative experiments showed that the proposed system's positioning error is below 0.207 mm and its run time is below 118.842 ms. These results demonstrate the considerable clinical potential of our system, which provides accurate instrument positioning and promotes the efficiency and safety of clinical surgery.
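As an editorial illustration of the jitter-mitigation step described in this abstract: the paper's actual filter design and noise tunings are not given here, so the sketch below is a generic one-dimensional Kalman filter with made-up parameters, contrasting a "static" (heavy-smoothing) setting against a "dynamic" (fast-tracking) one, as the abstract's motion-state switching suggests.

```python
def kalman_1d(measurements, q, r, x0=0.0, p0=1.0):
    """Scalar constant-position Kalman filter.

    q: process-noise variance (larger -> tracks motion quickly)
    r: measurement-noise variance (larger -> smooths more aggressively)
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: position assumed constant, uncertainty grows
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update toward measurement z
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# A stationary instrument observed with jitter (illustrative values, mm):
# a "static" tuning (tiny q) smooths far more than a "dynamic" tuning (large q).
noisy = [10.2, 9.8, 10.3, 9.7, 10.1, 10.0, 9.9, 10.2]
static_est = kalman_1d(noisy, q=1e-4, r=0.25, x0=noisy[0])
dynamic_est = kalman_1d(noisy, q=1.0, r=0.25, x0=noisy[0])
```

With the static tuning the estimate settles near the true position, while the dynamic tuning follows each noisy reading almost verbatim — hence the value of switching on the instrument's motion state.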
Affiliation(s)
- Yu Cai
- School of Mechanical Engineering, Fuzhou University, Fuzhou, 350108, China
- Mingzhu Zhu
- School of Mechanical Engineering, Fuzhou University, Fuzhou, 350108, China
- Bingwei He
- School of Mechanical Engineering, Fuzhou University, Fuzhou, 350108, China
- Jianwei Zhang
- Department of Informatics, University of Hamburg, 22527, Hamburg, Germany
13
Cholok DJ, Fischer MJ, Leuze CW, Januszyk M, Daniel BL, Momeni A. Spatial Fidelity of Microvascular Perforating Vessels as Perceived by Augmented Reality Virtual Projections. Plast Reconstr Surg 2024; 153:524-534. [PMID: 37092985] [DOI: 10.1097/prs.0000000000010592]
Abstract
BACKGROUND Autologous breast reconstruction yields improved long-term aesthetic results but requires increased resources from practitioners and hospital systems. Innovations in radiographic imaging have been increasingly used to improve the efficiency and success of free flap harvest. Augmented reality affords the opportunity to superimpose relevant imaging on a surgeon's native field of view, potentially facilitating dissection of anatomically variable structures. To validate the spatial fidelity of augmented reality projections of deep inferior epigastric perforator (DIEP) flap-relevant anatomy, four independent observers compared three-dimensional (3D) models with their virtual renderings, and the measured discrepancies between the real and holographic models were evaluated. METHODS The 3D-printed models of DIEP flap-relevant anatomy were fabricated from computed tomographic angiography data from 19 de-identified patients. The corresponding computed tomographic angiography data were similarly formatted for the Microsoft HoloLens to generate corresponding projections. Anatomic points were first measured on the 3D models, after which the corresponding points were measured on the HoloLens projections from two separate vantage points (V1 and V2). Statistical analyses, including generalized linear modeling, were performed to characterize the spatial fidelity of the holographic projections with regard to translation, rotation, and scale. RESULTS Among all participants, the median translational displacement at corresponding points was 9.0 mm between the real 3D model and V1, 12.1 mm between the 3D model and V2, and 13.5 mm between V1 and V2. CONCLUSION Corresponding points, including the topography of perforating vessels, can be identified within millimeters for the purposes of breast reconstruction, but multiple independent contributors of error remain, most notably the participant and the location from which the projection is perceived.
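The study's central metric — the median translational displacement between corresponding points on the physical model and its holographic projection — reduces to a few lines of code. The coordinates below are invented for illustration and are not the study's data.

```python
import math
from statistics import median

def median_displacement(points_a, points_b):
    """Median Euclidean distance between corresponding 3D points (same units, e.g. mm)."""
    return median(math.dist(a, b) for a, b in zip(points_a, points_b))

# Hypothetical perforator locations on the 3D-printed model vs. the hologram (mm)
model    = [(0.0, 0.0, 0.0), (25.0, 10.0, 5.0), (40.0, -8.0, 12.0)]
hologram = [(6.0, 3.0, 2.0), (30.0, 14.0, 9.0), (45.0, -3.0, 18.0)]
print(round(median_displacement(model, hologram), 1))  # → 7.5
```

A median (rather than a mean) keeps one badly perceived point from dominating the summary, which matters when perception error varies by observer and vantage point as reported above.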
Affiliation(s)
- Marc J Fischer
- Department of Radiology, Stanford University School of Medicine
- Bruce L Daniel
- Department of Radiology, Stanford University School of Medicine
- Arash Momeni
- Division of Plastic and Reconstructive Surgery
14
Mavrodontis II, Trikoupis IG, Kontogeorgakos VA, Savvidou OD, Papagelopoulos PJ. Point-of-Care Orthopedic Oncology Device Development. Curr Oncol 2023; 31:211-228. [PMID: 38248099] [PMCID: PMC10814108] [DOI: 10.3390/curroncol31010014]
Abstract
BACKGROUND The triad of 3D design, 3D printing, and xReality technologies is explored and exploited to collaboratively realize patient-specific products in a timely manner with an emphasis on designs with meta-(bio)materials. METHODS A case study on pelvic reconstruction after oncological resection (osteosarcoma) was selected and conducted to evaluate the applicability and performance of an inter-epistemic workflow and the feasibility and potential of 3D technologies for modeling, optimizing, and materializing individualized orthopedic devices at the point of care (PoC). RESULTS Image-based diagnosis and treatment at the PoC can be readily deployed to develop orthopedic devices for pre-operative planning, training, intra-operative navigation, and bone substitution. CONCLUSIONS Inter-epistemic symbiosis between orthopedic surgeons and (bio)mechanical engineers at the PoC, fostered by appropriate quality management systems and end-to-end workflows under suitable scientifically amalgamated synergies, could maximize the potential benefits. However, increased awareness is recommended to explore and exploit the full potential of 3D technologies at the PoC to deliver medical devices with greater customization, innovation in design, cost-effectiveness, and high quality.
Affiliation(s)
- Ioannis I. Mavrodontis
- First Department of Orthopaedic Surgery, School of Medicine, National and Kapodistrian University of Athens, 11527 Athens, Greece; (I.G.T.); (V.A.K.); (O.D.S.); (P.J.P.)
15
Moglia A, Marsilio L, Rossi M, Pinelli M, Lettieri E, Mainardi L, Manzotti A, Cerveri P. Mixed Reality and Artificial Intelligence: A Holistic Approach to Multimodal Visualization and Extended Interaction in Knee Osteotomy. IEEE J Transl Eng Health Med 2023; 12:279-290. [PMID: 38410183] [PMCID: PMC10896423] [DOI: 10.1109/jtehm.2023.3335608]
Abstract
OBJECTIVE Recent advancements in augmented reality have led to planning and navigation systems for orthopedic surgery. However, little is known about mixed reality (MR) in orthopedics. Furthermore, artificial intelligence (AI) has the potential to boost the capabilities of MR by enabling automation and personalization. The purpose of this work is to assess the Holoknee prototype, based on AI and MR for multimodal data visualization and surgical planning in knee osteotomy, developed to run on the HoloLens 2 headset. METHODS Two preclinical test sessions were performed with 11 participants (eight surgeons, two residents, and one medical student), each executing three times a set of six tasks corresponding to a number of holographic data interactions and preoperative planning steps. At the end of each session, participants answered a questionnaire on user perception and usability. RESULTS During the second trial, the participants were faster in all tasks than in the first, while in the third trial the execution time decreased relative to the second for only two tasks ("Patient selection" and "Scrolling through radiograph"), without a statistically significant difference (p = 0.14 and p = 0.13, respectively). All subjects strongly agreed that MR can be used effectively for surgical training, whereas 10 (90.9%) strongly agreed that it can be used effectively for preoperative planning. Six (54.5%) agreed, and two (18.2%) strongly agreed, that it can be used effectively for intraoperative guidance. DISCUSSION/CONCLUSION In this work, we presented Holoknee, the first holistic application of AI and MR for surgical planning for knee osteotomy. The results are promising with regard to its potential translation to surgical training, preoperative planning, and surgical guidance. Clinical and Translational Impact Statement - Holoknee can be helpful to support surgeons in the preoperative planning of knee osteotomy. It has the potential to positively impact the training of the future generation of residents and to aid surgeons in the intraoperative stage.
Affiliation(s)
- Andrea Moglia
- Department of Electronics, Information and Bioengineering, Politecnico di Milano, 20133 Milan, Italy
- Luca Marsilio
- Department of Electronics, Information and Bioengineering, Politecnico di Milano, 20133 Milan, Italy
- Matteo Rossi
- Department of Electronics, Information and Bioengineering, Politecnico di Milano, 20133 Milan, Italy
- Istituto Auxologico Italiano IRCCS, 20149 Milan, Italy
- Maria Pinelli
- Department of Management, Economics and Industrial Engineering, Politecnico di Milano, 20133 Milan, Italy
- Emanuele Lettieri
- Department of Management, Economics and Industrial Engineering, Politecnico di Milano, 20133 Milan, Italy
- Luca Mainardi
- Department of Electronics, Information and Bioengineering, Politecnico di Milano, 20133 Milan, Italy
- Pietro Cerveri
- Department of Electronics, Information and Bioengineering, Politecnico di Milano, 20133 Milan, Italy
- Istituto Auxologico Italiano IRCCS, 20149 Milan, Italy
16
Orchard L, Van M, Abbas J, Malik R, Stevenson J, Tolley N. Mixed-reality technology for clinical communication: objective assessment of the HoloLens 2 as a clinical communication device in a simulated on-call scenario. J Laryngol Otol 2023; 137:1165-1169. [PMID: 36992658] [DOI: 10.1017/s0022215123000531]
Abstract
OBJECTIVE Specialty on-call clinicians cover large areas and complex workloads. This study aimed to assess clinical communication using the mixed-reality HoloLens 2 device within a simulated on-call scenario. METHOD This was a randomised, within-participant, controlled study. Thirty ENT trainees used either the HoloLens 2 or a traditional telephone to communicate a clinical case to a consultant. The quality of the clinical communication was scored objectively and subjectively. RESULTS Clinical communication using the HoloLens 2 scored significantly higher than that using the telephone (n = 30) (11.9 of 15 vs 10.2 of 15; p = 0.001). Subjectively, consultants judged more communication episodes to be inadequate when using the telephone (7 of 30) than when using the HoloLens 2 (0 of 30) (p = 0.01). Qualitative feedback indicated that the HoloLens 2 was easy to use and would add value during an on-call scenario with remote consultant supervision. CONCLUSION This study demonstrated the benefit that mixed-reality devices such as the HoloLens 2 can bring to clinical communication by increasing the accuracy of communication and the confidence of users.
Affiliation(s)
- L Orchard
- Department of ENT Surgery, St Mary's Hospital, Praed St, London, UK
- M Van
- Department of ENT Surgery, St Mary's Hospital, Praed St, London, UK
- J Abbas
- Human Factors Academy, Manchester University NHS Trust, University of Manchester, Manchester, UK
- R Malik
- Medical School, Imperial College London, London, UK
- J Stevenson
- Information Technology, Imperial College Healthcare NHS Trust, London, UK
- N Tolley
- Department of ENT Surgery, St Mary's Hospital, Praed St, London, UK
17
Suter D, Hodel S, Liebmann F, Fürnstahl P, Farshad M. Factors affecting augmented reality head-mounted device performance in real OR. Eur Spine J 2023; 32:3425-3433. [PMID: 37552327] [DOI: 10.1007/s00586-023-07826-x]
Abstract
PURPOSE Over the last years, interest and efforts to implement augmented reality (AR) in orthopedic surgery through head-mounted devices (HMDs) have increased. However, the majority of experiments were preclinical and conducted within a controlled laboratory environment. The operating room (OR) is a more challenging environment, with various confounding factors potentially affecting the performance of an AR-HMD. The aim of this study was to assess the performance of an AR-HMD in a real-life OR setting. METHODS An established AR application using the HoloLens 2 HMD was tested in an OR and in a laboratory by two users. The accuracy of the hologram overlay, the time to complete the trial, the number of rejected registration attempts, the delay in the live overlay of the hologram, and the number of completely failed runs were recorded. Further, different OR setting parameters (light conditions, setting up partitions, movement of personnel, and anchor placement) were modified and compared. RESULTS The time for full registration was higher in the OR, at 48 s (IQR 24 s), versus 33 s (IQR 10 s) in the laboratory setting (p < 0.001). The other investigated parameters did not differ significantly when an optimal OR setup was used. Within the OR, the strongest influence on the performance of the AR-HMD was the lighting condition, with direct light illumination on the situs being the least favorable. CONCLUSION AR-HMDs are affected by different OR setups. Standardization measures for better AR-HMD performance include avoiding direct light illumination on the situs, setting up partitions, and minimizing the movement of personnel.
Affiliation(s)
- Daniel Suter
- Research in Orthopedic Computer Science, University Hospital Balgrist, University of Zurich, Balgrist Campus, Lengghalde 5, 8008, Zurich, Switzerland
- Department of Orthopedic Surgery, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
- Sandro Hodel
- Research in Orthopedic Computer Science, University Hospital Balgrist, University of Zurich, Balgrist Campus, Lengghalde 5, 8008, Zurich, Switzerland
- Department of Orthopedic Surgery, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
- Florentin Liebmann
- Research in Orthopedic Computer Science, University Hospital Balgrist, University of Zurich, Balgrist Campus, Lengghalde 5, 8008, Zurich, Switzerland
- Philipp Fürnstahl
- Research in Orthopedic Computer Science, University Hospital Balgrist, University of Zurich, Balgrist Campus, Lengghalde 5, 8008, Zurich, Switzerland
- Mazda Farshad
- Department of Orthopedic Surgery, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
- Spine Division, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
18
León-Muñoz VJ, Santonja-Medina F, Lajara-Marco F, Lisón-Almagro AJ, Jiménez-Olivares J, Marín-Martínez C, Amor-Jiménez S, Galián-Muñoz E, López-López M, Moya-Angeler J. The Accuracy and Absolute Reliability of a Knee Surgery Assistance System Based on ArUco-Type Sensors. Sensors (Basel) 2023; 23:8091. [PMID: 37836921] [PMCID: PMC10575457] [DOI: 10.3390/s23198091]
Abstract
Recent advances allow the use of Augmented Reality (AR) in many medical procedures. AR via optical navigators to aid various knee surgery techniques (e.g., femoral and tibial osteotomies, ligament reconstructions, or meniscus transplants) is becoming increasingly frequent. Accuracy in these procedures is essential, but evaluations of this technology are still needed. Our study aimed to evaluate the system's accuracy using an in vitro protocol. We hypothesised that the system's accuracy was equal to or less than 1 mm and 1° for distance and angular measurements, respectively. Our research was an in vitro laboratory study using a 316L steel model. Absolute reliability was assessed according to the Hopkins criteria by seven independent evaluators. Each observer measured the thirty palpation points and the landmarks used to acquire direct angular measurements on three occasions separated by at least two weeks. The system's accuracy in assessing distances showed a mean error of 1.203 mm with an uncertainty of 2.062 mm, and for the angular values, a mean error of 0.778° with an uncertainty of 1.438°. The intraclass correlation coefficients were almost perfect or perfect for all intra-observer and inter-observer comparisons. The mean error for distance determination was statistically larger than 1 mm (1.203 mm), but with a trivial effect size. The mean error in assessing angular values was statistically less than 1°. Our results are similar to those published by other authors in accuracy analyses of AR systems.
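A sketch of how a mean error with an expanded uncertainty of the kind reported in this abstract can be computed from repeated observer readings. The coverage factor k = 2 and the readings below are illustrative assumptions, not the study's definition or data.

```python
from statistics import mean, stdev

def accuracy_summary(measured, nominal, k=2.0):
    """Mean error (bias) and an expanded uncertainty, taken here as k * SD of
    the errors. The coverage factor k and this uncertainty definition are
    assumptions for illustration, not taken from the paper."""
    errors = [m - nominal for m in measured]
    return mean(errors), k * stdev(errors)

# Seven hypothetical observer readings (mm) of a feature whose true distance is 50 mm
readings = [51.1, 51.4, 50.9, 51.3, 51.2, 51.0, 51.5]
bias, u = accuracy_summary(readings, nominal=50.0)  # bias ≈ 1.2 mm
```

Separating bias (systematic offset) from uncertainty (spread across observers and sessions) is what lets a study like this one conclude "mean error above 1 mm, but trivial effect size".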
Affiliation(s)
- Vicente J. León-Muñoz
- Department of Orthopaedic Surgery and Traumatology, Hospital General Universitario Reina Sofía, 30003 Murcia, Spain
- Instituto de Cirugía Avanzada de la Rodilla (ICAR), 30005 Murcia, Spain
- Fernando Santonja-Medina
- Department of Orthopaedic Surgery and Traumatology, Hospital Clínico Universitario Virgen de la Arrixaca, 30120 Murcia, Spain
- Department of Surgery, Paediatrics and Obstetrics & Gynaecology, Faculty of Medicine, University of Murcia, 30120 Murcia, Spain
- Francisco Lajara-Marco
- Department of Orthopaedic Surgery and Traumatology, Hospital General Universitario Reina Sofía, 30003 Murcia, Spain
- Alonso J. Lisón-Almagro
- Department of Orthopaedic Surgery and Traumatology, Hospital General Universitario Reina Sofía, 30003 Murcia, Spain
- Jesús Jiménez-Olivares
- Department of Orthopaedic Surgery and Traumatology, Hospital Vega Baja, 03314 Orihuela, Spain
- Carmelo Marín-Martínez
- Department of Orthopaedic Surgery and Traumatology, Hospital General Universitario Reina Sofía, 30003 Murcia, Spain
- Salvador Amor-Jiménez
- Department of Orthopaedic Surgery and Traumatology, Hospital General Universitario Reina Sofía, 30003 Murcia, Spain
- Elena Galián-Muñoz
- Department of Orthopaedic Surgery and Traumatology, Hospital General Universitario Reina Sofía, 30003 Murcia, Spain
- Mirian López-López
- Department of Information Technologies, Subdirección General de Tecnologías de la Información, Servicio Murciano de Salud, 30100 Murcia, Spain
- Joaquín Moya-Angeler
- Department of Orthopaedic Surgery and Traumatology, Hospital General Universitario Reina Sofía, 30003 Murcia, Spain
- Instituto de Cirugía Avanzada de la Rodilla (ICAR), 30005 Murcia, Spain
19
Marhofer P, Eichenberger U. Augmented reality in ultrasound-guided regional anaesthesia: useful tool or expensive toy? Br J Anaesth 2023; 131:442-445. [PMID: 37353469] [DOI: 10.1016/j.bja.2023.05.022]
Abstract
Augmented reality is increasingly applied in medical education and practice. The main advantage of this technology is the display of relevant information in the visual field of multiple operators. Here we provide a critical analysis of the potential applications of augmented reality in regional anaesthesia.
Affiliation(s)
- Peter Marhofer
- Department of Anaesthesia, Intensive Care Medicine and Pain Medicine, Medical University of Vienna, Vienna, Austria
- Urs Eichenberger
- Department of Anaesthesiology, Intensive Care and Pain Medicine, Balgrist University Hospital and University of Zurich, Zurich, Switzerland
20
Shahzad H, Bhatti NS, Phillips FM, Khan SN. Applications of Augmented Reality in Orthopaedic Spine Surgery. J Am Acad Orthop Surg 2023; 31:e601-e609. [PMID: 37105182] [DOI: 10.5435/jaaos-d-23-00023]
Abstract
The application of augmented reality (AR) in surgical settings has primarily been as a navigation tool in the operating room because of its ease of use and minimal effect on surgical procedures. The surgeon can directly face the surgical field while viewing 3D anatomy virtually, reducing the need to look at an external display such as a navigation system. Applications of AR are being explored in spine surgery. The basic principles of AR include data preparation, registration, tracking, and visualization. The current literature provides sufficient preclinical and clinical evidence for the use of AR technology in spine surgery. AR systems are efficient assistive devices, providing greater accuracy for insertion points, more comfort for surgeons, and reduced operating time. AR technology also has beneficial applications in surgical training, education, and telementorship for spine surgery. However, the costs associated with specially designed imaging equipment and physicians' comfort in using this technology remain barriers to its adoption. As this technology evolves toward more widespread use, future applications will be directed by the cost-effectiveness of AR-assisted surgeries.
Affiliation(s)
- Hania Shahzad
- Department of Orthopedics, The Ohio State University, Wexner Medical Center, Columbus, OH (Shahzad, Bhatti, and Khan) and Department of Orthopedics, Rush University Medical Center, Chicago, IL (Phillips)
21
Pierzchajlo N, Stevenson TC, Huynh H, Nguyen J, Boatright S, Arya P, Chakravarti S, Mehrki Y, Brown NJ, Gendreau J, Lee SJ, Chen SG. Augmented Reality in Minimally Invasive Spinal Surgery: A Narrative Review of Available Technology. World Neurosurg 2023; 176:35-42. [PMID: 37059357] [DOI: 10.1016/j.wneu.2023.04.030]
Abstract
INTRODUCTION Spine surgery has undergone significant changes in approach and technique. With the adoption of intraoperative navigation, minimally invasive spinal surgery (MISS) has arguably become the gold standard. Augmented reality (AR) has now emerged as a front-runner in anatomical visualization and narrower operative corridors. In effect, AR is poised to revolutionize surgical training and operative outcomes. Our study examines the current literature on AR-assisted MISS, synthesizes the findings, and creates a narrative highlighting the history and future of AR in spine surgery. MATERIALS AND METHODS Relevant literature was gathered using the PubMed (Medline) database from 1975 to 2023. Pedicle screw placement models were the primary AR intervention and were compared with the outcomes of traditional MISS. RESULTS We found that AR devices on the market show promising clinical outcomes in preoperative training and intraoperative use. Three prominent systems were XVision, HoloLens, and ImmersiveTouch. In the studies, surgeons, residents, and medical students had opportunities to operate AR systems, showcasing their educational potential across each phase of learning. Specifically, one facet described training with cadaver models to gauge the accuracy of pedicle screw placement; AR-assisted MISS exceeded free-hand methods without unique complications or contraindications. CONCLUSIONS While still in its infancy, AR has already proven beneficial for educational training and intraoperative MISS applications. We believe that with continued research and advancement of this technology, AR is poised to become a dominant player within the fundamentals of surgical education and MISS operative technique.
Affiliation(s)
- Huey Huynh
- Mercer University, School of Medicine, Savannah, GA, USA
- Jimmy Nguyen
- Mercer University, School of Medicine, Savannah, GA, USA
- Priya Arya
- Mercer University, School of Medicine, Savannah, GA, USA
- Yusuf Mehrki
- Department of Neurosurgery, University of Florida, Jacksonville, FL, USA
- Nolan J Brown
- Department of Neurosurgery, University of California Irvine, Orange, CA, USA
- Julian Gendreau
- Department of Biomedical Engineering, Johns Hopkins Whiting School of Engineering, Baltimore, MD, USA
- Seung Jin Lee
- Department of Neurosurgery, Mayo Clinic, Jacksonville, FL, USA
- Selby G Chen
- Department of Neurosurgery, Mayo Clinic, Jacksonville, FL, USA
22
Kaya Bicer E, Fangerau H, Sur H. Artificial intelligence use in orthopedics: an ethical point of view. EFORT Open Rev 2023; 8:592-596. [PMID: 37526254 PMCID: PMC10441251 DOI: 10.1530/eor-23-0083]
Abstract
Artificial intelligence (AI) is increasingly being utilized in orthopedics practice. Ethical concerns have arisen alongside marked improvements and widespread utilization of AI. Patient privacy, consent, data protection, cybersecurity, data safety and monitoring, bias, and accountability are some of the ethical concerns.
Affiliation(s)
- Elcil Kaya Bicer
- Department of Orthopedics and Traumatology, Ege University Faculty of Medicine, Izmir, Turkey
- Heiner Fangerau
- Department of the History, Philosophy and Ethics of Medicine, Heinrich-Heine-Universität Düsseldorf, Germany
- Hakki Sur
- Department of Orthopedics and Traumatology, Ege University Faculty of Medicine, Izmir, Turkey
23
McDonnell KJ. Leveraging the Academic Artificial Intelligence Silecosystem to Advance the Community Oncology Enterprise. J Clin Med 2023; 12:4830. [PMID: 37510945 PMCID: PMC10381436 DOI: 10.3390/jcm12144830]
Abstract
Over the last 75 years, artificial intelligence has evolved from a theoretical concept and novel paradigm describing the role that computers might play in our society to a tool with which we daily engage. In this review, we describe AI in terms of its constituent elements, the synthesis of which we refer to as the AI Silecosystem. Herein, we provide an historical perspective of the evolution of the AI Silecosystem, conceptualized and summarized as a Kuhnian paradigm. This manuscript focuses on the role that the AI Silecosystem plays in oncology and its emerging importance in the care of the community oncology patient. We observe that this important role arises out of a unique alliance between the academic oncology enterprise and community oncology practices. We provide evidence of this alliance by illustrating the practical establishment of the AI Silecosystem at the City of Hope Comprehensive Cancer Center and its team utilization by community oncology providers.
Affiliation(s)
- Kevin J McDonnell
- Center for Precision Medicine, Department of Medical Oncology & Therapeutics Research, City of Hope Comprehensive Cancer Center, Duarte, CA 91010, USA
24
Stephenson N, Pushparajah K, Wheeler G, Deng S, Schnabel JA, Simpson JM. Extended reality for procedural planning and guidance in structural heart disease - a review of the state-of-the-art. Int J Cardiovasc Imaging 2023. [PMID: 37103667 DOI: 10.1007/s10554-023-02823-z]
Abstract
Extended reality (XR), which encompasses virtual, augmented and mixed reality, is an emerging medical imaging display platform which enables intuitive and immersive interaction in a three-dimensional space. This technology holds the potential to enhance understanding of complex spatial relationships when planning and guiding cardiac procedures in congenital and structural heart disease, moving beyond conventional 2D and 3D image displays. A systematic review of the literature demonstrates a rapid increase in publications describing adoption of this technology. At least 33 XR systems have been described, with many demonstrating proof of concept, but with no specific mention of regulatory approval, even in some prospective studies. Validation remains limited, and true clinical benefit is difficult to measure. This review describes and critically appraises the range of XR technologies and their applications for procedural planning and guidance in structural heart disease, while discussing the challenges that need to be overcome in future studies to achieve safe and effective clinical adoption.
Affiliation(s)
- Natasha Stephenson
- School of Biomedical Engineering and Imaging Sciences, King's College London, London, UK
- Department of Congenital Heart Disease, Evelina Children's Hospital, London, UK
- St Thomas' Hospital, 3rd Floor, Lambeth Wing, SE1 7EH, London, UK
- Kuberan Pushparajah
- School of Biomedical Engineering and Imaging Sciences, King's College London, London, UK
- Department of Congenital Heart Disease, Evelina Children's Hospital, London, UK
- Gavin Wheeler
- School of Biomedical Engineering and Imaging Sciences, King's College London, London, UK
- Shujie Deng
- School of Biomedical Engineering and Imaging Sciences, King's College London, London, UK
- Julia A Schnabel
- School of Biomedical Engineering and Imaging Sciences, King's College London, London, UK
- Technical University of Munich, Munich, Germany
- Institute of Machine Learning in Biomedical Imaging, Helmholtz Center Munich, Munich, Germany
- John M Simpson
- School of Biomedical Engineering and Imaging Sciences, King's College London, London, UK
- Department of Congenital Heart Disease, Evelina Children's Hospital, London, UK
25
León-Muñoz VJ, Moya-Angeler J, López-López M, Lisón-Almagro AJ, Martínez-Martínez F, Santonja-Medina F. Integration of Square Fiducial Markers in Patient-Specific Instrumentation and Their Applicability in Knee Surgery. J Pers Med 2023; 13:727. [PMID: 37240897 DOI: 10.3390/jpm13050727]
Abstract
Computer technologies play a crucial role in orthopaedic surgery and are essential in personalising different treatments. Recent advances allow the usage of augmented reality (AR) for many orthopaedic procedures, including different types of knee surgery. AR enables interaction between virtual environments and the physical world, allowing both to intermingle (AR superimposes information on real objects in real time) through an optical device, and allows different processes to be personalised for each patient. This article aims to describe the integration of fiducial markers in planning knee surgeries and to provide a narrative review of the latest publications on AR applications in knee surgery. Augmented reality-assisted knee surgery is an emerging set of techniques that can increase the accuracy, efficiency, and safety of conventional methods and, in some surgical procedures such as osteotomies, decrease radiation exposure. Initial clinical experience with AR projection based on ArUco-type artificial marker sensors has shown promising results and received positive operator feedback. Once initial clinical safety and efficacy have been demonstrated, continued experience should be studied to validate this technology and generate further innovation in this rapidly evolving field.
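An illustrative aside on the marker-based registration this entry describes (a minimal sketch under our own assumptions, not code from the cited study): once the corners of a square fiducial marker have been matched between the surgical plan and the camera view, estimating the overlay pose reduces to a least-squares rigid transform between two 3D point sets, for example via the Kabsch algorithm:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ~ R @ src + t (Kabsch)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Hypothetical example: recover a known pose from four matched marker corners.
rng = np.random.default_rng(0)
src = rng.random((4, 3))                       # marker corners in the planning frame
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, 2.0, 3.0])
dst = src @ R_true.T + t_true                  # same corners seen in the camera frame
R, t = rigid_transform(src, dst)
assert np.allclose(R, R_true) and np.allclose(t, t_true)
```

In practice the matched corners would come from a marker-detection library rather than being synthesized, and noisy detections make the least-squares formulation (rather than an exact solve) the appropriate choice.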
Affiliation(s)
- Vicente J León-Muñoz
- Department of Orthopaedic Surgery and Traumatology, Hospital General Universitario Reina Sofía, 30003 Murcia, Spain
- Instituto de Cirugía Avanzada de la Rodilla (ICAR), 30005 Murcia, Spain
- Joaquín Moya-Angeler
- Department of Orthopaedic Surgery and Traumatology, Hospital General Universitario Reina Sofía, 30003 Murcia, Spain
- Instituto de Cirugía Avanzada de la Rodilla (ICAR), 30005 Murcia, Spain
- Mirian López-López
- Subdirección General de Tecnologías de la Información, Servicio Murciano de Salud, 30100 Murcia, Spain
- Alonso J Lisón-Almagro
- Department of Orthopaedic Surgery and Traumatology, Hospital General Universitario Reina Sofía, 30003 Murcia, Spain
- Francisco Martínez-Martínez
- Department of Orthopaedic Surgery and Traumatology, Hospital Clínico Universitario Virgen de la Arrixaca, 30120 Murcia, Spain
- Fernando Santonja-Medina
- Department of Orthopaedic Surgery and Traumatology, Hospital Clínico Universitario Virgen de la Arrixaca, 30120 Murcia, Spain
- Department of Surgery, Pediatrics and Obstetrics & Gynecology, Faculty of Medicine, University of Murcia, 30120 Murcia, Spain
26
Farshad-Amacker NA, Kubik-Huch RA, Kolling C, Leo C, Goldhahn J. Learning how to perform ultrasound-guided interventions with and without augmented reality visualization: a randomized study. Eur Radiol 2023; 33:2927-2934. [PMID: 36350392 PMCID: PMC10017581 DOI: 10.1007/s00330-022-09220-5]
Abstract
OBJECTIVES Augmented reality (AR), which entails overlay of in situ images onto the anatomy, may be a promising technique for assisting image-guided interventions. The purpose of this study was to investigate and compare the learning experience and performance of untrained operators in puncture of soft tissue lesions when using AR ultrasound (AR US) compared with standard US (sUS). METHODS Forty-four medical students (28 women, 16 men) who had completed a basic US course, but had no experience with AR US, were asked to perform US-guided biopsies with both sUS and AR US, with a randomized selection of the initial modality. The experimental setup aimed to simulate biopsies of superficial soft tissue lesions, such as breast masses in clinical practice, by use of a turkey breast containing olives. Time to puncture (in seconds) and success (yes/no) of each biopsy were documented. All participants completed questionnaires about their coordinative skills and their experience during the training. RESULTS Despite having no experience with the AR technique, time to puncture did not differ significantly between AR US and sUS (median [range]: 17.0 s [6-60] and 14.5 s [5-41], p = 0.16), nor were there any gender-related differences (p = 0.22 and p = 0.50). AR US was considered by 79.5% of the operators to be the more enjoyable means of learning and performing US-guided biopsies. Further, a more favorable learning curve was achieved using AR US. CONCLUSIONS Students considered AR US to be the preferable and more enjoyable modality for learning how to obtain soft tissue biopsies; however, they did not perform the biopsies faster than when using sUS. KEY POINTS • Performance of standard and augmented reality US-guided biopsies was comparable. • A more favorable learning curve was achieved using augmented reality US. • Augmented reality US was the preferred technique and was considered more enjoyable.
Affiliation(s)
- Nadja A Farshad-Amacker
- Radiology, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
- Rahel A Kubik-Huch
- Institute of Radiology, Department of Medical Services, Kantonsspital Baden, Baden, Switzerland
- Christoph Kolling
- Institute of Translational Medicine, Department of Health Sciences and Technology, Eidgenössische Technische Hochschule (ETH), Zurich, Switzerland
- Cornelia Leo
- Department of Gynaecology and Obstetrics, Kantonsspital Baden, Baden, Switzerland
- Jörg Goldhahn
- Institute of Translational Medicine, Department of Health Sciences and Technology, Eidgenössische Technische Hochschule (ETH), Zurich, Switzerland
27
Seibold M, Spirig JM, Esfandiari H, Farshad M, Fürnstahl P. Translation of Medical AR Research into Clinical Practice. J Imaging 2023; 9:44. [PMID: 36826963 PMCID: PMC9961816 DOI: 10.3390/jimaging9020044]
Abstract
Translational research is aimed at turning discoveries from basic science into results that advance patient treatment. The translation of technical solutions into clinical use is a complex, iterative process that involves different stages of design, development, and validation, such as the identification of unmet clinical needs, technical conception, development, verification and validation, regulatory matters, and ethics. For this reason, many promising technical developments at the interface of technology, informatics, and medicine remain research prototypes without finding their way into clinical practice. Augmented reality is a technology that is now making its breakthrough into patient care, even though it has been available for decades. In this work, we explain the translational process for medical AR devices and present associated challenges and opportunities. To the best of the authors' knowledge, this concept paper is the first to present a guideline for the translation of medical AR research into clinical practice.
Affiliation(s)
- Matthias Seibold
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, CH-8008 Zurich, Switzerland
- Computer Aided Medical Procedures and Augmented Reality, Technical University Munich, DE-85748 Garching, Germany
- Correspondence:
- José Miguel Spirig
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, CH-8008 Zurich, Switzerland
- Hooman Esfandiari
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, CH-8008 Zurich, Switzerland
- Mazda Farshad
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, CH-8008 Zurich, Switzerland
- Philipp Fürnstahl
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, CH-8008 Zurich, Switzerland
28
Next step trauma and orthopaedic surgery: integration of augmented reality for reduction and nail implantation of tibial fractures. Int Orthop 2023; 47:495-501. [PMID: 36378324 PMCID: PMC9877081 DOI: 10.1007/s00264-022-05619-3]
Abstract
INTRODUCTION There is a tremendous scope of hardware and software development going on in augmented reality (AR), including in trauma and orthopaedic surgery. However, there are only a few systems available for intra-operative 3D imaging and guidance, and most of them rely on peri- and intra-operative X-ray imaging. Especially in complex situations such as pelvic surgery or multifragmentary multilevel fractures, intra-operative 3D imaging and implant tracking systems have proven to be of great advantage for the outcome of the surgery and can help reduce X-ray exposure, at least for the surgical team (Ochs et al., Injury 41:1297-1305, 2010). Yet, the current systems do not provide the ability to have a dynamic live view from the perspective of the surgeon. Our study describes a prototype AR-based system for live tracking which does not rely on X-rays. MATERIALS AND METHODS A prototype live-view intra-operative guidance system using an AR head-mounted device (HMD) was developed and tested on the implantation of a medullary nail in a tibia fracture model. Software algorithms that allow live view and tracking of the implant, fracture fragments and soft tissue without the intra-operative use of X-rays were derived. RESULTS The implantation of a medullary tibial nail is possible while relying only on AR guidance and live view, without the intra-operative use of X-rays. CONCLUSIONS The current paper describes a feasibility study of a prototype intra-operative dynamic live tracking and imaging system that does not require intra-operative use of X-rays and dynamically adjusts to the surgeon's perspective via an AR HMD. To our knowledge, the current literature does not describe any similar systems. This could be the next step in surgical imaging and education and a promising way to improve patient care.
29
Bennett KM, Griffith A, Sasanelli F, Park I, Talbot S. Augmented Reality Navigation Can Achieve Accurate Coronal Component Alignment During Total Knee Arthroplasty. Cureus 2023; 15:e34607. [PMID: 36883097 PMCID: PMC9985958 DOI: 10.7759/cureus.34607]
Abstract
Background Computer-navigated knee arthroplasty has been shown to improve accuracy over conventional instruments. The next generation of computer assistance is being developed using augmented reality. The accuracy of augmented reality navigation has not been established. Methods From April 2021 to October 2021, a prospective, consecutive series of 20 patients underwent total knee arthroplasty utilising an augmented reality-assisted navigation system (ARAN). The coronal and sagittal alignment of the femoral and tibial bone cuts was measured using the ARAN, and the final position of the components was measured on postoperative CT scans. The absolute difference between the measurements was recorded to determine the accuracy of the ARAN. Results Two cases were excluded due to segmentation errors, leaving 18 cases for analysis. The ARAN produced a mean absolute error of 1.4°, 2.0°, 1.1° and 1.6° for the femoral coronal, femoral sagittal, tibial coronal and tibial sagittal alignments, respectively. No outliers (absolute error of >3°) were identified in femoral coronal or tibial coronal alignment measurements. Three outliers were identified in tibial sagittal alignment, with all cases demonstrating less tibial slope (by 3.1°, 3.3° and 4°). Five outliers were identified in femoral sagittal alignment, and in all cases the component was more extended (3.1°, 3.2°, 3.2°, 3.4° and 3.9°). The mean operative time significantly decreased from the first nine augmented reality cases to the final nine cases by 11 minutes (p<0.05). There was no difference in accuracy between the early and late ARAN cases. Conclusion Augmented reality navigation can achieve accurate alignment of total knee arthroplasty with a low rate of component malposition in the coronal plane. Acceptable and consistent accuracy can be achieved from the initial adoption of this technique; however, some sagittal outliers were identified and there is a clear learning curve with respect to operating time.
The level of evidence was IV.
Affiliation(s)
- Kyle M Bennett
- Department of Orthopaedic Surgery, Western Health, Melbourne, AUS
- Andrew Griffith
- Department of Orthopaedic Surgery, Western Health, Melbourne, AUS
- Isaac Park
- Department of Orthopaedic Surgery, Melbourne Health, Melbourne, AUS
- Simon Talbot
- Department of Orthopaedic Surgery, Western Health, Melbourne, AUS
30
Jun EK, Lim S, Seo J, Lee KH, Lee JH, Lee D, Koh JC. Augmented Reality-Assisted Navigation System for Transforaminal Epidural Injection. J Pain Res 2023; 16:921-931. [PMID: 36960464 PMCID: PMC10029754 DOI: 10.2147/jpr.s400955]
Abstract
Purpose Multiple studies have attempted to demonstrate the benefits of augmented reality (AR)-assisted navigation systems in surgery. Lumbosacral transforaminal epidural injection is an effective treatment commonly used in patients with radiculopathy due to spinal degenerative pathologies. However, few studies have applied AR-assisted navigation systems to this procedure. The study aimed to investigate the safety and effectiveness of an AR-assisted navigation system for transforaminal epidural injection. Patients and Methods Through a real-time tracking system and a wireless network connection to the head-mounted display, computed tomography images of the spine and the path of a spinal needle to the target were visualized on a torso phantom with simulated respiratory movement. From L1/L2 to L5/S1, needle insertions were performed using the AR-assisted system on the left side of the phantom, and the conventional method was used on the right side. Results The procedure duration was approximately three times shorter, and the number of radiographs required was reduced, in the experimental group compared to the control group. The distance from the needle tips to the planned target areas showed no significant difference between the two groups (AR group 1.7 ± 2.3 mm, control group 3.2 ± 2.8 mm, P = 0.067). Conclusion An AR-assisted navigation system may be used to reduce the time required for spinal interventions and ensure the safety of patients and physicians in view of radiation exposure. Further studies are essential to apply AR-assisted navigation systems to spine interventions.
Affiliation(s)
- Eun Kyung Jun
- Department of Anesthesiology and Pain Medicine, Korea University Anam Hospital, Seoul, Korea
- Sunghwan Lim
- Center for Healthcare Robotics, Artificial Intelligence and Robotics Institute, Korea Institute of Science and Technology, Seoul, Korea
- Joonho Seo
- Department of Medical Assistant Robot, Korea Institute of Machinery and Materials, Daegu, Korea
- Kae Hong Lee
- Department of Anesthesiology and Pain Medicine, Korea University Anam Hospital, Seoul, Korea
- Jae Hee Lee
- Department of Anesthesiology and Pain Medicine, Korea University Anam Hospital, Seoul, Korea
- Deukhee Lee
- Center for Healthcare Robotics, Artificial Intelligence and Robotics Institute, Korea Institute of Science and Technology, Seoul, Korea
- Correspondence: Deukhee Lee, Center for Bionics, Korea Institute of Science and Technology, Hwarangno 14-gil 5, Seongbuk-gu, Seoul, 136-791, Republic of Korea, Tel +82-2-958-5633, Fax +82-2-920-2275, Email
- Jae Chul Koh
- Department of Anesthesiology and Pain Medicine, Korea University Anam Hospital, Seoul, Korea
- Jae Chul Koh, Department of Anesthesiology and Pain Medicine, Korea University Anam Hospital, 73, Goryeodae-ro, Seongbukgu, Seoul, 02841, Korea, Tel +82-2-920-5632, Fax +82-2-920-2275, Email
31
Hosoi I, Matsumoto T, Chang SH, An Q, Sakuma I, Kobayashi E. Development of Intraoperative Plantar Pressure Measurement System Considering Weight Bearing Axis and Center of Pressure. Journal of Robotics and Mechatronics 2022. [DOI: 10.20965/jrm.2022.p1318]
Abstract
To prevent postoperative complications in corrective surgery for foot deformities such as hallux valgus and pes planus, it is critical to quantitatively predict the postoperative standing-position plantar pressure distribution during the operation. The authors have previously proposed an intraoperative plantar pressure measurement system (IPPM) that allows for the measurement of a supine patient's plantar pressure distribution equivalent to that in the standing position. This system consists of an IPPM device comprising a force plate and a pressure distribution sensor, an optical three-dimensional position measurement device, a navigation monitor, and a PC. The plantar pressure distribution in the standing position is reproduced by navigating the operator as he or she presses the IPPM device against the patient's sole, so that the weight-bearing axis (floor reaction force vector) and femoral head center are as close to each other as possible. However, in our previous study, the reproducibility of the standing-position plantar pressure distribution was insufficient. Therefore, in the present study, we add a navigation function that brings the center of pressure under measurement closer to that of the standing position, and we correct for the IPPM's self-weight in the measured force. The improved device was used in an experiment with nine healthy subjects, and the similarity of the plantar pressure distribution in the standing and supine positions was evaluated using normalized cross-correlation, yielding an average of 0.90. Furthermore, in an evaluation experiment with ten orthopedic surgeons, it was observed that using the system reproduced the plantar pressure distribution significantly better than not using it. These results indicate that the present system can predict the plantar pressure distribution in the standing position. We believe that this system can contribute to reducing complications after foot surgery.
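An illustrative note on the similarity metric reported above (our own minimal sketch, not the authors' code; the abstract does not specify the exact normalization, so a zero-mean variant is assumed): normalized cross-correlation compares two pressure maps independently of overall sensor gain and offset.

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equally sized pressure maps.

    Returns 1.0 for identical distributions (up to positive gain and offset),
    and values near 0 for unrelated distributions.
    """
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom)

# Hypothetical 2x3 pressure maps: same spatial distribution, different gain/offset.
standing = np.array([[0.0, 2.0, 4.0],
                     [1.0, 3.0, 5.0]])
supine = 0.5 * standing + 0.2
print(round(ncc(standing, supine), 3))  # 1.0 — invariant to gain and offset
```

Under this definition, the study's reported average of 0.90 would indicate a close, though not perfect, match between the measured supine maps and the standing reference.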
32
Killeen BD, Winter J, Gu W, Martin-Gomez A, Taylor RH, Osgood G, Unberath M. Mixed Reality Interfaces for Achieving Desired Views with Robotic X-ray Systems. Comput Methods Biomech Biomed Eng Imaging Vis 2022; 11:1130-1135. [PMID: 37555199 PMCID: PMC10406465 DOI: 10.1080/21681163.2022.2154272]
Abstract
Robotic X-ray C-arm imaging systems can precisely achieve any position and orientation relative to the patient. Informing the system, however, of exactly what pose corresponds to a desired view is challenging. Currently, these systems are operated by the surgeon using joysticks, but this interaction paradigm is not necessarily effective because users may be unable to efficiently actuate more than a single axis of the system simultaneously. Moreover, novel robotic imaging systems, such as the Brainlab Loop-X, allow for independent source and detector movements, adding even more complexity. To address this challenge, we consider complementary interfaces for the surgeon to command robotic X-ray systems effectively. Specifically, we consider three interaction paradigms: (1) the use of a pointer to specify the principal ray of the desired view relative to the anatomy, (2) the same pointer, but combined with a mixed reality environment to synchronously render digitally reconstructed radiographs from the tool's pose, and (3) the same mixed reality environment but with a virtual X-ray source instead of the pointer. Initial human-in-the-loop evaluation with an attending trauma surgeon indicates that mixed reality interfaces for robotic X-ray system control are promising and may contribute to substantially reducing the number of X-ray images acquired solely during "fluoro hunting" for the desired view or standard plane.
Affiliation(s)
- Benjamin D Killeen
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Jonas Winter
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Wenhao Gu
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Alejandro Martin-Gomez
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Russell H Taylor
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Greg Osgood
- Department of Orthopaedic Surgery, Johns Hopkins Hospital, Baltimore, MD, USA
- Mathias Unberath
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
33
The Role of Augmented Reality in the Next Phase of Surgical Education. Plast Reconstr Surg Glob Open 2022; 10:e4656. [PMID: 36348749 PMCID: PMC9633082 DOI: 10.1097/gox.0000000000004656]
Abstract
Concomitant with the shift toward competency-based curricula, there has been increasing adoption of surgical simulation coupled with virtual, mixed, and augmented reality. These technologies have become more commonplace across multiple surgical disciplines, in domains such as preoperative planning, surgical education, and intraoperative navigation. However, there is a relative paucity of literature pertaining to the application of this technology to plastic surgery education. This review outlines the advantages of mixed and augmented reality in the pursuit of an ideal simulation environment, their benefits for the education of plastic surgery trainees, and their role in standardized assessments. In addition, we offer practical solutions to commonly encountered problems with this technology. Augmented reality has tremendous untapped potential in the next phase of plastic surgery education, and we outline steps toward broader implementation to enhance the learning environment for our trainees and to improve patient outcomes.
34
Mensel C, Gundtoft PH, Brink O. Preoperative templating in orthopaedic fracture surgery: The past, present and future. Injury 2022; 53 Suppl 3:S42-S46. [PMID: 36150912 DOI: 10.1016/j.injury.2022.09.005]
Abstract
Preoperative planning in orthopaedic fracture surgery serves the goal of establishing the best possible surgical result and ensuring a functioning limb for the patient. From placing sketches on overhead projector paper and measuring lengths from anatomical landmarks, methods of preoperative planning have evolved rapidly over the last 100 years. Today, preoperative planning includes methods such as advanced 3-dimensional (3D) printed models and software programs incorporating entire libraries of osteosynthesis materials that can be shaped and rotated to fit a patient's specific anatomy. Relevant literature was evaluated to review the development of preoperative templating from the past and present, in order to assess its impact on the future of osteosynthesis. We identified studies on 3D imaging, computer-assisted systems, and 3D-printed fractured bones and drill guides. The use of some of these systems resulted in a reduction in operation time, blood loss, perioperative fluoroscopy and hospital stay, as well as better placement of osteosynthesis material. Only a few studies have identified differences in patient morbidity and mortality. Future techniques of preoperative templating are on the rise and the potential is vast. The cost-effectiveness and usefulness of certain methods need to be evaluated further, but the benefit of preoperative templating has the potential of being revolutionary, with the possibility of radical advances within orthopaedic surgery.
|
35
|
Xie J, Chai JJK, O’Sullivan C, Xu JL. Trends of Augmented Reality for Agri-Food Applications. Sensors (Basel) 2022; 22:8333. [PMID: 36366030 PMCID: PMC9653656 DOI: 10.3390/s22218333]
Abstract
Recent years have witnessed an increasing interest in deploying state-of-the-art augmented reality (AR) head-mounted displays (HMDs) for agri-food applications. The benefits of AR HMDs to agri-food industry stakeholders (e.g., food suppliers, retail/food service) have received growing attention and recognition. AR HMDs enable users to make healthier dietary choices, experience novel changes in their perception of taste, enhance the cooking and food shopping experience, improve productivity at work and enhance the implementation of precision farming. Therefore, although development costs are still high, the case for integration of AR in food chains appears to be compelling. This review will present the most recent developments of AR HMDs for agri-food relevant applications. The summarized applications can be clustered into different themes: (1) dietary and food nutrition assessment; (2) food sensory science; (3) changing the eating environment; (4) retail food chain applications; (5) enhancing the cooking experience; (6) food-related training and learning; and (7) food production and precision farming. Limitations of current practices will be highlighted, along with some proposed applications.
|
36
|
Abstract
Augmented reality (AR) is an innovative system that enhances the real world by superimposing virtual objects on reality. The aim of this study was to analyze the application of AR in medicine and which of its technical solutions are the most used. We carried out a scoping review of the articles published between 2019 and February 2022. The initial search yielded a total of 2649 articles. After applying filters, removing duplicates and screening, we included 34 articles in our analysis. The analysis of the articles highlighted that AR has been traditionally and mainly used in orthopedics in addition to maxillofacial surgery and oncology. Regarding the display application in AR, the Microsoft HoloLens Optical Viewer is the most used method. Moreover, for the tracking and registration phases, the marker-based method with a rigid registration remains the most used system. Overall, the results of this study suggested that AR is an innovative technology with numerous advantages, finding applications in several new surgery domains. Considering the available data, it is not possible to clearly identify all the fields of application and the best technologies regarding AR.
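The marker-based rigid registration that this review identifies as the dominant tracking/registration approach reduces, at its core, to a point-set alignment problem: given corresponding fiducial marker positions in two coordinate frames (e.g., CT space and tracker space), find the rotation and translation that best map one onto the other. A minimal illustrative sketch — not taken from any of the reviewed systems — using the standard Kabsch/SVD solution:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (Kabsch/SVD): find R, t so dst ≈ src @ R.T + t."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)           # 3x3 cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so the result is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic check: four fiducial markers moved by a known pose.
rng = np.random.default_rng(0)
theta = np.radians(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([10.0, -5.0, 2.0])
markers_ct = rng.uniform(-50.0, 50.0, (4, 3))      # marker positions in CT space (mm)
markers_tracker = markers_ct @ R_true.T + t_true   # the same markers seen by the tracker
R, t = rigid_register(markers_ct, markers_tracker)
```

The residual marker distance after alignment (fiducial registration error) is what AR navigation studies typically report as registration accuracy.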
|
37
|
Chai JJ, O'Sullivan C, Gowen AA, Rooney B, Xu JL. Augmented/mixed reality technologies for food: A review. Trends Food Sci Technol 2022. [DOI: 10.1016/j.tifs.2022.04.021]
|
38
|
Bläsius F, Delbrück H, Hildebrand F, Hofmann UK. Surgical Treatment of Bone Sarcoma. Cancers (Basel) 2022; 14:2694. [PMID: 35681674 PMCID: PMC9179414 DOI: 10.3390/cancers14112694]
Abstract
Bone sarcomas are rare primary malignant mesenchymal bone tumors. The three main entities are osteosarcoma, chondrosarcoma, and Ewing sarcoma. While prognosis has improved for affected patients over the past decades, bone sarcomas are still critical conditions that require an interdisciplinary diagnostic and therapeutic approach. While radiotherapy plays a role especially in Ewing sarcoma, and chemotherapy in Ewing sarcoma and osteosarcoma, surgery remains the main pillar of treatment in all three entities. After complete tumor resection, the resulting bone defects need to be reconstructed. Possible strategies are implantation of allografts or autografts, including vascularized bone grafts (e.g., of the fibula). Around the knee joint, rotationplasty can be performed or, as an alternative, (expandable) megaprostheses can be implanted. Challenges still associated with the implantation of foreign materials are aseptic loosening and infection. Future improvements may come with advances in 3D printing of individualized resection blades/implants, securing safe tumor resection margins while at the same time shortening the required surgical time. Faster osseointegration and lower infection rates may be achieved through more elaborate implant surface structures.
|
39
|
Nikolaidis A. What is Significant in Modern Augmented Reality: A Systematic Analysis of Existing Reviews. J Imaging 2022; 8:145. [PMID: 35621909 PMCID: PMC9144923 DOI: 10.3390/jimaging8050145]
Abstract
Augmented reality (AR) is a field of technology that has evolved drastically during the last decades, due to its vast range of applications in everyday life. The aim of this paper is to provide researchers with an overview of what has been surveyed since 2010 in terms of AR application areas as well as technical aspects; to discuss the extent to which both have been covered; to examine whether useful evidence can be extracted on which aspects have not been covered adequately; and to consider whether common taxonomy criteria can be defined for performing AR reviews in the future. To this end, a search with inclusion and exclusion criteria was performed in the Scopus database, producing a representative set of 47 reviews covering the years from 2010 onwards. A proper taxonomy of the results is introduced, and the findings reveal, among others, a lack of AR application reviews covering all suggested criteria.
|
40
|
Augmented Reality in Arthroplasty: An Overview of Clinical Applications, Benefits, and Limitations. J Am Acad Orthop Surg 2022; 30:e760-e768. [PMID: 35245236 DOI: 10.5435/jaaos-d-21-00964]
Abstract
Augmented reality (AR) is a natural extension of computer-assisted surgery whereby a computer-generated image is superimposed on the surgeon's field of vision to assist in the planning and execution of the procedure. This emerging technology shows great potential in the field of arthroplasty, improving efficiency, limb alignment, and implant position. AR has shown the capacity to build on computer navigation systems while providing more elaborate information in a streamlined workflow to the user. This review investigates the current uses of AR in the field of arthroplasty and discusses outcomes, limitations, and potential future directions.
|
41
|
Augmented Reality: Mapping Methods and Tools for Enhancing the Human Role in Healthcare HMI. Applied Sciences (Basel) 2022. [DOI: 10.3390/app12094295]
Abstract
Background: Augmented Reality (AR) represents an innovative technology to improve data visualization and strengthen human perception. Among Human–Machine Interaction (HMI) domains, medicine can benefit most from the adoption of these digital technologies. In this perspective, the literature on orthopedic surgery techniques based on AR was evaluated, focusing on identifying the limitations and challenges of AR-based healthcare applications, to support the research and the development of further studies. Methods: Studies published from January 2018 to December 2021 were analyzed after a comprehensive search of the PubMed, Google Scholar, Scopus, IEEE Xplore, Science Direct, and Wiley Online Library databases. In order to improve the review reporting, the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines were used. Results: The authors selected sixty-two articles meeting the inclusion criteria, which were categorized according to the purpose of the study (intraoperative, training, rehabilitation) and according to the surgical procedure used. Conclusions: AR has the potential to improve orthopedic training and practice by providing an increasingly human-centered clinical approach. Further research can be addressed by this review to cover problems related to hardware limitations, lack of accurate registration and tracking systems, and absence of security protocols.
|
42
|
Kordon F, Maier A, Swartman B, Privalov M, El Barbari JS, Kunze H. Multi-Stage Platform for (Semi-)Automatic Planning in Reconstructive Orthopedic Surgery. J Imaging 2022; 8:108. [PMID: 35448235 PMCID: PMC9027971 DOI: 10.3390/jimaging8040108]
Abstract
Intricate lesions of the musculoskeletal system require reconstructive orthopedic surgery to restore the correct biomechanics. Careful pre-operative planning of the surgical steps on 2D image data is an essential tool to increase the precision and safety of these operations. However, the plan's effectiveness in the intra-operative workflow is challenged by unpredictable patient and device positioning and complex registration protocols. Here, we develop and analyze a multi-stage algorithm that combines deep learning-based anatomical feature detection and geometric post-processing to enable accurate pre- and intra-operative surgery planning on 2D X-ray images. The algorithm allows granular control over each element of the planning geometry, enabling real-time adjustments directly in the operating room (OR). In an evaluation of the method on three ligament reconstruction tasks on the knee joint, we found high spatial precision in drilling point localization (ε < 2.9 mm) and low angulation errors for k-wire instrumentation (ε < 0.75°) on 38 diagnostic radiographs. Comparable precision was demonstrated in 15 complex intra-operative trauma cases suffering from strong implant overlap and multi-anatomy exposure. Furthermore, we found that the diverse feature detection tasks can be efficiently solved with a multi-task network topology, improving precision over the single-task case. Our platform will help overcome the limitations of current clinical practice and foster surgical plan generation and adjustment directly in the OR, ultimately motivating the development of novel 2D planning guidelines.
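The angulation error reported above for k-wire instrumentation is the unsigned angle between the planned and the detected wire axes. A small illustrative sketch — the function name and the example values are hypothetical, not taken from the paper:

```python
import numpy as np

def angulation_error_deg(planned_axis, detected_axis):
    """Unsigned angle (degrees) between two wire axes; the absolute dot
    product treats each axis as an undirected line."""
    a = np.asarray(planned_axis, float)
    b = np.asarray(detected_axis, float)
    cos_angle = abs(np.dot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b))
    # clip guards against floating-point values marginally above 1.0
    return float(np.degrees(np.arccos(np.clip(cos_angle, 0.0, 1.0))))

# A wire tilted 0.5 degrees off the planned axis sits inside the
# reported epsilon < 0.75 degree bound.
tilt = np.radians(0.5)
err = angulation_error_deg([0.0, 0.0, 1.0], [0.0, np.sin(tilt), np.cos(tilt)])
```

Drilling point localization error, by contrast, is simply the Euclidean distance between the planned and detected entry points.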
|
43
|
Roberts S, Desai A, Checcucci E, Puliatti S, Taratkin M, Kowalewski KF, Gomez Rivas J, Rivero I, Veneziano D, Autorino R, Porpiglia F, Gill IS, Cacciamani GE. "Augmented reality" applications in urology: a systematic review. Minerva Urol Nephrol 2022; 74:528-537. [PMID: 35383432 DOI: 10.23736/s2724-6051.22.04726-7]
Abstract
INTRODUCTION Augmented reality (AR) applied to surgical procedures refers to the superimposition of preoperative or intra-operative images onto the operative field. Augmented reality has been increasingly used in myriad surgical specialties, including urology. The following study reviews advances in the use of AR for improvements in urologic outcomes. EVIDENCE ACQUISITION We identified all descriptive, validity, prospective randomized/nonrandomized trials and retrospective comparative/noncomparative studies on the use of AR in urology up until March 2021. The MEDLINE, Scopus, and Web of Science databases were used for the literature search. We conducted the study selection according to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. We limited included studies to only those using AR, excluding all that used virtual reality technology. EVIDENCE SYNTHESIS A total of 60 studies were identified and included in the present analysis. Overall, 19 studies were descriptive/validity/phantom studies for specific AR methodologies, 4 studies were case reports, and 37 studies included clinical prospective/retrospective comparative studies. CONCLUSIONS Advances in AR have led to increasing registration accuracy as well as increased ability to identify anatomic landmarks and improve outcomes during urologic procedures such as robot-assisted radical prostatectomy (RARP) and robot-assisted partial nephrectomy.
|
44
|
Alpaugh K, Ast MP, Haas SB. Immersive technologies for total knee arthroplasty surgical education. Arch Orthop Trauma Surg 2021; 141:2331-2335. [PMID: 34652513 DOI: 10.1007/s00402-021-04174-7]
Abstract
The need to adapt surgical curricula to meet the demands of an increasingly restrictive training environment is rising. Modern constraints on surgical trainees, including work-hour restrictions and concerns surrounding patient safety, have created an opportunity to supplement traditional teaching methods with developing immersive technologies, including virtual and augmented reality. Virtual reality (VR) and augmented reality (AR) have been preliminarily investigated as they relate to total joint arthroplasty. The purpose of this article is to discuss VR and AR as they apply to modern total knee replacement (TKR) surgical education.
|
45
|
Tsukada S, Ogawa H, Nishino M, Kurosaka K, Hirasawa N. Augmented Reality-Assisted Femoral Bone Resection in Total Knee Arthroplasty. JB JS Open Access 2021; 6:JBJSOA-D-21-00001. [PMID: 34316529 PMCID: PMC8301282 DOI: 10.2106/jbjs.oa.21.00001]
Abstract
An augmented reality (AR)-based navigation system allows visualization of the center of the femoral head and femoral mechanical axis superimposed on the surgical field during total knee arthroplasty (TKA) and may help surgeons to improve the accuracy of distal femoral resection.
|
46
|
Iacono V, Farinelli L, Natali S, Piovan G, Screpis D, Gigante A, Zorzi C. The use of augmented reality for limb and component alignment in total knee arthroplasty: systematic review of the literature and clinical pilot study. J Exp Orthop 2021; 8:52. [PMID: 34287721 PMCID: PMC8295423 DOI: 10.1186/s40634-021-00374-7]
Abstract
PURPOSE A systematic review of the literature was carried out to assess the current evidence for the use of augmented reality (AR) in total knee arthroplasty (TKA). We then conducted a pilot clinical study to examine the accuracy of the Knee + AR navigation system in performing TKA. The system allows the surgeon to view the tibial and femoral axes superimposed on the surgical field through smart glasses, providing real-time information and intraoperative feedback. METHODS A systematic review of the PubMed, MEDLINE, and Embase databases up to May 2021 using the keywords "augmented reality", "knee arthroplasty", "computer assisted surgery", and "navigation knee arthroplasty" was performed by two independent reviewers. We performed five TKAs using the Knee + system. Patients were 4 females, with a mean age of 76.4 years (range 73-79) and a mean Body Mass Index (BMI) of 31.9 kg/m2 (range 27-35). The axial alignment of the limb and the orientation of the components were evaluated on standardized pre- and postoperative full-leg-length weight-bearing radiographs, anteroposterior radiographs, and lateral radiographs of the knee. Tourniquet time was recorded. Motion sickness was assessed with the Virtual Reality Sickness Questionnaire (VRSQ) administered to the surgeon immediately after surgery. RESULTS After duplicate removal, a total of 31 abstracts were found; however, only two studies concerned knee arthroplasty, and both were preclinical. The Knee + system achieved a cutting error of less than 1° for the coronal alignment of the femur and tibia and less than 2° for femoral flexion/extension and posterior tibial slope. The absolute differences between the values obtained during surgery and the post-operative radiographic measurements of varus femur, varus tibia, posterior slope, and femoral flexion angle were 0.6° ± 1.34°, 0.8° ± 0.84°, 0.8° ± 1.79°, and 0.4 mm ± 0.55 mm, respectively. CONCLUSIONS In light of our preliminary results, the Knee + system is accurate and effective for performing TKA. Translation from pilot study to high-level prospective studies is warranted to assess accuracy and cost-effectiveness compared with conventional techniques. LEVEL OF EVIDENCE IV.
|
47
|
Hu X, Baena FRY, Cutolo F. Head-Mounted Augmented Reality Platform for Markerless Orthopaedic Navigation. IEEE J Biomed Health Inform 2021; 26:910-921. [PMID: 34115600 DOI: 10.1109/jbhi.2021.3088442]
Abstract
Visual augmented reality (AR) has the potential to improve the accuracy, efficiency and reproducibility of computer-assisted orthopaedic surgery (CAOS). AR head-mounted displays (HMDs) further allow non-eye-shift target observation and an egocentric view. Recently, a markerless tracking and registration (MTR) algorithm was proposed to avoid the artificial markers that are conventionally pinned into the target anatomy for tracking, as their use prolongs the surgical workflow, introduces human-induced errors, and necessitates additional surgical invasion of the patient. However, such an MTR-based method has neither been explored for surgical applications nor integrated into current AR HMDs, making ergonomic HMD-based markerless AR CAOS navigation hard to achieve. To these aims, we present a versatile, device-agnostic and accurate HMD-based AR platform. Our software platform, supporting both video see-through (VST) and optical see-through (OST) modes, integrates two proposed fast calibration procedures using a specially designed calibration tool. According to the camera-based evaluation, our AR platform achieves a display error of 6.31 ± 2.55 arcmin for VST and 7.72 ± 3.73 arcmin for OST. A proof-of-concept markerless surgical navigation system to assist in femoral bone drilling was then developed based on the platform and Microsoft HoloLens 1. According to the user study, both VST and OST markerless navigation systems are reliable, with the OST system providing the best usability. The measured navigation error is 4.90 ± 1.04 mm, 5.96° ± 2.22° for the VST system and 4.36 ± 0.80 mm, 5.65° ± 1.42° for the OST system.
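Display error in arcminutes, as reported above, relates a linear overlay offset to the viewing distance through simple trigonometry. A hedged sketch — the 500 mm working distance is an assumption for illustration, not a figure from the paper:

```python
import math

def display_error_arcmin(offset_mm, viewing_distance_mm):
    """Angular error in arcminutes for a virtual overlay displaced by
    offset_mm when viewed from viewing_distance_mm away."""
    return math.degrees(math.atan2(offset_mm, viewing_distance_mm)) * 60.0

# Inverting the relation: at an assumed ~500 mm working distance, the
# paper's 6.31 arcmin mean VST display error corresponds to an overlay
# offset of roughly 0.9 mm.
offset_mm = 500.0 * math.tan(math.radians(6.31 / 60.0))
```

Reporting the error angularly makes the figure independent of how far the surgeon stands from the target, which is why HMD calibration studies favor arcminutes over millimetres.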
|
48
|
Abstract
Augmented Reality (AR) is recognized worldwide as one of the leading technologies of the 21st century and one of the pillars of the new industrial revolution envisaged by the Industry 4.0 international program. Several papers describe, in detail, specific applications of Augmented Reality developed to test its potential in a variety of fields. However, there is a lack of sources detailing the current limits of this technology in the event of its introduction into a real working environment, where everyday tasks could be carried out by operators using an AR-based approach. A literature analysis to detect AR strengths and weaknesses has been carried out, and a set of case studies has been implemented by the authors to find the limits of current AR technologies in industrial applications outside the laboratory-protected environment. The outcome of this paper is that, even though Augmented Reality is a well-consolidated computer graphics technique in research applications, several improvements from both a software and a hardware point of view should be introduced before its adoption in industrial operations. The originality of this paper lies in the definition of guidelines to improve the potential of Augmented Reality in factories and industries.
|
49
|
Casari FA, Roner S, Fürnstahl P, Nagy L, Schweizer A. Computer-assisted open reduction internal fixation of intraarticular radius fractures navigated with patient-specific instrumentation, a prospective case series. Arch Orthop Trauma Surg 2021; 141:1425-1432. [PMID: 33715063 PMCID: PMC8295140 DOI: 10.1007/s00402-021-03856-6]
Abstract
BACKGROUND Intra-articular fractures are associated with posttraumatic arthritis if inappropriately treated. Exact reduction of joint congruency is the main factor in avoiding the development of arthrosis. The aim of this study was to evaluate the feasibility of computer-assisted surgical planning and 3D-printed patient-specific instrumentation (PSI) for the treatment of distal intraarticular radius fractures. METHOD 7 patients who suffered a distal intraarticular radius fracture were enrolled in this prospective case series. A preoperative CT scan was recorded, from which a 3D model was computed for surgical planning and design of PSI for surgical navigation. Postoperative accuracy and joint congruency were assessed. Patients were followed up 3, 6 and 12 months postoperatively. RESULTS Mean follow-up was 16 months. Overall range of motion was restored, and flexion, extension and pronation showed significant recovery, p < 0.05. The largest intraarticular joint step-off and gap were reduced from an average of 2.49 mm (± 1.04) to 0.8 mm (± 0.44), p < 0.05, and from 6.12 mm (± 1.04) to 2.21 mm (± 1.16), p < 0.05, respectively. Average grip strength recovered (3-16 months) from 20.33 kg (± 7.12) to 39.3 kg (± 19.55), p < 0.05, 100% of the healthy contralateral side. 3D accuracy was 2.07 mm (± 0.64) and 8.59° (± 2.9°) for guided fragments, and 2.33 mm (± 0.69) and 12.86° (± 7.13°), p > 0.05, for fragments reduced with ligamentotaxis. CONCLUSION Computer-assisted, PSI-navigated intraarticular radius fracture treatment is feasible, safe and accurate. The benefits of this method, however, do not outweigh the additional effort. LEVEL OF EVIDENCE IV.
|