1
Ye J, Chen Q, Zhong T, Liu J, Gao H. Is Overlain Display a Right Choice for AR Navigation? A Qualitative Study of Head-Mounted Augmented Reality Surgical Navigation on Accuracy for Large-Scale Clinical Deployment. CNS Neurosci Ther 2025; 31:e70217. PMID: 39817491; PMCID: PMC11736426; DOI: 10.1111/cns.70217. Received: 06/14/2024; Revised: 12/24/2024; Accepted: 01/03/2025. Open Access.
Abstract
BACKGROUND Over the past two decades, head-mounted augmented reality surgical navigation (HMARSN) systems have been employed in a growing number of surgical specialties, driven both by advances in augmented reality-related technologies and by surgeons' desire to overcome drawbacks inherent to conventional surgical navigation systems. Most current experimental HMARSN systems adopt an overlain display (OD) that overlays virtual models and planned trajectories of surgical tools on the corresponding physical tissues, organs, and lesions in the surgical field, giving surgeons an intuitive, direct view that improves hand-eye coordination and avoids attention shifts and loss of sight (LOS) during procedures, among other benefits. Yet system accuracy, the most crucial performance indicator of any surgical navigation system, is difficult to ascertain for OD because it is highly subjective and user-dependent. The aim of this study was therefore to review currently available experimental OD HMARSN systems qualitatively, explore how their system accuracy is affected by the overlain display, and determine whether such systems are suited to large-scale clinical deployment. METHOD We searched PubMed and ScienceDirect with the terms "head mounted augmented reality surgical navigation," which returned 445 records in total. After screening and eligibility assessment, 60 papers were analyzed. Specifically, we focused on how system accuracy was defined and measured, and on whether the reported accuracy is stable in clinical practice and competitive with corresponding commercially available systems. RESULTS AND CONCLUSIONS The primary finding is that the accuracy of OD HMARSN systems is seriously affected by the transformation between the space of the user's eyes and that of the surgical field, because measurement of this transformation is heavily individualized and user-dependent. Additionally, the transformation itself is potentially subject to change during surgical procedures, and hence unstable. OD HMARSN systems are therefore not suitable for large-scale clinical deployment.
Affiliation(s)
- Jian Ye
- Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
- Qingwen Chen
- Department of Neurosurgery, The First Affiliated Hospital of Guangdong Pharmaceutical University, Guangzhou, China
- Tao Zhong
- Department of Neurosurgery, The First Affiliated Hospital of Guangdong Pharmaceutical University, Guangzhou, China
- Jian Liu
- Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
- Han Gao
- Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
2
Jung H, Raythatha J, Moghadam A, Jin G, Mao J, Hsu J, Kim J. RibMR - A Mixed Reality Visualization System for Rib Fracture Localization in Surgical Stabilization of Rib Fractures: Phantom, Preclinical, and Clinical Studies. J Imaging Inform Med 2024:10.1007/s10278-024-01332-2. PMID: 39707113; DOI: 10.1007/s10278-024-01332-2. Received: 12/18/2023; Revised: 11/04/2024; Accepted: 11/04/2024.
Abstract
In surgical stabilization of rib fractures (SSRF), the current standard relies on preoperative CT imaging and often incorporates ultrasound (US) imaging. As an alternative, mixed reality (MR) technology holds promise for improving rib fracture localization. This study presents an MR-based visualization system designed for SSRF in a clinical setting. We developed RibMR, a visualization system using an MR head-mounted display that projects a patient-specific 3D hologram onto the patient. RibMR enables the localization of rib fractures in relation to the patient's anatomy. We conducted a phantom study using a human mannequin, a preclinical study with two healthy volunteers, and a clinical study with two patients to evaluate RibMR and compare it to US practice. RibMR localized rib fractures with an average accuracy of 0.38 ± 0.21 cm in the phantom, 3.75 ± 2.45 cm in the preclinical, and 1.47 ± 1.33 cm in the clinical studies. RibMR took an average time of 4.42 ± 0.98 min for the phantom, 8.03 ± 3.67 min for the preclinical, and 8.76 ± 0.65 min for the clinical studies. Compared to US, RibMR located more fractures, including fractures occluded by other structures, with higher accuracy, faster speed, and an improved localization rate. All participating surgeons provided positive feedback regarding accuracy, visualization quality, and usability. RibMR enabled accurate and time-efficient localization of rib fractures and performed better than US. RibMR is a promising alternative to US for localizing rib fractures in SSRF.
Affiliation(s)
- Hoijoon Jung
- Biomedical Data Analysis and Visualisation (BDAV) Lab, School of Computer Science, The University of Sydney, Camperdown, NSW, 2050, Australia
- Jineel Raythatha
- Trauma Service, Westmead Hospital, Westmead, NSW, 2145, Australia
- Department of Surgery, Faculty of Medicine and Health, The University of Sydney, Camperdown, NSW, 2050, Australia
- Alireza Moghadam
- Trauma Service, Westmead Hospital, Westmead, NSW, 2145, Australia
- Department of Surgery, Faculty of Medicine and Health, The University of Sydney, Camperdown, NSW, 2050, Australia
- Ge Jin
- Biomedical Data Analysis and Visualisation (BDAV) Lab, School of Computer Science, The University of Sydney, Camperdown, NSW, 2050, Australia
- Jiawei Mao
- Biomedical Data Analysis and Visualisation (BDAV) Lab, School of Computer Science, The University of Sydney, Camperdown, NSW, 2050, Australia
- Jeremy Hsu
- Trauma Service, Westmead Hospital, Westmead, NSW, 2145, Australia
- Department of Surgery, Faculty of Medicine and Health, The University of Sydney, Camperdown, NSW, 2050, Australia
- Jinman Kim
- Biomedical Data Analysis and Visualisation (BDAV) Lab, School of Computer Science, The University of Sydney, Camperdown, NSW, 2050, Australia
3
Rowan NJ. Digital technologies to unlock safe and sustainable opportunities for medical device and healthcare sectors with a focus on the combined use of digital twin and extended reality applications: A review. Sci Total Environ 2024; 926:171672. PMID: 38485014; DOI: 10.1016/j.scitotenv.2024.171672. Received: 02/14/2024; Revised: 03/09/2024; Accepted: 03/10/2024.
Abstract
Medical devices have increased in complexity, and there is a pressing need to consider design thinking and specialist training for manufacturers, healthcare and sterilization providers, and regulators. Appropriately addressing this need will positively inform end-to-end supply chain and logistics, production, processing, sterilization, safety, regulation, education, sustainability and circularity. There are significant opportunities to innovate and to develop appropriate digital tools to help unlock efficiencies in these important areas. This is the first paper to create an awareness of, and to define, the different digital technologies for informing and enabling medical device production from a holistic end-to-end life cycle perspective. It describes the added value of using digital innovations to meet emerging opportunities for many disposable and reusable medical devices. It addresses the value of accessing and using integrated multi-actor hubs that combine academia, industry, healthcare, regulators and society to help meet these opportunities, such as cost-effective access to specialist pilot facilities and expertise converging digital innovation, material science, biocompatibility, sterility assurance, business models and sustainability. It highlights the marked gap between academic R&D activities (a PRISMA review of publications between January 2010 and January 2024) and the actual list of the U.S. FDA's approved and marketed artificial intelligence/machine learning (AI/ML) and augmented reality/virtual reality (AR/VR) enabled medical devices for different healthcare applications. Bespoke examples of benefits underlying the future use of digital tools include the potential implementation of machine learning to support and enable parametric release of sterilized products through efficient monitoring of critical process data (complying with ISO 11135:2014), which would benefit stakeholders. This paper also focuses on the transformative potential of combining digital twin with extended reality innovations to inform efficiencies in medical device design thinking, supply chain and training, and thereby patient safety, circularity and sustainability.
Affiliation(s)
- Neil J Rowan
- Centre for Sustainable Disinfection and Sterilization, Technological University of the Shannon, Midlands Campus, Ireland; CURAM SFI Research Centre for Medical Devices, University of Galway, Ireland.
4
Csernátony Z, Manó S, Szabó D, Soósné Horváth H, Kovács ÁÉ, Csámer L. Acetabular Revision with McMinn Cup: Development and Application of a Patient-Specific Targeting Device. Bioengineering (Basel) 2023; 10:1095. PMID: 37760197; PMCID: PMC10526046; DOI: 10.3390/bioengineering10091095. Received: 07/26/2023; Revised: 09/11/2023; Accepted: 09/13/2023. Open Access.
Abstract
BACKGROUND Surgery for severe periacetabular bone defects (Paprosky ≥ 2B) is a major challenge in current practice. Although solutions are available for this serious clinical problem, they all have disadvantages as well as advantages. An alternative method of reconstructing such extensive defects is the use of a cup with a stem. Because the instrumentation offered is typically designed for scenarios without a significant bone defect, we developed a unique technique for implantation in cases where reference points are missing. Our hypothesis was that a targeting device designed from a CT scan of the patient's pelvis could facilitate safe insertion of the guiding wire. METHODS Briefly, our surgical solution consists of a two-step operation. If periacetabular bone loss was found to be significant during revision surgery, all implants were removed and two titanium marker screws were inserted percutaneously into the anterior iliac crest. Next, a CT scan of the pelvis was performed applying the metal artifact removal (MAR) algorithm. Based on the scan, the dimensions and position of the cup to be inserted were determined, and a patient-specific, 3D-printed targeting device made of biocompatible material was created to safely insert the guidewire that is essential to the implantation process. RESULTS We report the medical, engineering, and technical tasks related to the design, the surgical technique, and experience from 17 surgical cases between February 2018 and July 2021. There were no surgical complications in any case. The implant had to be removed for septic reasons (independent of the technique) in a single case, consistent with septic statistics for this type of surgery. There was no perforation of the linea terminalis of the pelvis attributable to the guiding method. Wound healing was uneventful in all patients, and the implants were fixed securely. Following rehabilitation, the joints were able to bear weight again. After one to four years of follow-up, patient satisfaction was high, and gait function improved considerably in all cases. CONCLUSIONS Our results show that CT-based virtual surgical planning, and the use of a patient-specific, 3D-printed aiming device based on it, is a reliable method for major hip surgeries with significant bone loss. This technique has also made it possible to perform these operations with minimal X-ray exposure.
Affiliation(s)
- Zoltán Csernátony
- Department of Orthopaedics and Traumatology, Faculty of Medicine, University of Debrecen, H-4032 Debrecen, Hungary
- Laboratory of Biomechanics, Department of Orthopaedics and Traumatology, Faculty of Medicine, University of Debrecen, H-4032 Debrecen, Hungary
- Sándor Manó
- Laboratory of Biomechanics, Department of Orthopaedics and Traumatology, Faculty of Medicine, University of Debrecen, H-4032 Debrecen, Hungary
- Dániel Szabó
- Department of Orthopaedics and Traumatology, Faculty of Medicine, University of Debrecen, H-4032 Debrecen, Hungary
- Hajnalka Soósné Horváth
- Laboratory of Biomechanics, Department of Orthopaedics and Traumatology, Faculty of Medicine, University of Debrecen, H-4032 Debrecen, Hungary
- Ágnes Éva Kovács
- Laboratory of Biomechanics, Department of Orthopaedics and Traumatology, Faculty of Medicine, University of Debrecen, H-4032 Debrecen, Hungary
- Loránd Csámer
- Laboratory of Biomechanics, Department of Orthopaedics and Traumatology, Faculty of Medicine, University of Debrecen, H-4032 Debrecen, Hungary
5
Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023; 85:102757. PMID: 36706637; DOI: 10.1016/j.media.2023.102757. Received: 08/22/2022; Revised: 01/05/2023; Accepted: 01/18/2023.
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main driver of the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the use of the first-generation HoloLens within the medical domain, from its release in March 2016 until 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy covering use case, technical methodology for registration and tracking, data sources, visualization, and validation and evaluation, and we analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, where AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interaction, usability and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, yet only a few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, to pave the way for novel, innovative directions and translation into the medical routine.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
6
Sugahara K, Koyachi M, Tachizawa K, Iwasaki A, Matsunaga S, Odaka K, Sugimoto M, Abe S, Nishii Y, Katakura A. Using mixed reality and CAD/CAM technology for treatment of maxillary non-union after Le Fort I osteotomy: a case description. Quant Imaging Med Surg 2023; 13:1190-1199. PMID: 36819286; PMCID: PMC9929389; DOI: 10.21037/qims-22-414. Received: 04/25/2022; Accepted: 11/11/2022.
Affiliation(s)
- Keisuke Sugahara
- Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan; Oral Health Science Center, Tokyo Dental College, Tokyo, Japan
- Masahide Koyachi
- Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan
- Kotaro Tachizawa
- Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan
- Akira Iwasaki
- Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan
- Satoru Matsunaga
- Oral Health Science Center, Tokyo Dental College, Tokyo, Japan; Department of Anatomy, Tokyo Dental College, Tokyo, Japan
- Kento Odaka
- Department of Oral and Maxillofacial Radiology, Tokyo Dental College, Tokyo, Japan
- Maki Sugimoto
- Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan; Innovation Lab, Teikyo University Okinaga Research Institute, Tokyo, Japan
- Shinichi Abe
- Department of Anatomy, Tokyo Dental College, Tokyo, Japan
- Yasushi Nishii
- Department of Orthodontics, Tokyo Dental College, Tokyo, Japan
- Akira Katakura
- Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan; Oral Health Science Center, Tokyo Dental College, Tokyo, Japan
7
Chegini S, Edwards E, McGurk M, Clarkson M, Schilling C. Systematic review of techniques used to validate the registration of augmented-reality images using a head-mounted device to navigate surgery. Br J Oral Maxillofac Surg 2023; 61:19-27. PMID: 36513525; DOI: 10.1016/j.bjoms.2022.08.007. Received: 12/13/2021; Revised: 07/31/2022; Accepted: 08/17/2022.
Abstract
Augmented-reality (AR) head-mounted devices (HMDs) allow the wearer to have digital images superimposed on their field of vision. They are being used to superimpose annotations on the surgical field, akin to a navigation system. This review examines published validation studies of HMD-AR systems, their reported protocols, and their outcomes. The aim was to establish commonalities and an acceptable registration outcome. Multiple databases were systematically searched for relevant articles published between January 2015 and January 2021. Studies that examined the registration of AR content using an HMD to guide surgery were eligible for inclusion. The country of origin, year of publication, medical specialty, HMD device, software, and method of registration were recorded, and a meta-analysis of the mean registration error was conducted. A total of 4784 papers were identified, of which 23 met the inclusion criteria. They included studies using the HoloLens (Microsoft) (n = 22) and the nVisor ST60 (NVIS Inc) (n = 1). Sixty-six per cent of studies were in hard-tissue specialties. Eleven studies reported registration errors using pattern markers (mean (SD) 2.6 (1.8) mm), four reported registration errors using surface markers (mean (SD) 3.8 (3.7) mm), and three reported registration errors using manual alignment (mean (SD) 2.2 (1.3) mm). The majority of studies in this review used in-house software with a variety of registration methods and reported errors. The mean registration error calculated in this study can be considered a minimum acceptable standard and should be taken into consideration when procedural applications are selected.
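Combining the per-study registration errors above into a single mean (SD) can be done with the standard pooled-variance formula; the sketch below is one plausible way to do it, and the study sizes and error values are illustrative placeholders, not figures from the reviewed papers:

```python
import math

def pooled_mean_sd(groups):
    """Pool per-study (n, mean, sd) triples into an overall mean and SD.

    Pooled variance = (within-group sum of squares + between-group
    sum of squares) / (N - 1), where N is the total sample size.
    """
    n_total = sum(n for n, _, _ in groups)
    grand_mean = sum(n * m for n, m, _ in groups) / n_total
    ss_within = sum((n - 1) * sd ** 2 for n, _, sd in groups)
    ss_between = sum(n * (m - grand_mean) ** 2 for n, m, _ in groups)
    pooled_sd = math.sqrt((ss_within + ss_between) / (n_total - 1))
    return grand_mean, pooled_sd

# Hypothetical per-study registration errors: (n measurements, mean mm, SD mm).
studies = [(10, 2.1, 0.9), (15, 3.0, 1.6), (12, 2.6, 1.2)]
mean_mm, sd_mm = pooled_mean_sd(studies)
print(f"pooled registration error: {mean_mm:.2f} ({sd_mm:.2f}) mm")
```

A single study pools to its own summary, which is a quick sanity check on the formula.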
Affiliation(s)
- Soudeh Chegini
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Eddie Edwards
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Mark McGurk
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Matthew Clarkson
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Clare Schilling
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
8
Meulstee J, Bussink T, Delye H, Xi T, Borstlap W, Maal T. Surgical guides versus augmented reality to transfer a virtual surgical plan for open cranial vault reconstruction: A pilot study. Adv Oral Maxillofac Surg 2022. DOI: 10.1016/j.adoms.2022.100334. Open Access.
9
von Haxthausen F, Moreta-Martinez R, Pose Díez de la Lastra A, Pascau J, Ernst F. UltrARsound: in situ visualization of live ultrasound images using HoloLens 2. Int J Comput Assist Radiol Surg 2022; 17:2081-2091. PMID: 35776399; PMCID: PMC9515035; DOI: 10.1007/s11548-022-02695-z. Received: 01/13/2022; Accepted: 05/31/2022.
Abstract
Purpose: Augmented Reality (AR) has the potential to simplify ultrasound (US) examinations, which usually require a skilled and experienced sonographer to mentally align narrow 2D cross-sectional US images with the 3D anatomy of the patient. This work describes and evaluates a novel approach to tracking retroreflective spheres attached to the US probe using an inside-out technique with the AR glasses HoloLens 2. Finally, live US images are displayed in situ on the imaged anatomy. Methods: The Unity application UltrARsound performs spatial tracking of the US probe and attached retroreflective markers using the depth camera integrated into the AR glasses, thus eliminating the need for an external tracking system. Additionally, a Kalman filter is implemented to improve the noisy measurements of the camera. US images are streamed wirelessly via the PLUS toolkit to HoloLens 2. The technical evaluation comprises static and dynamic tracking accuracy, and the frequency and latency of displayed images. Results: Tracking is performed with a median accuracy of 1.98 mm/1.81° in the static setting when using the Kalman filter. In the dynamic scenario, the median error was 2.81 mm/1.70°. The tracking frequency is currently limited to 20 Hz, and 83% of the displayed US images had a latency lower than 16 ms. Conclusions: In this work, we showed that spatial tracking of retroreflective spheres with the depth camera of HoloLens 2 is feasible, achieving a promising accuracy for in situ visualization of live US images. Tracking requires no additional hardware or modifications to HoloLens 2, making it a cheap and easy-to-use approach. Moreover, the minimal latency of displayed images enables real-time perception for the sonographer.
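The Kalman filtering step mentioned in the Methods can be illustrated with a minimal one-dimensional random-walk filter applied per coordinate of each tracked sphere; the noise parameters and position readings below are illustrative assumptions, not the authors' values:

```python
class ScalarKalman:
    """Minimal 1-D Kalman filter with a random-walk state model.

    State model:  x_k = x_{k-1} + w,  w ~ N(0, q)  (process noise)
    Measurement:  z_k = x_k + v,      v ~ N(0, r)  (measurement noise)
    """

    def __init__(self, q=0.01, r=4.0, x0=0.0, p0=1.0):
        self.q, self.r, self.x, self.p = q, r, x0, p0

    def update(self, z):
        # Predict: state estimate unchanged, uncertainty grows by process noise.
        self.p += self.q
        # Correct: blend prediction and measurement via the Kalman gain.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= 1.0 - k
        return self.x

# Smooth a noisy stream of marker positions along one axis (mm); made-up data.
kf = ScalarKalman(x0=50.0)
readings = [51.8, 48.9, 50.7, 49.4, 50.2]
smoothed = [kf.update(z) for z in readings]
```

In a 3D tracking setting, one such filter per axis (or a vector-state filter) would be run for each retroreflective sphere, trading a little latency for reduced jitter.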
Affiliation(s)
- Felix von Haxthausen
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911, Leganés, Spain; Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562, Lübeck, Schleswig-Holstein, Germany
- Rafael Moreta-Martinez
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911, Leganés, Spain
- Alicia Pose Díez de la Lastra
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911, Leganés, Spain
- Javier Pascau
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911, Leganés, Spain
- Floris Ernst
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562, Lübeck, Schleswig-Holstein, Germany
10
Pose-Díez-de-la-Lastra A, Moreta-Martinez R, García-Sevilla M, García-Mato D, Calvo-Haro JA, Mediavilla-Santos L, Pérez-Mañanes R, von Haxthausen F, Pascau J. HoloLens 1 vs. HoloLens 2: Improvements in the New Model for Orthopedic Oncological Interventions. Sensors 2022; 22(13):4915. PMID: 35808407; PMCID: PMC9269857; DOI: 10.3390/s22134915. Received: 04/27/2022; Revised: 06/20/2022; Accepted: 06/27/2022.
Abstract
This work analyzed the use of Microsoft HoloLens 2 in orthopedic oncological surgeries and compared it to its predecessor (Microsoft HoloLens 1). Specifically, we developed two equivalent applications, one for each device, and evaluated the augmented reality (AR) projection accuracy in an experimental scenario using phantoms based on two patients. We achieved automatic registration between the virtual and real worlds using patient-specific surgical guides on each phantom. These contained a small adaptor for a 3D-printed AR marker whose characteristic patterns were easily recognized by both Microsoft HoloLens devices. The newer model improved the AR projection accuracy by almost 25%, and both devices yielded an RMSE below 3 mm. After ascertaining the second model's improvement in this respect, we went a step further with Microsoft HoloLens 2 and tested it during the surgical intervention of one of the patients. During this experience, we collected the surgeons' feedback in terms of comfort, usability, and ergonomics. Our goal was to estimate whether the improved technical features of the newer model facilitate its implementation in actual surgical scenarios. All of the results indicate that Microsoft HoloLens 2 is better in all the aspects affecting surgical interventions and support its use in future experiences.
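The RMSE quoted above is, in essence, a root-mean-square of 3D point-to-point distances between AR-projected landmarks and their ground-truth positions; a minimal sketch with hypothetical coordinates (the landmark values are made up for illustration):

```python
import math

def rmse_mm(projected, reference):
    """Root-mean-square error between paired 3-D points (same units, e.g. mm)."""
    assert len(projected) == len(reference), "point lists must be paired"
    # Squared Euclidean distance for each projected/reference pair.
    sq = [sum((a - b) ** 2 for a, b in zip(p, q))
          for p, q in zip(projected, reference)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical AR-projected landmark positions vs. their ground truth (mm).
proj = [(0.0, 0.0, 1.0), (10.2, 0.1, 0.0), (0.0, 9.8, 0.3)]
ref = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0)]
print(f"RMSE = {rmse_mm(proj, ref):.2f} mm")
```

In a study like this, the reference points would come from a tracked pointer or the CAD model of the phantom rather than being hand-typed.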
Affiliation(s)
- Alicia Pose-Díez-de-la-Lastra
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Rafael Moreta-Martinez
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Mónica García-Sevilla
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- David García-Mato
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- José Antonio Calvo-Haro
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, 28007 Madrid, Spain
- Departamento de Cirugía, Facultad de Medicina, Universidad Complutense de Madrid, 28040 Madrid, Spain
- Lydia Mediavilla-Santos
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, 28007 Madrid, Spain
- Departamento de Cirugía, Facultad de Medicina, Universidad Complutense de Madrid, 28040 Madrid, Spain
- Rubén Pérez-Mañanes
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, 28007 Madrid, Spain
- Departamento de Cirugía, Facultad de Medicina, Universidad Complutense de Madrid, 28040 Madrid, Spain
- Felix von Haxthausen
- Institute for Robotics and Cognitive Systems, University of Lübeck, 23562 Lübeck, Germany
- Javier Pascau
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Correspondence: ; Tel.: +34-91-624-8196
11
Cercenelli L, Babini F, Badiali G, Battaglia S, Tarsitano A, Marchetti C, Marcelli E. Augmented Reality to Assist Skin Paddle Harvesting in Osteomyocutaneous Fibular Flap Reconstructive Surgery: A Pilot Evaluation on a 3D-Printed Leg Phantom. Front Oncol 2022; 11:804748. PMID: 35071009; PMCID: PMC8770836; DOI: 10.3389/fonc.2021.804748. Received: 10/29/2021; Accepted: 12/10/2021. Open Access.
Abstract
Background Augmented Reality (AR) represents an evolution of navigation-assisted surgery, providing surgeons with a virtual aid contextually merged with the real surgical field. We recently reported a case series of AR-assisted fibular flap harvesting for mandibular reconstruction. However, the registration accuracy between the real and the virtual content needs to be systematically evaluated before this tool is widely promoted in clinical practice. In this paper, after describing the AR-based protocol implemented for both a tablet and the HoloLens 2 smart glasses, we evaluated in a first test session the registration accuracy achievable with the two display solutions, and in a second test session the success rate in executing the AR-guided skin paddle incision task on a 3D-printed leg phantom. Methods From a real computed tomography dataset, 3D virtual models of a human leg, including the fibula, arteries, and skin with the planned paddle profile for harvesting, were obtained. All virtual models were imported into Unity software to develop a marker-less AR application suitable for use via both the tablet and the HoloLens 2 headset. The registration accuracy of both solutions was verified on a 3D-printed leg phantom obtained from the virtual models, by repeatedly applying the tracking function and computing pose deviations between the AR-projected virtual skin paddle profile and the real one transferred to the phantom via a CAD/CAM cutting guide. The success rate in completing the AR-guided task of skin paddle harvesting was evaluated using CAD/CAM templates positioned on the phantom surface. Results On average, the marker-less AR protocol showed comparable registration errors (within 1-5 mm) for the tablet-based and HoloLens-based solutions. Registration accuracy appeared quite sensitive to ambient light conditions. We found a good success rate in completing the AR-guided task within an error margin of 4 mm (97% and 100% for tablet and HoloLens, respectively). All subjects reported greater usability and ergonomics for the HoloLens 2 solution. Conclusions Results revealed that the proposed marker-less AR protocol may guarantee a registration error within 1-5 mm for assisting skin paddle harvesting in the clinical setting. Optimal lighting conditions and further improvement of marker-less tracking technologies have the potential to increase the efficiency and precision of this AR-assisted reconstructive surgery.
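The pose-deviation measurement this abstract describes (comparing the AR-projected paddle profile against the real one on the phantom) can be sketched as a rigid point-set comparison. The sketch below is an illustrative assumption, not the authors' implementation: it uses the Kabsch algorithm to recover the best-fit rigid transform between corresponding sampled points and reports the residual translation and rotation.

```python
import numpy as np

def kabsch(P, Q):
    """Best-fit rigid transform (R, t) mapping point set P onto Q (Kabsch algorithm).
    P and Q are (N, 3) arrays of corresponding points."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                    # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

def pose_deviation(projected, reference):
    """Translation norm (mm) and rotation angle (deg) between the AR-projected
    profile and the real (reference) profile, from corresponding points."""
    R, t = kabsch(projected, reference)
    angle = np.degrees(np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0)))
    return np.linalg.norm(t), angle
```

Repeating this over several tracking attempts, as in the paper's first test session, would yield the distribution of registration errors.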
Affiliation(s)
- Laura Cercenelli
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
- Federico Babini
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
- Giovanni Badiali
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Salvatore Battaglia
- Maxillofacial Surgery Unit, Policlinico San Marco University Hospital, University of Catania, Catania, Italy
- Achille Tarsitano
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Claudio Marchetti
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Emanuela Marcelli
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
12
García-Sevilla M, Moreta-Martinez R, García-Mato D, Arenas de Frutos G, Ochandiano S, Navarro-Cuéllar C, Sanjuán de Moreta G, Pascau J. Surgical Navigation, Augmented Reality, and 3D Printing for Hard Palate Adenoid Cystic Carcinoma En-Bloc Resection: Case Report and Literature Review. Front Oncol 2022; 11:741191. [PMID: 35059309] [PMCID: PMC8763795] [DOI: 10.3389/fonc.2021.741191]
Abstract
Adenoid cystic carcinoma is a rare and aggressive tumor representing less than 1% of head and neck cancers. This malignancy often arises from the minor salivary glands, the palate being its most common location. Surgical en-bloc resection with clear margins is the primary treatment. However, this location presents a limited line of sight and a high risk of injuries, making the surgical procedure challenging. In this context, technologies such as intraoperative navigation can become an effective tool, reducing morbidity and improving the safety and accuracy of the procedure. Although their use is well established in fields such as neurosurgery, their application in maxillofacial surgery has not been widely demonstrated. One reason is the need to rigidly fixate a navigation reference to the patient, which often entails an invasive setup. In this work, we studied three alternative and less invasive setups using optical tracking, 3D printing, and augmented reality. We evaluated their precision in a patient-specific phantom, obtaining errors below 1 mm. The optimal setup was finally applied in a clinical case, where the navigation software was used to guide the tumor resection. Points were collected along the surgical margins after resection and compared with the real ones identified in the postoperative CT. Distances of less than 2 mm were obtained in 90% of the samples. Moreover, the navigation provided confidence to the surgeons, who could then undertake a less invasive and more conservative approach. The postoperative CT scans showed adequate resection margins and confirmed that the patient was free of disease after two years of follow-up.
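The margin validation step described here (navigated margin points compared against points identified in the postoperative CT, with 90% of distances under 2 mm) can be sketched as a nearest-neighbor distance check. The function name and the tolerance parameter below are illustrative assumptions, not the authors' code.

```python
import numpy as np

def margin_agreement(navigated_pts, ct_margin_pts, tol_mm=2.0):
    """For each navigated margin sample, the distance (mm) to the nearest
    CT-identified margin point, plus the fraction of samples within tol_mm.
    Both inputs are (N, 3) arrays in the same patient coordinate frame."""
    diff = navigated_pts[:, None, :] - ct_margin_pts[None, :, :]
    dists = np.sqrt((diff ** 2).sum(axis=2)).min(axis=1)   # nearest-neighbor distance
    return dists, float((dists <= tol_mm).mean())
```

A reported figure like "distances below 2 mm in 90% of samples" would correspond to the returned fraction being 0.9.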
Affiliation(s)
- Mónica García-Sevilla
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Rafael Moreta-Martinez
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- David García-Mato
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Gema Arenas de Frutos
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Santiago Ochandiano
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Carlos Navarro-Cuéllar
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Guillermo Sanjuán de Moreta
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Servicio de Otorrinolaringología, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Javier Pascau
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
13
García-Sevilla M, Moreta-Martinez R, García-Mato D, Pose-Diez-de-la-Lastra A, Pérez-Mañanes R, Calvo-Haro JA, Pascau J. Augmented Reality as a Tool to Guide PSI Placement in Pelvic Tumor Resections. Sensors 2021; 21:7824. [PMID: 34883825] [PMCID: PMC8659846] [DOI: 10.3390/s21237824]
Abstract
Patient-specific instruments (PSIs) have become a valuable tool for osteotomy guidance in complex surgical scenarios such as pelvic tumor resection. They provide similar accuracy to surgical navigation systems but are generally more convenient and faster. However, their correct placement can become challenging in some anatomical regions, and it cannot be verified objectively during the intervention. Incorrect installation can result in high deviations from the planned osteotomy, increasing the risk of positive resection margins. In this work, we propose using augmented reality (AR) to guide and verify PSI placement. We designed an experiment to assess the accuracy provided by the system using a smartphone and the HoloLens 2 and compared the results with the conventional freehand method. The results showed significant differences: AR guidance prevented high osteotomy deviations, reducing the maximal deviation from 54.03 mm with freehand placement to less than 5 mm with AR guidance. The experiment was performed on two versions of a plastic three-dimensional (3D) printed phantom, one including a silicone layer to simulate tissue and provide more realism. We also studied how differences in the shape and location of PSIs affect their accuracy, concluding that those with smaller sizes and a homogeneous target surface are more prone to errors. Our study presents promising results that demonstrate AR's potential to overcome the present limitations of PSIs conveniently and effectively.
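The "osteotomy deviation" metric reported above can be illustrated as a comparison between the planned cutting plane and the plane actually achieved by the (mis)placed PSI. This is a hedged sketch under the assumption that both cuts are characterized by sampled 3D points; the function names are hypothetical, not taken from the paper.

```python
import numpy as np

def plane_from_points(pts):
    """Least-squares plane through 3+ points; returns (unit normal, centroid)."""
    c = pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(pts - c)
    return Vt[-1], c          # normal = direction of least variance

def osteotomy_deviation(planned_pts, achieved_pts):
    """Angle (deg) between planned and achieved cut planes, and the maximum
    distance (mm) of achieved cut points from the planned plane."""
    n_p, c_p = plane_from_points(planned_pts)
    n_a, _ = plane_from_points(achieved_pts)
    ang = np.degrees(np.arccos(np.clip(abs(n_p @ n_a), 0.0, 1.0)))
    max_dev = np.max(np.abs((achieved_pts - c_p) @ n_p))
    return ang, max_dev
```

Under this formulation, a freehand placement error of 54.03 mm versus under 5 mm with AR guidance would correspond to the returned maximum distance.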
Affiliation(s)
- Mónica García-Sevilla
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Rafael Moreta-Martinez
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- David García-Mato
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Alicia Pose-Diez-de-la-Lastra
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Rubén Pérez-Mañanes
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, 28007 Madrid, Spain
- José Antonio Calvo-Haro
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, 28007 Madrid, Spain
- Javier Pascau
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Correspondence: ; Tel.: +34-91-624-8196
14
Complex Bone Tumors of the Trunk-The Role of 3D Printing and Navigation in Tumor Orthopedics: A Case Series and Review of the Literature. J Pers Med 2021; 11:517. [PMID: 34200075] [PMCID: PMC8228871] [DOI: 10.3390/jpm11060517]
Abstract
The combination of 3D printing and navigation promises improvements in surgical procedures and outcomes for complex bone tumor resection of the trunk, but its features have rarely been described in the literature. Five patients with trunk tumors were surgically treated in our institution using a combination of 3D printing and navigation. The main process includes segmentation, virtual modeling and build preparation, as well as quality assessment. Tumor resection was performed with navigated instruments. Preoperative planning supported clear-margin multiplanar resections with intraoperatively adaptable real-time visualization of navigated instruments. The follow-up ranged from 2 to 15 months with good functional results. The present results and the review of the current literature reflect the trend toward, and the diverse applications of, 3D printing in the medical field. 3D printing at hospital sites is often not standardized, and regulatory aspects may act as disincentives. However, 3D printing has an increasing impact on precision medicine, and we are convinced that our process represents a valuable contribution in the context of patient-centered individual care.
15
Sugahara K, Koyachi M, Koyama Y, Sugimoto M, Matsunaga S, Odaka K, Abe S, Katakura A. Mixed reality and three dimensional printed models for resection of maxillary tumor: a case report. Quant Imaging Med Surg 2021; 11:2187-2194. [PMID: 33936998] [DOI: 10.21037/qims-20-597]
Abstract
In the field of oral and maxillofacial surgery, many institutions have recently begun using three-dimensional printers to create three-dimensional models, and mixed reality, for a variety of diseases. Here, we report a case in which a physical model was produced with a three-dimensional printer from virtual surgical planning data, and a maxillary benign tumor was resected while its three-dimensional relationship to neighboring structures was visualized through an application designed for the Microsoft HoloLens, with mixed reality support during the procedure.
Affiliation(s)
- Keisuke Sugahara
- Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan
- Oral Health Science Center, Tokyo Dental College, Tokyo, Japan
- Masahide Koyachi
- Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan
- Yu Koyama
- Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan
- Maki Sugimoto
- Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan
- Okinaga Research Institute Innovation Lab, Teikyo University, Tokyo, Japan
- Satoru Matsunaga
- Oral Health Science Center, Tokyo Dental College, Tokyo, Japan
- Department of Anatomy, Tokyo Dental College, Tokyo, Japan
- Kento Odaka
- Department of Oral and Maxillofacial Radiology, Tokyo Dental College, Tokyo, Japan
- Shinichi Abe
- Department of Anatomy, Tokyo Dental College, Tokyo, Japan
- Akira Katakura
- Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan
- Oral Health Science Center, Tokyo Dental College, Tokyo, Japan
16
Point-of-care manufacturing: a single university hospital's initial experience. 3D Print Med 2021; 7:11. [PMID: 33890198] [PMCID: PMC8061881] [DOI: 10.1186/s41205-021-00101-z]
Abstract
Background The integration of 3D printing technology in hospitals is evolving toward production models such as point-of-care manufacturing. This study aims to present the results of the integration of 3D printing technology in a manufacturing university hospital. Methods Observational, descriptive, retrospective, and monocentric study of 907 instances of 3D printing from November 2015 to March 2020. Variables such as product type, utility, time, and manufacturing materials were analyzed. Results Orthopedic Surgery and Traumatology, Oral and Maxillofacial Surgery, and Gynecology and Obstetrics are the medical specialties that manufactured the largest number of processes. Working and printing times, as well as the amount of printing material, differ across product types and input data. The most common printing material was polylactic acid, although biocompatible resin was introduced to produce surgical guides. In addition, the hospital has worked on the co-design of custom-made implants with manufacturing companies and has also participated in tissue bio-printing projects. Conclusions The integration of 3D printing in a university hospital allows identifying the conceptual evolution toward “point-of-care manufacturing.”
17
Calvo-Haro JA, Pascau J, Mediavilla-Santos L, Sanz-Ruiz P, Sánchez-Pérez C, Vaquero-Martín J, Perez-Mañanes R. Conceptual evolution of 3D printing in orthopedic surgery and traumatology: from "do it yourself" to "point of care manufacturing". BMC Musculoskelet Disord 2021; 22:360. [PMID: 33863319] [PMCID: PMC8051827] [DOI: 10.1186/s12891-021-04224-6]
Abstract
BACKGROUND 3D printing technology in hospitals facilitates production models such as point-of-care manufacturing. Orthopedic Surgery and Traumatology is the specialty that can most benefit from the advantages of these tools. The purpose of this study is to present the results of the integration of 3D printing technology in a Department of Orthopedic Surgery and Traumatology and to identify the productive model of the point-of-care manufacturing as a paradigm of personalized medicine. METHODS Observational, descriptive, retrospective and monocentric study of a total of 623 additive manufacturing processes carried out in a Department of Orthopedic Surgery and Traumatology from November 2015 to March 2020. Variables such as product type, utility, time or materials for manufacture were analyzed. RESULTS The areas of expertise that have performed more processes are Traumatology, Reconstructive and Orthopedic Oncology. Pre-operative planning is their primary use. Working and 3D printing hours, as well as the amount of 3D printing material used, vary according to the type of product or material delivered to perform the process. The most commonly used 3D printing material for manufacturing is polylactic acid, although biocompatible resin has been used to produce surgical guides. In addition, the hospital has worked on the co-design of customized implants with manufacturing companies. CONCLUSIONS The integration of 3D printing in a Department of Orthopedic Surgery and Traumatology allows identifying the conceptual evolution from "Do-It-Yourself" to "POC manufacturing".
Affiliation(s)
- Jose Antonio Calvo-Haro
- Orthopaedic Surgery and Traumatology Department, Hospital General Universitario Gregorio Marañón, Calle Doctor Esquerdo 46, 28007 Madrid, Spain
- Advanced Planning and 3D Manufacturing Unit, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Faculty of Medicine, Department of Surgery, Universidad Complutense, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Javier Pascau
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain
- Lydia Mediavilla-Santos
- Orthopaedic Surgery and Traumatology Department, Hospital General Universitario Gregorio Marañón, Calle Doctor Esquerdo 46, 28007 Madrid, Spain
- Pablo Sanz-Ruiz
- Orthopaedic Surgery and Traumatology Department, Hospital General Universitario Gregorio Marañón, Calle Doctor Esquerdo 46, 28007 Madrid, Spain
- Faculty of Medicine, Department of Surgery, Universidad Complutense, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Coral Sánchez-Pérez
- Orthopaedic Surgery and Traumatology Department, Hospital General Universitario Gregorio Marañón, Calle Doctor Esquerdo 46, 28007 Madrid, Spain
- Javier Vaquero-Martín
- Orthopaedic Surgery and Traumatology Department, Hospital General Universitario Gregorio Marañón, Calle Doctor Esquerdo 46, 28007 Madrid, Spain
- Faculty of Medicine, Department of Surgery, Universidad Complutense, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Rubén Perez-Mañanes
- Orthopaedic Surgery and Traumatology Department, Hospital General Universitario Gregorio Marañón, Calle Doctor Esquerdo 46, 28007 Madrid, Spain
- Advanced Planning and 3D Manufacturing Unit, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Faculty of Medicine, Department of Surgery, Universidad Complutense, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
18
Andrés-Cano P, Calvo-Haro J, Fillat-Gomà F, Andrés-Cano I, Perez-Mañanes R. Role of the orthopaedic surgeon in 3D printing: current applications and legal issues for a personalized medicine. Rev Esp Cir Ortop Traumatol (Engl Ed) 2021. [DOI: 10.1016/j.recote.2021.01.001]
19
García-Sevilla M, Mediavilla-Santos L, Ruiz-Alba MT, Pérez-Mañanes R, Calvo-Haro JA, Pascau J. Patient-specific desktop 3D-printed guides for pelvic tumour resection surgery: a precision study on cadavers. Int J Comput Assist Radiol Surg 2021; 16:397-406. [PMID: 33616839] [DOI: 10.1007/s11548-021-02322-3]
Abstract
PURPOSE 3D-printed patient-specific instruments have become a useful tool to improve accuracy in pelvic tumour resections. However, their correct placement can be challenging in some regions due to the morphology of the bone, so it is essential to be aware of the possible placement errors in each region. In this study, we characterize these errors in common pelvic osteotomies. METHODS We conducted an experiment with 9 cadaveric specimens, for which we acquired a pre-operative computed tomography scan. Small PSIs were designed for each case following a realistic surgical approach for four regions of the pelvis: iliac crest (C), supra-acetabular (S), ischial (I), and pubic (P). Final surgical placement was based on a post-operative scan. The resulting positions were compared with pre-operative planning, obtaining translations, rotations, and maximum osteotomy deviations in a local reference frame defined based on the bone's morphology. RESULTS Mean translations and rotations in the direction of the osteotomy plane were as follows: C = 5.3 mm, 6.7°; S = 1.8 mm, 5.1°; I = 1.5 mm, 3.4°; P = 1.8 mm, 3.5°. Mean translations in the remaining axes were below 2 mm. Maximum osteotomy deviations (75% of cases) were below 11.8 mm in C (7.8 mm for half-length), 7.8 mm in S (5.5 mm for half-length), 5.5 mm in I, and 3.7 mm in P. CONCLUSION We have characterized placement errors for small PSIs in four regions of the pelvis. Our results show high errors in C and S PSIs in the direction of the resection plane's normal, and thus large osteotomy deviations. Deviations in short osteotomies in S, I and P and placement errors in the remaining directions were low. The PSIs used in this study are biocompatible and can be produced with a desktop 3D printer, thus minimizing manufacturing cost.
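The error decomposition this abstract reports (translations and rotations in a local reference frame defined from the bone's morphology) can be illustrated with homogeneous transforms. This is a hypothetical sketch, not the study's code: `T_planned` and `T_actual` are 4x4 planned/achieved PSI poses, and the columns of `R_local` are assumed to be the local axes (for example, the osteotomy-plane normal and two in-plane directions) expressed in the planned-PSI frame.

```python
import numpy as np

def local_frame_errors(T_planned, T_actual, R_local):
    """Express the planned-to-actual PSI displacement in a local bone frame.
    Returns per-axis translations and the total rotation angle (deg)."""
    delta = np.linalg.inv(T_planned) @ T_actual      # residual pose error
    t_local = R_local.T @ delta[:3, 3]               # translation along each local axis
    cos_a = (np.trace(delta[:3, :3]) - 1) / 2
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return t_local, angle
```

Reporting the component of `t_local` along the plane normal separately, as the study does per region, makes it clear which placement errors actually displace the osteotomy.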
Affiliation(s)
- Mónica García-Sevilla
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avenida de la Universidad 30, 28911 Leganés, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Lydia Mediavilla-Santos
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- María Teresa Ruiz-Alba
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avenida de la Universidad 30, 28911 Leganés, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Rubén Pérez-Mañanes
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- José Antonio Calvo-Haro
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Javier Pascau
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avenida de la Universidad 30, 28911 Leganés, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
20
Combining Augmented Reality and 3D Printing to Improve Surgical Workflows in Orthopedic Oncology: Smartphone Application and Clinical Evaluation. Sensors 2021; 21:1370. [PMID: 33672053] [PMCID: PMC7919470] [DOI: 10.3390/s21041370]
Abstract
During the last decade, orthopedic oncology has experienced the benefits of computerized medical imaging to reduce human dependency, improving accuracy and clinical outcomes. However, traditional surgical navigation systems do not always adapt properly to this kind of intervention. Augmented reality (AR) and three-dimensional (3D) printing are technologies recently introduced into the surgical environment with promising results. Here we present an innovative solution combining 3D printing and AR in orthopedic oncological surgery. A new surgical workflow is proposed, including 3D-printed models and a novel AR-based smartphone application (app). This app can display the patient's anatomy and the tumor's location. A 3D-printed reference marker, designed to fit in a unique position on the affected bone, enables automatic registration. The system has been evaluated in terms of visualization accuracy and usability during the whole surgical workflow. Experiments on six realistic phantoms provided a visualization error below 3 mm. The AR system was tested in two clinical cases during surgical planning, patient communication, and surgical intervention. These results and the positive feedback obtained from surgeons and patients suggest that the combination of AR and 3D printing can improve efficacy, accuracy, and patients' experience.
21
Teatini A, Kumar RP, Elle OJ, Wiig O. Mixed reality as a novel tool for diagnostic and surgical navigation in orthopaedics. Int J Comput Assist Radiol Surg 2021; 16:407-414. [PMID: 33555563] [PMCID: PMC7946663] [DOI: 10.1007/s11548-020-02302-z]
Abstract
Purpose This study presents a novel surgical navigation tool developed in a mixed reality environment for orthopaedic surgery. Joint and skeletal deformities affect all age groups and greatly reduce the range of motion of the joints. These deformities are notoriously difficult to diagnose and to correct through surgery. Method We have developed a surgical tool which integrates surgical instrument tracking and augmented reality through a head-mounted display. This allows the surgeon to visualise bones with the illusion of possessing “X-ray” vision. The studies presented aim to assess the accuracy of the surgical navigation tool in tracking a location at the tip of the surgical instrument in holographic space. Results Results show that the average accuracy provided by the navigation tool is around 8 mm, and qualitative assessment by the orthopaedic surgeons provided positive feedback in terms of its capabilities for diagnostic use. Conclusions More improvements are necessary for the navigation tool to be accurate enough for surgical applications; however, this new tool has the potential to improve diagnostic accuracy and allow for safer and more precise surgeries, as well as provide better learning conditions for orthopaedic surgeons in training.
Affiliation(s)
- Andrea Teatini
- The Intervention Centre, Oslo University Hospital, Oslo, Norway
- Department of Informatics, University of Oslo, Oslo, Norway
- Rahul P Kumar
- The Intervention Centre, Oslo University Hospital, Oslo, Norway
- Ole Jakob Elle
- The Intervention Centre, Oslo University Hospital, Oslo, Norway
- Department of Informatics, University of Oslo, Oslo, Norway
- Ola Wiig
- Department of Orthopaedic Surgery, Oslo University Hospital, Oslo, Norway
22
Fotouhi J, Mehrfard A, Song T, Johnson A, Osgood G, Unberath M, Armand M, Navab N. Development and Pre-Clinical Analysis of Spatiotemporal-Aware Augmented Reality in Orthopedic Interventions. IEEE Trans Med Imaging 2021; 40:765-778. [PMID: 33166252] [PMCID: PMC8317976] [DOI: 10.1109/tmi.2020.3037013]
Abstract
Suboptimal interaction with patient data and challenges in mastering 3D anatomy based on ill-posed 2D interventional images are essential concerns in image-guided therapies. Augmented reality (AR) has been introduced into operating rooms in the last decade; however, in image-guided interventions it has often been considered only as a visualization device improving traditional workflows. As a consequence, the technology has gained only a minimum of the maturity it requires to redefine new procedures, user interfaces, and interactions. The main contribution of this paper is to reveal how exemplary workflows are redefined by taking full advantage of head-mounted displays when entirely co-registered with the imaging system at all times. The system's awareness of the geometric and physical characteristics of X-ray imaging allows the exploration of different human-machine interfaces. Our system achieved an error of 4.76 ± 2.91 mm for placing a K-wire in a fracture management procedure, and yielded errors of 1.57 ± 1.16° and 1.46 ± 1.00° in the abduction and anteversion angles, respectively, for total hip arthroplasty (THA). We compared the results with the outcomes from baseline standard operative and non-immersive AR procedures, which had yielded errors of [4.61 mm, 4.76°, 4.77°] and [5.13 mm, 1.78°, 1.43°], respectively, for wire placement and for abduction and anteversion during THA. We hope that our holistic approach towards improving the interface of surgery not only augments the surgeon's capabilities but also augments the surgical team's experience in carrying out an effective intervention with reduced complications, and provides novel approaches for documenting procedures for training purposes.
23
Andrews CM, Henry AB, Soriano IM, Southworth MK, Silva JR. Registration Techniques for Clinical Applications of Three-Dimensional Augmented Reality Devices. IEEE JOURNAL OF TRANSLATIONAL ENGINEERING IN HEALTH AND MEDICINE 2020; 9:4900214. [PMID: 33489483 PMCID: PMC7819530 DOI: 10.1109/jtehm.2020.3045642] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/28/2020] [Revised: 11/13/2020] [Accepted: 12/03/2020] [Indexed: 12/15/2022]
Abstract
Many clinical procedures would benefit from direct and intuitive real-time visualization of anatomy, surgical plans, or other information crucial to the procedure. Three-dimensional augmented reality (3D-AR) is an emerging technology that has the potential to assist physicians with spatial reasoning during clinical interventions. The most intriguing applications of 3D-AR involve visualizations of anatomy or surgical plans that appear directly on the patient. However, commercially available 3D-AR devices have spatial localization errors that are too large for many clinical procedures. For this reason, a variety of approaches for improving 3D-AR registration accuracy have been explored. The focus of this review is on the methods, accuracy, and clinical applications of registering 3D-AR devices with the clinical environment. The works cited represent a variety of approaches for registering holograms to patients, including manual registration, computer vision-based registration, and registrations that incorporate external tracking systems. Evaluations of user accuracy when performing clinically relevant tasks suggest that accuracies of approximately 2 mm are feasible. 3D-AR device limitations due to the vergence-accommodation conflict or other factors attributable to the headset hardware add on the order of 1.5 mm of error compared to conventional guidance. Continued improvements to 3D-AR hardware will decrease these sources of error.
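The manual and tracker-assisted approaches surveyed above ultimately reduce to aligning paired points (fiducials on the patient and their counterparts in the hologram) with a least-squares rigid fit. A minimal sketch of that core step, using the standard Kabsch/SVD solution; the function names are ours for illustration, not code from any cited system:

```python
import numpy as np

def register_points(source, target):
    # Kabsch/Umeyama least-squares rigid fit: find R, t minimizing
    # sum ||R @ s_i + t - q_i||^2 over paired fiducials s_i -> q_i.
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

def fiducial_registration_error(source, target, R, t):
    # RMS residual after the fit (FRE), in the same units as the inputs.
    residuals = (R @ source.T).T + t - target
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))
```

Note that a small FRE does not guarantee a small error at the clinical target; the ~2 mm user accuracies quoted above are measured at task-relevant targets, not at the fiducials.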
Affiliation(s)
- Christopher M. Andrews
- Department of Biomedical Engineering, McKelvey School of Engineering, Washington University in St. Louis, St. Louis, MO 63130, USA
- SentiAR, Inc., St. Louis, MO 63108, USA
- Jonathan R. Silva
- Department of Biomedical Engineering, McKelvey School of Engineering, Washington University in St. Louis, St. Louis, MO 63130, USA
24
Andrés-Cano P, Calvo-Haro JA, Fillat-Gomà F, Andrés-Cano I, Perez-Mañanes R. Role of the orthopaedic surgeon in 3D printing: current applications and legal issues for a personalized medicine. Rev Esp Cir Ortop Traumatol (Engl Ed) 2020; 65:138-151. [PMID: 33298378 DOI: 10.1016/j.recot.2020.06.014] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/27/2020] [Accepted: 06/14/2020] [Indexed: 12/16/2022] Open
Abstract
3D printing (I3D) is an additive manufacturing technology with a growing interest in medicine and especially in the specialty of orthopaedic surgery and traumatology. There are numerous applications that add value to the personalised treatment of patients: advanced preoperative planning, surgeries with specific tools for each patient, customised orthotic treatments, personalised implants or prostheses and innovative development in the field of bone and cartilage tissue engineering. This paper provides an update on the role that the orthopaedic surgeon and traumatologist plays as a user and prescriber of this technology and a review of the stages required for the correct integration of I3D into the hospital care flow, from the necessary resources to the current legal recommendations.
Collapse
Affiliation(s)
- P Andrés-Cano
- Departamento de Cirugía Ortopédica y Traumatología, Hospital Universitario Virgen del Rocío, Sevilla, España
- J A Calvo-Haro
- Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, Madrid, España; Departamento de Cirugía, Facultad de Medicina, Universidad Complutense de Madrid, Madrid, España
- F Fillat-Gomà
- Unidad de Planificación Quirúrgica 3D, Departamento de Cirugía Ortopédica y Traumatología, Parc Taulí Hospital Universitari, Institut d'Investigació i Innovació Parc Taulí I3PT, Universitat Autònoma de Barcelona, Sabadell, Barcelona, España
- I Andrés-Cano
- Departamento de Radiodiagnóstico, Hospital Universitario Puerta del Mar, Cádiz, España
- R Perez-Mañanes
- Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, Madrid, España; Departamento de Cirugía, Facultad de Medicina, Universidad Complutense de Madrid, Madrid, España
25
García-Mato D, Moreta-Martínez R, García-Sevilla M, Ochandiano S, García-Leal R, Pérez-Mañanes R, Calvo-Haro JA, Salmerón JI, Pascau J. Augmented reality visualization for craniosynostosis surgery. COMPUTER METHODS IN BIOMECHANICS AND BIOMEDICAL ENGINEERING: IMAGING & VISUALIZATION 2020. [DOI: 10.1080/21681163.2020.1834876] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 10/23/2022]
Affiliation(s)
- David García-Mato
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Rafael Moreta-Martínez
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Mónica García-Sevilla
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Santiago Ochandiano
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Roberto García-Leal
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Servicio de Neurocirugía, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Rubén Pérez-Mañanes
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- José A. Calvo-Haro
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- José I. Salmerón
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Javier Pascau
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
26
The influence of different patient positions on the preoperative 3D planning for surgical resection of soft tissue sarcoma in the lower limb-a cadaver pilot study. Surg Oncol 2020; 35:478-483. [PMID: 33120254 DOI: 10.1016/j.suronc.2020.10.008] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/20/2020] [Revised: 10/02/2020] [Accepted: 10/19/2020] [Indexed: 12/26/2022]
Abstract
INTRODUCTION Complete surgical resection remains the mainstay of the treatment of soft tissue sarcomas. Intraoperative positioning of the patient is dictated by tumor location, whereas preoperative imaging is always performed in the supine position. The effect of changing the patient position on the exact location of the tumor with regard to neurovascular structures and bone is unknown. MATERIAL AND METHODS Two fresh frozen cadavers (pelvis and legs) were thawed and warmed. Three standardized tumor models were implanted in the thigh and calf. MR/CT images of the cadavers were obtained sequentially in four different patient positions. The minimal distance of each "tumor" to neurovascular structures was measured on axial MR images, and the 3D shift of the center of the tumor relative to the bone was measured after segmentation of the CT images. RESULTS A significant difference in the minimal distance of the "tumor" to the femoral artery (P = 0.019/0.023) was seen, as was a significantly greater number of deviations of more than 5 mm/10 mm in the thigh between the supine position and the other positions, compared with two supine positions (P = 0.027/0.028). The center of the "tumor" relative to the bone shifted significantly in the thigh (P < 0.001/0.002) but not in the lower leg. CONCLUSION Obtaining images in the same patient position as the planned tumor resection may become particularly relevant if computer-assisted surgery, which is based on preoperative imaging, is introduced into soft tissue sarcoma surgery, as the patient position significantly influences the spatial position of the tumor.
27
Value of the surgeon's sightline on hologram registration and targeting in mixed reality. Int J Comput Assist Radiol Surg 2020; 15:2027-2039. [PMID: 32984934 PMCID: PMC7671978 DOI: 10.1007/s11548-020-02263-3] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/28/2020] [Accepted: 09/14/2020] [Indexed: 12/12/2022]
Abstract
Purpose Mixed reality (MR) is being evaluated as a visual tool for surgical navigation. Current literature presents unclear results on intraoperative accuracy using the Microsoft HoloLens 1®. This study aims to assess the impact of the surgeon’s sightline in an inside-out marker-based MR navigation system for open surgery. Methods Surgeons at Akershus University Hospital tested this system. A custom-made phantom was used, containing 18 wire target crosses within its inner walls. A CT scan was obtained in order to segment all wire targets into a single 3D model (hologram). An in-house software application (CTrue), developed for the Microsoft HoloLens 1, uploaded 3D models and automatically registered the 3D model with the phantom. Based on the surgeon’s sightline while registering and targeting (free sightline, F, or a strictly perpendicular sightline, P), four scenarios were developed (FF, PF, FP, PP). Target error distance (TED) was obtained in three different working axes (X, Y, Z).
Results Six surgeons (5 males, age 29–62) were enrolled. A total of 864 measurements were collected in 4 scenarios, twice. Scenario PP showed the smallest TED in XYZ-axes mean = 2.98 mm ± SD 1.33; 2.28 mm ± SD 1.45; 2.78 mm ± SD 1.91, respectively. Scenario FF showed the largest TED in XYZ-axes with mean = 10.03 mm ± SD 3.19; 6.36 mm ± SD 3.36; 16.11 mm ± SD 8.91, respectively. Multiple comparison tests, grouped in scenarios and axes, showed that the majority of scenario comparisons had significantly different TED values (p < 0.05). Y-axis always presented the smallest TED regardless of scenario tested. Conclusion A strictly perpendicular working sightline in relation to the 3D-model achieves the best accuracy results. Shortcomings in this technology, as an intraoperative visual cue, can be overcome by sightline correction. Incidentally, this is the preferred working angle for open surgery.
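The per-axis figures quoted above (mean ± SD of TED along X, Y, and Z) can be reproduced from raw paired measurements in a few lines. A sketch assuming N planned and N measured target positions in millimeters; the function name is ours for illustration, not code from the paper:

```python
import numpy as np

def ted_by_axis(measured, planned):
    # Target error distance (TED) decomposed per working axis:
    # mean and sample SD (ddof=1) of the absolute error along X, Y, Z.
    err = np.abs(np.asarray(measured, float) - np.asarray(planned, float))
    return {ax: (float(err[:, i].mean()), float(err[:, i].std(ddof=1)))
            for i, ax in enumerate("XYZ")}
```

Reporting each axis separately, as the study does, preserves direction-dependent effects (here, the depth axis degrading most under a free sightline) that a single Euclidean error would hide.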
28
Gibby J, Cvetko S, Javan R, Parr R, Gibby W. Use of augmented reality for image-guided spine procedures. EUROPEAN SPINE JOURNAL : OFFICIAL PUBLICATION OF THE EUROPEAN SPINE SOCIETY, THE EUROPEAN SPINAL DEFORMITY SOCIETY, AND THE EUROPEAN SECTION OF THE CERVICAL SPINE RESEARCH SOCIETY 2020; 29:1823-1832. [DOI: 10.1007/s00586-020-06495-4] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/22/2019] [Revised: 04/07/2020] [Accepted: 05/31/2020] [Indexed: 12/14/2022]
29
Takemoto J, Parmentier B, Bratelli R, Merritt T, Coyne L. Extended Reality in Patient Care and Pharmacy Practice: A Viewpoint. JOURNAL OF CONTEMPORARY PHARMACY PRACTICE 2020. [DOI: 10.37901/jcphp18-00030] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022] Open
Abstract
The evolution of technology has given practitioners and educators more tools to better treat, manage, and educate both patients and future pharmacists. The objective of this viewpoint publication is to describe the current use of extended reality (XR) in pharmacy and propose ways in which pharmacy practice and education may benefit from incorporation of this technology. While these tools have been used for decades by many other professions, pharmacy is starting to adopt XR in professional and educational practice. XR (virtual reality, mixed reality, and augmented reality) is being used in various aspects of pharmacy care and education, such as pain management, diabetes self-care, cross-checking of prescriptions, treatments for addiction, and (in limited ways) patient and pharmacy education. There is great potential for further integration of XR into pharmacy practice and pharmacy education to ultimately improve patient care and education as well as pharmacy education.
30
Takemoto J, Parmentier B, Bratelli R, Merritt T, Coyne L. Extended Reality in Patient Care and Pharmacy Practice: A Viewpoint. JOURNAL OF CONTEMPORARY PHARMACY PRACTICE 2019. [DOI: 10.37901/2573-2765-66.4.33] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022] Open