1
van der Woude R, Fitski M, van der Zee JM, van de Ven CP, Bökkerink GMJ, Wijnen MHWA, Meulstee JW, van Doormaal TPC, Siepel FJ, van der Steeg AFW. Clinical Application and Further Development of Augmented Reality Guidance for the Surgical Localization of Pediatric Chest Wall Tumors. J Pediatr Surg 2024; 59:1549-1555. [PMID: 38472040 DOI: 10.1016/j.jpedsurg.2024.02.023]
Abstract
BACKGROUND: Surgical treatment of pediatric chest wall tumors requires accurate surgical planning and tumor localization to achieve radical resections while sparing as much healthy tissue as possible. Augmented reality (AR) could facilitate surgical decision making by improving anatomical understanding and intraoperative tumor localization. We present our clinical experience with an AR system for intraoperative tumor localization during chest wall resections, as well as the pre-clinical results of a new registration method intended to improve our conventional AR system. METHODS: From January 2021, we used the HoloLens 2 for pre-incisional tumor localization during all chest wall resections in our center. A patient-specific 3D model was projected onto the patient using a five-point registration method based on anatomical landmarks. In addition, we developed and pre-clinically tested a surface matching method that enables post-incisional AR guidance by registering on the exposed surface of the ribs. RESULTS: Successful registration and holographic overlay were achieved in eight patients. The projection appeared most accurate when landmarks were positioned in a non-symmetric configuration close to the tumor. Disagreements between the overlay and the expected tumor location were mainly due to user-dependent registration errors. Pre-clinical tests of the surface matching method demonstrated the feasibility of registration on the exposed ribs. CONCLUSIONS: Our results demonstrate the applicability of AR guidance for pre- and post-incisional localization of pediatric chest wall tumors during surgery. The system has the potential to enable intraoperative 3D visualization, thereby facilitating surgical planning and management of chest wall resections. LEVEL OF EVIDENCE: IV. TYPE OF STUDY: Treatment Study.
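The five-point landmark registration described in this abstract is, at its core, a point-based rigid registration problem. A minimal sketch of one standard solution (the SVD-based least-squares fit of Arun et al.; this is an illustrative method, not necessarily the authors' implementation, and all names are hypothetical) is:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst.

    src, dst: (N, 3) arrays of paired landmark coordinates.
    Classic SVD solution (Arun et al., 1987); illustrative only.
    """
    src_c = src - src.mean(axis=0)             # center both point sets
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)  # SVD of the cross-covariance
    # Correct for a possible reflection so that det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def fre(src, dst, R, t):
    """Fiducial registration error: RMS residual distance after alignment."""
    resid = dst - (src @ R.T + t)
    return float(np.sqrt((resid ** 2).sum(axis=1).mean()))
```

Registration quality over the landmarks would then be summarized by the fiducial registration error, a common proxy for the user-dependent registration errors the study reports.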
Affiliation(s)
- Rémi van der Woude
- Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS, Utrecht, the Netherlands; Technical Medicine, TechMed Centre, University of Twente, Drienerlolaan 5, 7522 NB, Enschede, the Netherlands
- Matthijs Fitski
- Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS, Utrecht, the Netherlands
- Jasper M van der Zee
- Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS, Utrecht, the Netherlands; Technical Medicine, TechMed Centre, University of Twente, Drienerlolaan 5, 7522 NB, Enschede, the Netherlands
- Cornelis P van de Ven
- Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS, Utrecht, the Netherlands
- Guus M J Bökkerink
- Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS, Utrecht, the Netherlands
- Marc H W A Wijnen
- Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS, Utrecht, the Netherlands
- Tristan P C van Doormaal
- Augmedit B.V., Naarden, the Netherlands; Department of Neurosurgery, Brain Division, University Medical Center, Utrecht, the Netherlands
- Françoise J Siepel
- Robotics and Mechatronics, TechMed Centre, University of Twente, Drienerlolaan 5, 7522 NB, Enschede, the Netherlands
- Alida F W van der Steeg
- Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS, Utrecht, the Netherlands
2
Finos K, Datta S, Sedrakyan A, Milsom JW, Pua BB. Mixed reality in interventional radiology: a focus on first clinical use of XR90 augmented reality-based visualization and navigation platform. Expert Rev Med Devices 2024:1-10. [PMID: 39054630 DOI: 10.1080/17434440.2024.2379925]
Abstract
INTRODUCTION: Augmented reality (AR) and virtual reality (VR) are emerging tools in interventional radiology (IR), enhancing IR education, preprocedural planning, and intraprocedural guidance. AREAS COVERED: This review identifies current applications of AR/VR in IR, with a focus on studies that assess their clinical impact. We outline the relevant technology and assess current limitations and future directions in this space. We found that the use of AR in IR lags behind other surgical fields, and the majority of the data exist in case series or small-scale studies. Educational use of AR/VR improves learning of anatomy and procedure steps and shortens procedural learning curves. Preprocedural use of AR/VR decreases procedure times, especially in complex procedures. Intraprocedural AR for live tracking is accurate to within 5 mm in live patients and to within 0.75 mm in phantoms, offering decreased procedure time and radiation exposure. Challenges include cost, ergonomics, rapid segmentation, and organ motion. EXPERT OPINION: The use of AR/VR in interventional radiology may lead to safer and more efficient procedures. However, more data from larger studies are needed to better understand where AR/VR confers the most benefit in interventional radiology clinical practice.
Affiliation(s)
- Kyle Finos
- Division of Interventional Radiology, New York Presbyterian Hospital/Weill Cornell Medicine, New York, USA
- Sanjit Datta
- Division of Interventional Radiology, New York Presbyterian Hospital/Weill Cornell Medicine, New York, USA
- Art Sedrakyan
- Population Health Science, New York Presbyterian Hospital/Weill Cornell Medicine, New York, USA
- Jeffrey W Milsom
- Division of Colorectal Surgery, New York Presbyterian Hospital/Weill Cornell Medicine, New York, USA
- Bradley B Pua
- Division of Interventional Radiology, New York Presbyterian Hospital/Weill Cornell Medicine, New York, USA
3
Arensmeyer J, Bedetti B, Schnorr P, Buermann J, Zalepugas D, Schmidt J, Feodorovici P. A System for Mixed-Reality Holographic Overlays of Real-Time Rendered 3D-Reconstructed Imaging Using a Video Pass-through Head-Mounted Display-A Pathway to Future Navigation in Chest Wall Surgery. J Clin Med 2024; 13:2080. [PMID: 38610849 PMCID: PMC11012529 DOI: 10.3390/jcm13072080]
Abstract
Background: Three-dimensional reconstructions of state-of-the-art high-resolution imaging are progressively being used for preprocedural assessment in thoracic surgery. This is a promising tool that aims to improve patient-specific treatment planning, for example, for minimally invasive or robotic-assisted lung resections. Increasingly available mixed-reality hardware based on video pass-through technology enables the projection of image data as a hologram onto the patient. We describe a novel method of real-time 3D surgical planning in a mixed-reality setting by presenting three representative cases utilizing volume rendering. Materials: A mixed-reality system was set up using a high-performance workstation running a video pass-through-based head-mounted display. Image data from computed tomography were imported and volume-rendered in real time, allowing customization through live editing. The image-based hologram was projected onto the patient, highlighting the regions of interest. Results: Three oncological cases were selected to explore the potential of the mixed-reality system. Two presented large tumor masses in the thoracic cavity, while the third presented an unclear lesion of the chest wall. We aligned real-time rendered 3D holographic image data onto the patient, allowing us to investigate the relationship between anatomical structures and their respective body position. Conclusions: Holographic overlay has proven promising for improving preprocedural surgical planning, particularly for complex oncological tasks in the thoracic surgical field. Further studies on outcome-related surgical planning and navigation should therefore be conducted. Ongoing technological progress in extended reality hardware and intelligent software features will most likely enhance applicability and the range of use in surgical fields in the near future.
Affiliation(s)
- Jan Arensmeyer
- Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
- Bonn Surgical Technology Center (BOSTER), University Hospital Bonn, 53227 Bonn, Germany
- Benedetta Bedetti
- Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
- Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Philipp Schnorr
- Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
- Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Jens Buermann
- Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
- Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Donatas Zalepugas
- Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
- Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Joachim Schmidt
- Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
- Bonn Surgical Technology Center (BOSTER), University Hospital Bonn, 53227 Bonn, Germany
- Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Philipp Feodorovici
- Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
- Bonn Surgical Technology Center (BOSTER), University Hospital Bonn, 53227 Bonn, Germany
4
Youssef S, McDonnell JM, Wilson KV, Turley L, Cunniffe G, Morris S, Darwish S, Butler JS. Accuracy of augmented reality-assisted pedicle screw placement: a systematic review. Eur Spine J 2024; 33:974-984. [PMID: 38177834 DOI: 10.1007/s00586-023-08094-5]
Abstract
OBJECTIVE: Conventional freehand methods of pedicle screw placement are associated with significant complications due to the close proximity of neural and vascular structures. Recent advances in augmented reality surgical navigation (ARSN) have led to its adoption into spine surgery. However, little is known regarding its overall accuracy. The purpose of this study is to delineate the overall accuracy of ARSN pedicle screw placement across various models. METHODS: A systematic review of the Medline/PubMed, Cochrane, and Embase databases was conducted according to PRISMA guidelines. Relevant data extracted included reports of pedicle screw placement accuracy and breaches, as defined by the Gertzbein-Robbins classification, in addition to deviation from the pre-planned trajectory and entry point. Accuracy was defined as the sum of grade 0 and grade 1 events per the Gertzbein-Robbins classification. RESULTS: Twenty studies reported clinically accurately placed screws. Rates of clinically accurate placement ranged from 26.3% to 100%, with 2095 screws (93.1%) deemed clinically accurate. Furthermore, 5.4% (112/2088) of screws were reported as grade 2 breaches, 1.6% (33/2088) as grade 3 breaches, 3.1% (29/926) as medial breaches, and 2.3% (21/926) as lateral breaches. Mean linear deviation ranged from 1.3 to 5.99 mm, while mean angular/trajectory deviation ranged from 1.6° to 5.88°. CONCLUSION: The results of this study highlight the overall accuracy of ARSN pedicle screw placement. However, further robust prospective studies are needed for accurate comparison with conventional methods of pedicle screw placement.
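The accuracy definition used in this review (clinically accurate = Gertzbein-Robbins grade 0 or 1) reduces to a simple proportion over graded screws. A hypothetical helper, written only to make the computation explicit (the function name and input format are illustrative, not from the study):

```python
def gertzbein_robbins_accuracy(grade_counts):
    """Percentage of screws graded 0 or 1 (clinically accurate).

    grade_counts: dict mapping Gertzbein-Robbins grade (0-4) to screw count.
    Illustrative helper, not from the reviewed studies.
    """
    total = sum(grade_counts.values())
    if total == 0:
        raise ValueError("no screws graded")
    accurate = grade_counts.get(0, 0) + grade_counts.get(1, 0)
    return 100.0 * accurate / total
```

For example, a toy cohort of 100 screws with 80 grade 0, 13 grade 1, 5 grade 2, and 2 grade 3 breaches yields 93.0% accuracy under this definition.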
Affiliation(s)
- Salma Youssef
- School of Medicine, University College Dublin, Belfield, Dublin, Ireland
- Jake M McDonnell
- National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
- Trinity Biomedical Sciences Institute, Trinity College Dublin, Dublin, Ireland
- Kielan V Wilson
- School of Medicine, University College Dublin, Belfield, Dublin, Ireland
- National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
- Luke Turley
- Department of Orthopaedics, Tallaght University Hospital, Tallaght, Dublin, Ireland
- Gráinne Cunniffe
- National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
- Seamus Morris
- School of Medicine, University College Dublin, Belfield, Dublin, Ireland
- National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
- Stacey Darwish
- National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
- Department of Orthopaedics, St. Vincent's University Hospital, Dublin, Ireland
- Joseph S Butler
- School of Medicine, University College Dublin, Belfield, Dublin, Ireland
- National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
5
Lee KH, Li M, Varble N, Negussie AH, Kassin MT, Arrichiello A, Carrafiello G, Hazen LA, Wakim PG, Li X, Xu S, Wood BJ. Smartphone Augmented Reality Outperforms Conventional CT Guidance for Composite Ablation Margins in Phantom Models. J Vasc Interv Radiol 2024; 35:452-461.e3. [PMID: 37852601 DOI: 10.1016/j.jvir.2023.10.005]
Abstract
PURPOSE: To develop and evaluate a smartphone augmented reality (AR) system for ablation of a large 50-mm liver tumor, with treatment planning for composite overlapping ablation zones. MATERIALS AND METHODS: A smartphone AR application was developed to display the tumor, probe, projected probe paths, ablated zones, and the real-time percentage of the target tumor volume ablated. Fiducial markers were attached to phantoms and to an ablation probe hub for tracking. The system was evaluated with tissue-mimicking thermochromic phantoms and gel phantoms. Four interventional radiologists each performed 2 trials of 3 probe insertions per trial, using AR guidance versus computed tomography (CT) guidance in 2 gel phantoms. Insertion points and optimal probe paths were predetermined. On Gel Phantom 2, serial ablated zones were saved and continuously displayed after each probe placement/adjustment, enabling feedback and iterative planning. The percentages of tumor ablated for AR guidance versus CT guidance, and with versus without display of recorded ablated zones, were compared among interventional radiologists with pairwise t-tests. RESULTS: The mean percentages of tumor ablated were 36% ± 7 for CT freehand guidance and 47% ± 4 for AR guidance (P = .004). The mean composite percentages of tumor ablated for AR guidance were 43% ± 1 without and 50% ± 2 with display of recorded ablated zones (P = .033). There was no strong correlation between AR-guided percentage of ablation and years of experience (r < 0.5), whereas there was a strong correlation between CT-guided percentage of ablation and years of experience (r > 0.9). CONCLUSIONS: A smartphone AR guidance system for dynamic iterative ablation of a large 50-mm liver tumor was accurate, performed better than conventional CT guidance, especially for less experienced interventional radiologists, and promoted more standardized performance across experience levels.
Affiliation(s)
- Katerina H Lee
- McGovern Medical School at UTHealth, Houston, Texas; Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Ming Li
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Nicole Varble
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland; Philips Research North America, Cambridge, Massachusetts
- Ayele H Negussie
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Michael T Kassin
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Antonio Arrichiello
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Gianpaolo Carrafiello
- Department of Radiology, Foundation IRCCS Ca' Granda Ospedale Maggiore Policlinico, University of Milan, Milan, Italy
- Lindsey A Hazen
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Paul G Wakim
- Biostatistics and Clinical Epidemiology Service, National Institutes of Health, Bethesda, Maryland
- Xiaobai Li
- Biostatistics and Clinical Epidemiology Service, National Institutes of Health, Bethesda, Maryland
- Sheng Xu
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Bradford J Wood
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
6
Stark PW, Borger van der Burg BLS, van Waes OJF, van Dongen TTCF, Wouter, Casper M, Hoencamp R. Telemedicine-Guided Two-Incision Lower Leg Fasciotomy Performed by Combat Medics During Tactical Combat Casualty Care: A Feasibility Study. Mil Med 2024; 189:e645-e651. [PMID: 37703048 PMCID: PMC10898936 DOI: 10.1093/milmed/usad364]
Abstract
INTRODUCTION: During tactical combat casualty care, life- and limb-saving procedures might also be performed by combat medics. This study assesses whether it is feasible to use a head-mounted display (HMD) to provide telemedicine (TM) support from a consulted senior surgeon for combat medics performing a two-incision lower leg fasciotomy. MATERIALS AND METHODS: Nine combat medics were randomized into groups to perform a two-incision lower leg fasciotomy. One group used the Vuzix M400, and the second group used the RealWear HMT-1Z1. A third, control, group received no guidance. In the Vuzix M400 and RealWear HMT-1Z1 groups, a senior surgeon examined the results after the two-incision lower leg fasciotomy was finished to assess the release of compartments, possible collateral damage, and the performance of the combat medics. In the control group, these results were examined by a surgical resident with expertise in two-incision lower leg fasciotomies. The resident's operative performance questionnaire was used to score the performance of the combat medics. The telehealth usability questionnaire was used to evaluate the usability of the HMDs as perceived by the combat medics. RESULTS: Combat medics using an HMD were considered competent in performing a two-incision lower leg fasciotomy (Vuzix: median 3 [range 0], RealWear: median 3 [range 1]). These combat medics scored significantly better in their ability to adapt to anatomical variances compared to the control group (Vuzix: median 3 [range 0], RealWear: median 3 [range 0], control: median 1 [range 0]; P = .018). Combat medics using an HMD were faster than combat medics in the control group (Vuzix: mean 14:14 [SD 3:41], RealWear: mean 15:42 [SD 1:58], control: mean 17:45 [SD 2:02]; P = .340). Overall satisfaction with both HMDs was 5 out of 7 (Vuzix: median 5 [range 0], RealWear: median 5 [range 1]; P = .317).
CONCLUSIONS: This study shows that it is feasible to use an HMD to provide TM support from a consulted senior surgeon for combat medics performing a two-incision lower leg fasciotomy. The results suggest that TM support might be useful for combat medics performing life- and limb-saving procedures during tactical combat casualty care.
Affiliation(s)
- P W Stark
- Trauma Research Unit, Department of Surgery, Erasmus University Medical Center, Rotterdam, Zuid-Holland 3015 GD, The Netherlands
- Department of Surgery, Alrijne Hospital, Leiderdorp, Zuid-Holland 2353 GA, The Netherlands
- O J F van Waes
- Trauma Research Unit, Department of Surgery, Erasmus University Medical Center, Rotterdam, Zuid-Holland 3015 GD, The Netherlands
- Defense Healthcare Organization, Ministry of Defense, Den Haag, Zuid-Holland 2511 CB, The Netherlands
- T T C F van Dongen
- Department of Surgery, Alrijne Hospital, Leiderdorp, Zuid-Holland 2353 GA, The Netherlands
- Defense Healthcare Organization, Ministry of Defense, Den Haag, Zuid-Holland 2511 CB, The Netherlands
- Wouter
- Defense Healthcare Organization, Ministry of Defense, Den Haag, Zuid-Holland 2511 CB, The Netherlands
- Marnalg Casper
- Defense Healthcare Organization, Ministry of Defense, Den Haag, Zuid-Holland 2511 CB, The Netherlands
- R Hoencamp
- Trauma Research Unit, Department of Surgery, Erasmus University Medical Center, Rotterdam, Zuid-Holland 3015 GD, The Netherlands
- Department of Surgery, Alrijne Hospital, Leiderdorp, Zuid-Holland 2353 GA, The Netherlands
- Defense Healthcare Organization, Ministry of Defense, Den Haag, Zuid-Holland 2511 CB, The Netherlands
7
Begagić E, Bečulić H, Pugonja R, Memić Z, Balogun S, Džidić-Krivić A, Milanović E, Salković N, Nuhović A, Skomorac R, Sefo H, Pojskić M. Augmented Reality Integration in Skull Base Neurosurgery: A Systematic Review. Medicina (Kaunas) 2024; 60:335. [PMID: 38399622 PMCID: PMC10889940 DOI: 10.3390/medicina60020335]
Abstract
Background and Objectives: To investigate the role of augmented reality (AR) in skull base (SB) neurosurgery. Materials and Methods: Following PRISMA methodology, the PubMed and Scopus databases were searched to extract data related to AR integration in SB surgery. Results: The majority of the 19 included studies (42.1%) were conducted in the United States, with most published in the last five years (77.8%). Studies used phantom skull models (31.2%, n = 6), human cadavers (15.8%, n = 3), or human patients (52.6%, n = 10). Of the 19 studies, 18 specified the surgical modality, with microscopic surgery predominant (n = 10; 52.6%). Most studies used CT alone as the data source (n = 9; 47.4%), and optical tracking was the most common tracking modality (n = 9; 47.3%). Target registration error (TRE) spanned from 0.55 to 10.62 mm. Conclusion: Despite variations in TRE values, the studies highlighted successful outcomes and minimal complications. Challenges such as device practicality and data security were acknowledged, but the application of low-cost AR devices suggests broader feasibility.
Affiliation(s)
- Emir Begagić
- Department of General Medicine, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Hakija Bečulić
- Department of Neurosurgery, Cantonal Hospital Zenica, Crkvice 67, 72000 Zenica, Bosnia and Herzegovina
- Department of Anatomy, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Ragib Pugonja
- Department of Anatomy, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Zlatan Memić
- Department of General Medicine, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Simon Balogun
- Division of Neurosurgery, Department of Surgery, Obafemi Awolowo University Teaching Hospitals Complex, Ilesa Road PMB 5538, Ile-Ife 220282, Nigeria
- Amina Džidić-Krivić
- Department of Neurology, Cantonal Hospital Zenica, Crkvice 67, 72000 Zenica, Bosnia and Herzegovina
- Elma Milanović
- Neurology Clinic, Clinical Center University of Sarajevo, Bolnička 25, 71000 Sarajevo, Bosnia and Herzegovina
- Naida Salković
- Department of General Medicine, School of Medicine, University of Tuzla, Univerzitetska 1, 75000 Tuzla, Bosnia and Herzegovina
- Adem Nuhović
- Department of General Medicine, School of Medicine, University of Sarajevo, Univerzitetska 1, 71000 Sarajevo, Bosnia and Herzegovina
- Rasim Skomorac
- Department of Neurosurgery, Cantonal Hospital Zenica, Crkvice 67, 72000 Zenica, Bosnia and Herzegovina
- Department of Surgery, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Haso Sefo
- Neurosurgery Clinic, Clinical Center University of Sarajevo, Bolnička 25, 71000 Sarajevo, Bosnia and Herzegovina
- Mirza Pojskić
- Department of Neurosurgery, University Hospital Marburg, Baldingerstr., 35033 Marburg, Germany
8
Azad TD, Warman A, Tracz JA, Hughes LP, Judy BF, Witham TF. Augmented reality in spine surgery - past, present, and future. Spine J 2024; 24:1-13. [PMID: 37660893 DOI: 10.1016/j.spinee.2023.08.015]
Abstract
BACKGROUND CONTEXT: Augmented reality (AR) is increasingly recognized as a valuable tool in spine surgery. Here we provide an overview of the key developments and technological milestones that have laid the foundation for AR applications in this field. We also assess the quality of existing studies on AR systems in spine surgery and explore potential future applications. PURPOSE: The purpose of this narrative review is to examine the role of AR in spine surgery. It aims to highlight the evolution of AR technology in this context, evaluate the existing body of research, and outline potential future directions for integrating AR into spine surgery. STUDY DESIGN: Narrative review. METHODS: We conducted a thorough literature search to identify studies and developments related to AR in spine surgery. Relevant articles, reports, and technological advancements were analyzed to establish the historical context and current state of AR in this field. RESULTS: The review identifies significant milestones in the development of AR technology for spine surgery. It discusses the growing body of research and highlights the strengths and weaknesses of existing investigations. Additionally, it presents insights into the potential for AR to enhance spine surgical education and speculates on future applications. CONCLUSIONS: Augmented reality has emerged as a promising adjunct in spine surgery, with notable advancements and research efforts. The integration of AR into the spine surgery operating room holds promise, as does its potential to revolutionize surgical education. Future applications of AR in spine surgery may include real-time navigation, enhanced visualization, and improved patient outcomes. Continued development and evaluation of AR technology are essential for its successful implementation in this specialized surgical field.
Affiliation(s)
- Tej D Azad
- Department of Neurosurgery, Johns Hopkins University School of Medicine, 600 N. Wolfe St, Meyer 7-109, Baltimore, MD 21287, USA
- Anmol Warman
- Department of Neurosurgery, Johns Hopkins University School of Medicine, 600 N. Wolfe St, Meyer 7-109, Baltimore, MD 21287, USA
- Jovanna A Tracz
- Department of Neurosurgery, Johns Hopkins University School of Medicine, 600 N. Wolfe St, Meyer 7-109, Baltimore, MD 21287, USA
- Liam P Hughes
- Department of Neurosurgery, Johns Hopkins University School of Medicine, 600 N. Wolfe St, Meyer 7-109, Baltimore, MD 21287, USA
- Brendan F Judy
- Department of Neurosurgery, Johns Hopkins University School of Medicine, 600 N. Wolfe St, Meyer 7-109, Baltimore, MD 21287, USA
- Timothy F Witham
- Department of Neurosurgery, Johns Hopkins University School of Medicine, 600 N. Wolfe St, Meyer 7-109, Baltimore, MD 21287, USA
9
Ghayda RA, Cannarella R, Calogero AE, Shah R, Rambhatla A, Zohdy W, Kavoussi P, Avidor-Reiss T, Boitrelle F, Mostafa T, Saleh R, Toprak T, Birowo P, Salvio G, Calik G, Kuroda S, Kaiyal RS, Ziouziou I, Crafa A, Phuoc NHV, Russo GI, Durairajanayagam D, Al-Hashimi M, Hamoda TAAAM, Pinggera GM, Adriansjah R, Maldonado Rosas I, Arafa M, Chung E, Atmoko W, Rocco L, Lin H, Huyghe E, Kothari P, Solorzano Vazquez JF, Dimitriadis F, Garrido N, Homa S, Falcone M, Sabbaghian M, Kandil H, Ko E, Martinez M, Nguyen Q, Harraz AM, Serefoglu EC, Karthikeyan VS, Tien DMB, Jindal S, Micic S, Bellavia M, Alali H, Gherabi N, Lewis S, Park HJ, Simopoulou M, Sallam H, Ramirez L, Colpi G, Agarwal A. Artificial Intelligence in Andrology: From Semen Analysis to Image Diagnostics. World J Mens Health 2024; 42:39-61. [PMID: 37382282 PMCID: PMC10782130 DOI: 10.5534/wjmh.230050]
Abstract
Artificial intelligence (AI) in medicine has gained considerable momentum in recent decades and has been applied to various fields of medicine. Advances in computer science, medical informatics, and robotics, together with the need for personalized medicine, have facilitated the role of AI in modern healthcare. As in other fields, AI applications such as machine learning, artificial neural networks, and deep learning have shown great potential in andrology and reproductive medicine. AI-based tools are poised to become valuable assets that support and aid in diagnosing and treating male infertility and in improving the accuracy of patient care. These automated, AI-based predictions may offer consistency and efficiency in terms of time and cost in infertility research and clinical management. In andrology and reproductive medicine, AI has been used for objective sperm, oocyte, and embryo selection, prediction of surgical outcomes, cost-effective assessment, development of robotic surgery, and clinical decision-making systems. In the future, better integration and implementation of AI into medicine will undoubtedly lead to pioneering evidence-based breakthroughs and the reshaping of andrology and reproductive medicine.
Affiliation(s)
- Ramy Abou Ghayda: Urology Institute, University Hospitals, Case Western Reserve University, Cleveland, OH, USA
- Rossella Cannarella: Department of Clinical and Experimental Medicine, University of Catania, Catania, Italy; Glickman Urological & Kidney Institute, Cleveland Clinic Foundation, Cleveland, OH, USA
- Aldo E. Calogero: Department of Clinical and Experimental Medicine, University of Catania, Catania, Italy
- Rupin Shah: Department of Urology, Lilavati Hospital and Research Centre, Mumbai, India
- Amarnath Rambhatla: Department of Urology, Henry Ford Health System, Vattikuti Urology Institute, Detroit, MI, USA
- Wael Zohdy: Andrology and STDs, Cairo University, Cairo, Egypt
- Parviz Kavoussi: Department of Urology, University of Texas Health Science Center at San Antonio, San Antonio, TX, USA
- Tomer Avidor-Reiss: Department of Biological Sciences, University of Toledo, Toledo, OH, USA; Department of Urology, College of Medicine and Life Sciences, University of Toledo, Toledo, OH, USA
- Florence Boitrelle: Reproductive Biology, Fertility Preservation, Andrology, CECOS, Poissy Hospital, Poissy, France; Department of Biology, Reproduction, Epigenetics, Environment, and Development, Paris Saclay University, UVSQ, INRAE, BREED, Paris, France
- Taymour Mostafa: Andrology, Sexology & STIs Department, Faculty of Medicine, Cairo University, Cairo, Egypt
- Ramadan Saleh: Department of Dermatology, Venereology and Andrology, Faculty of Medicine, Sohag University, Sohag, Egypt
- Tuncay Toprak: Department of Urology, Fatih Sultan Mehmet Training and Research Hospital, University of Health Sciences, Istanbul, Turkey
- Ponco Birowo: Department of Urology, Dr. Cipto Mangunkusumo Hospital, Faculty of Medicine, Universitas Indonesia, Jakarta, Indonesia
- Gianmaria Salvio: Department of Endocrinology, Polytechnic University of Marche, Ancona, Italy
- Gokhan Calik: Department of Urology, Istanbul Medipol University, Istanbul, Turkey
- Shinnosuke Kuroda: Glickman Urological & Kidney Institute, Cleveland Clinic Foundation, Cleveland, OH, USA; Department of Urology, Reproduction Center, Yokohama City University Medical Center, Yokohama, Japan
- Raneen Sawaid Kaiyal: Glickman Urological & Kidney Institute, Cleveland Clinic Foundation, Cleveland, OH, USA
- Imad Ziouziou: Department of Urology, College of Medicine and Pharmacy, Ibn Zohr University, Agadir, Morocco
- Andrea Crafa: Department of Clinical and Experimental Medicine, University of Catania, Catania, Italy
- Nguyen Ho Vinh Phuoc: Department of Andrology, Binh Dan Hospital, Ho Chi Minh City, Vietnam; Department of Urology and Andrology, Pham Ngoc Thach University of Medicine, Ho Chi Minh City, Vietnam
- Damayanthi Durairajanayagam: Department of Physiology, Faculty of Medicine, Universiti Teknologi MARA, Sungai Buloh Campus, Selangor, Malaysia
- Manaf Al-Hashimi: Department of Urology, Burjeel Hospital, Abu Dhabi, United Arab Emirates (UAE); Khalifa University, College of Medicine and Health Science, Abu Dhabi, United Arab Emirates (UAE)
- Taha Abo-Almagd Abdel-Meguid Hamoda: Department of Urology, King Abdulaziz University, Jeddah, Saudi Arabia; Department of Urology, Faculty of Medicine, Minia University, El-Minia, Egypt
- Ricky Adriansjah: Department of Urology, Hasan Sadikin General Hospital, Universitas Padjadjaran, Bandung, Indonesia
- Mohamed Arafa: Department of Urology, Hamad Medical Corporation, Doha, Qatar; Department of Urology, Weill Cornell Medical-Qatar, Doha, Qatar
- Eric Chung: Department of Urology, Princess Alexandra Hospital, University of Queensland, Brisbane, QLD, Australia
- Widi Atmoko: Department of Urology, Dr. Cipto Mangunkusumo Hospital, Faculty of Medicine, Universitas Indonesia, Jakarta, Indonesia
- Lucia Rocco: Department of Environmental, Biological and Pharmaceutical Sciences and Technologies, University of Campania “Luigi Vanvitelli”, Caserta, Italy
- Haocheng Lin: Department of Urology, Peking University Third Hospital, Peking University, Beijing, China
- Eric Huyghe: Department of Urology and Andrology, University Hospital of Toulouse, Toulouse, France
- Priyank Kothari: Department of Urology, B.Y.L. Nair Charitable Hospital, Topiwala National Medical College, Mumbai, India
- Fotios Dimitriadis: Department of Urology, Aristotle University of Thessaloniki, Thessaloniki, Greece
- Nicolas Garrido: IVIRMA Global Research Alliance, IVI Foundation, Instituto de Investigación Sanitaria La Fe (IIS La Fe), Valencia, Spain
- Sheryl Homa: Department of Biosciences, University of Kent, Canterbury, United Kingdom
- Marco Falcone: Department of Urology, Molinette Hospital, A.O.U. Città della Salute e della Scienza, University of Turin, Torino, Italy
- Marjan Sabbaghian: Department of Andrology, Reproductive Biomedicine Research Center, Royan Institute for Reproductive Biomedicine, ACECR, Tehran, Iran
- Edmund Ko: Department of Urology, Loma Linda University Health, Loma Linda, CA, USA
- Marlon Martinez: Section of Urology, Department of Surgery, University of Santo Tomas Hospital, Manila, Philippines
- Quang Nguyen: Section of Urology, Department of Surgery, University of Santo Tomas Hospital, Manila, Philippines; Center for Andrology and Sexual Medicine, Viet Duc University Hospital, Hanoi, Vietnam; Department of Urology, Andrology and Sexual Medicine, University of Medicine and Pharmacy, Vietnam National University, Hanoi, Vietnam
- Ahmed M. Harraz: Urology and Nephrology Center, Mansoura University, Mansoura, Egypt; Department of Surgery, Urology Unit, Farwaniya Hospital, Farwaniya, Kuwait; Department of Urology, Sabah Al Ahmad Urology Center, Kuwait City, Kuwait
- Ege Can Serefoglu: Department of Urology, Biruni University School of Medicine, Istanbul, Turkey
- Dung Mai Ba Tien: Department of Andrology, Binh Dan Hospital, Ho Chi Minh City, Vietnam
- Sunil Jindal: Department of Andrology and Reproductive Medicine, Jindal Hospital, Meerut, India
- Sava Micic: Department of Andrology, Uromedica Polyclinic, Belgrade, Serbia
- Marina Bellavia: Andrology and IVF Center, Next Fertility Procrea, Lugano, Switzerland
- Hamed Alali: King Fahad Specialist Hospital, Dammam, Saudi Arabia
- Nazim Gherabi: Andrology Committee of the Algerian Association of Urology, Algiers, Algeria
- Sheena Lewis: Examen Lab Ltd., Northern Ireland, United Kingdom
- Hyun Jun Park: Department of Urology, Pusan National University School of Medicine, Busan, Korea; Medical Research Institute of Pusan National University Hospital, Busan, Korea
- Mara Simopoulou: Department of Experimental Physiology, School of Health Sciences, Faculty of Medicine, National and Kapodistrian University of Athens, Athens, Greece
- Hassan Sallam: Alexandria University Faculty of Medicine, Alexandria, Egypt
- Liliana Ramirez: IVF Laboratory, CITMER Reproductive Medicine, Mexico City, Mexico
- Giovanni Colpi: Andrology and IVF Center, Next Fertility Procrea, Lugano, Switzerland
- Ashok Agarwal: Global Andrology Forum, Moreland Hills, OH, USA; Cleveland Clinic, Cleveland, OH, USA
10
Condino S, Cutolo F, Carbone M, Cercenelli L, Badiali G, Montemurro N, Ferrari V. Registration Sanity Check for AR-guided Surgical Interventions: Experience From Head and Face Surgery. IEEE J Transl Eng Health Med 2023; 12:258-267. [PMID: 38410181 PMCID: PMC10896424 DOI: 10.1109/jtehm.2023.3332088]
Abstract
Achieving and maintaining proper image registration accuracy is an open challenge of image-guided surgery. This work explores and assesses the efficacy of a registration sanity check method for augmented reality-guided navigation (AR-RSC), based on the visual inspection of virtual 3D models of landmarks. We analyze the AR-RSC sensitivity and specificity by recruiting 36 subjects to assess the registration accuracy of a set of 114 AR images generated from camera images acquired during an AR-guided orthognathic intervention. Translational or rotational errors of known magnitude up to ±1.5 mm/±15.5°, were artificially added to the image set in order to simulate different registration errors. This study analyses the performance of AR-RSC when varying (1) the virtual models selected for misalignment evaluation (e. g., the model of brackets, incisor teeth, and gingival margins in our experiment), (2) the type (translation/rotation) of registration error, and (3) the level of user experience in using AR technologies. Results show that: 1) the sensitivity and specificity of the AR-RSC depends on the virtual models (globally, a median true positive rate of up to 79.2% was reached with brackets, and a median true negative rate of up to 64.3% with incisor teeth), 2) there are error components that are more difficult to identify visually, 3) the level of user experience does not affect the method. In conclusion, the proposed AR-RSC, tested also in the operating room, could represent an efficient method to monitor and optimize the registration accuracy during the intervention, but special attention should be paid to the selection of the AR data chosen for the visual inspection of the registration accuracy.
Affiliation(s)
- Sara Condino: Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
- Fabrizio Cutolo: Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
- Marina Carbone: Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
- Laura Cercenelli: EDIMES Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Giovanni Badiali: Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
- Nicola Montemurro: Department of Neurosurgery, Azienda Ospedaliera Universitaria Pisana (AOUP), 56127 Pisa, Italy
- Vincenzo Ferrari: Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
11
Nazzal EM, Zsidai B, Hiemstra LA, Lustig S, Samuelsson K, Musahl V. Applications of Extended Reality in Orthopaedic Surgery. J Bone Joint Surg Am 2023; 105:1721-1729. [PMID: 37713502 DOI: 10.2106/jbjs.22.00805]
Abstract
➤ Extended reality is a term that encompasses different modalities, including virtual reality, augmented reality, and mixed reality.
➤ Although fully immersive virtual reality has benefits for developing procedural memory and technical skills, augmented and mixed reality are more appropriate modalities for preoperative planning and intraoperative utilization.
➤ Current investigations on the role of extended reality in preoperative planning and intraoperative utilization are still in the early stages, but preliminarily show that extended reality technologies can help surgeons to be more accurate and efficient.
Affiliation(s)
- Ehab M Nazzal: Department of Orthopaedic Surgery, UPMC Freddie Fu Sports Medicine Center, University of Pittsburgh, Pittsburgh, Pennsylvania
- Bálint Zsidai: Department of Orthopaedic Surgery, UPMC Freddie Fu Sports Medicine Center, University of Pittsburgh, Pittsburgh, Pennsylvania; Department of Orthopaedics, Institute of Clinical Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Laurie A Hiemstra: Banff Sport Medicine, Banff, Alberta, Canada; Department of Surgery, University of Calgary, Calgary, Alberta, Canada
- Sébastien Lustig: Department of Orthopaedic Surgery and Sports Medicine, FIFA Medical Center of Excellence, Croix-Rousse Hospital, Lyon University Hospital, Lyon, France
- Kristian Samuelsson: Department of Orthopaedics, Institute of Clinical Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden; Department of Orthopaedics, Sahlgrenska University Hospital, Mölndal, Sweden
- Volker Musahl: Department of Orthopaedic Surgery, UPMC Freddie Fu Sports Medicine Center, University of Pittsburgh, Pittsburgh, Pennsylvania
12
Turan Z, Karabey SC. The use of immersive technologies in distance education: A systematic review. Education and Information Technologies 2023:1-24. [PMID: 37361798 PMCID: PMC10160721 DOI: 10.1007/s10639-023-11849-8]
Abstract
This study aims to conduct a systematic review that includes studies on the use of immersive technologies in distance education. For this purpose, 132 studies detected by searching Web of Science, Eric, Taylor & Francis and Education Full Text (EBSCO) databases were examined. The studies were analysed using the content analysis method. As a result of the analyses, it was observed that the first study investigating the subject was conducted in 2002, and the number of related studies increased over the years. In addition, these studies were primarily conducted quantitatively, were mainly journal articles, and originated mostly from China and the USA. Moreover, the sample groups of these studies consisted mostly of university students. Therefore, they mainly used academic performance and motivation variables. Furthermore, these studies were conducted primarily in the science and medical education disciplines. When the studies were evaluated in terms of publication journals, it was determined that they were published mostly in "Education Science" and "Computers & Education" journals. They were also included in the proceedings published within the scope of various conferences. When the application platforms in the studies were examined, it was determined that the UNITY and ARTUTOR platforms were mostly used. The findings of the studies revealed that the increase in academic performance and motivation was one of the most reported advantages of such technologies. On the other hand, the problems caused while using these technologies and the internet were the most reported difficulties in the studies. Finally, the review presented suggestions for future studies.
Affiliation(s)
- Zeynep Turan: Department of Computer Education & Instructional Technology, Faculty of Education, Ataturk University, 25240 Erzurum, Turkey
13
Onuma H, Sakai K, Arai Y, Torigoe I, Tomori M, Sakaki K, Hirai T, Egawa S, Kobayashi Y, Okawa A, Yoshii T. Augmented Reality Support for Anterior Decompression and Fusion Using Floating Method for Cervical Ossification of the Posterior Longitudinal Ligament. J Clin Med 2023; 12:2898. [PMID: 37109235 PMCID: PMC10143834 DOI: 10.3390/jcm12082898]
Abstract
Anterior decompression and fusion (ADF) using the floating method for cervical ossification of the posterior longitudinal ligament (OPLL) is an ideal surgical technique, but it has a specific risk of insufficient decompression caused by the impingement of residual ossification. Augmented reality (AR) support is a novel technology that enables the superimposition of images onto the view of a surgical field. AR technology was applied to ADF for cervical OPLL to facilitate intraoperative anatomical orientation and OPLL identification. In total, 14 patients with cervical OPLL underwent ADF with microscopic AR support. The outline of the OPLL and the bilateral vertebral arteries was marked after intraoperative CT, and the reconstructed 3D image data were transferred and linked to the microscope. The AR microscopic view enabled us to visualize the ossification outline, which could not be seen directly in the surgical field, and allowed sufficient decompression of the ossification. Neurological disturbances were improved in all patients. No cases of serious complications, such as major intraoperative bleeding or reoperation due to the postoperative impingement of the floating OPLL, were registered. To our knowledge, this is the first report of the introduction of microscopic AR into ADF using the floating method for cervical OPLL with favorable clinical results.
Affiliation(s)
- Hiroaki Onuma: Department of Orthopedic Surgery, Saiseikai Kawaguchi General Hospital, 5-11-5 Nishikawaguchi, Kawaguchi-shi 332-8558, Japan
- Kenichiro Sakai: Department of Orthopedic Surgery, Saiseikai Kawaguchi General Hospital, 5-11-5 Nishikawaguchi, Kawaguchi-shi 332-8558, Japan
- Yoshiyasu Arai: Department of Orthopedic Surgery, Saiseikai Kawaguchi General Hospital, 5-11-5 Nishikawaguchi, Kawaguchi-shi 332-8558, Japan
- Ichiro Torigoe: Department of Orthopedic Surgery, Saiseikai Kawaguchi General Hospital, 5-11-5 Nishikawaguchi, Kawaguchi-shi 332-8558, Japan
- Masaki Tomori: Department of Orthopedic Surgery, Saiseikai Kawaguchi General Hospital, 5-11-5 Nishikawaguchi, Kawaguchi-shi 332-8558, Japan
- Kyohei Sakaki: Department of Orthopedic Surgery, Saiseikai Kawaguchi General Hospital, 5-11-5 Nishikawaguchi, Kawaguchi-shi 332-8558, Japan
- Takashi Hirai: Department of Orthopedic Surgery, Tokyo Medical and Dental University, 1-5-45 Yushima, Bunkyo Ward, Tokyo 113-8519, Japan
- Satoru Egawa: Department of Orthopedic Surgery, Tokyo Medical and Dental University, 1-5-45 Yushima, Bunkyo Ward, Tokyo 113-8519, Japan
- Yutaka Kobayashi: Department of Orthopedic Surgery, Saiseikai Kawaguchi General Hospital, 5-11-5 Nishikawaguchi, Kawaguchi-shi 332-8558, Japan
- Atsushi Okawa: Department of Orthopedic Surgery, Tokyo Medical and Dental University, 1-5-45 Yushima, Bunkyo Ward, Tokyo 113-8519, Japan
- Toshitaka Yoshii: Department of Orthopedic Surgery, Tokyo Medical and Dental University, 1-5-45 Yushima, Bunkyo Ward, Tokyo 113-8519, Japan
14
Kshetrapal A, McBride ME, Mannarino C. Taking the Pulse of the Current State of Simulation. Crit Care Clin 2023; 39:373-384. [PMID: 36898780 DOI: 10.1016/j.ccc.2022.09.011]
Abstract
Simulation in health-care professions has grown in the last few decades. We provide an overview of the history of simulation in other fields, the trajectory of simulation in health professions education, and research in medical education, including the learning theories and tools to assess and evaluate simulation programs. We also propose future directions for simulation and research in health professions education.
Affiliation(s)
- Anisha Kshetrapal: Department of Pediatrics, Division of Emergency Medicine, Ann & Robert H Lurie Children's Hospital of Chicago, 225 East Chicago Avenue, Box 62, Chicago, IL 60611, USA
- Mary E McBride: Department of Pediatrics, Divisions of Cardiology and Critical Care Medicine, Ann & Robert H Lurie Children's Hospital of Chicago, 225 East Chicago Avenue, Box 62, Chicago, IL 60611, USA
- Candace Mannarino: Department of Pediatrics, Divisions of Cardiology and Critical Care Medicine, Ann & Robert H Lurie Children's Hospital of Chicago, 225 East Chicago Avenue, Box 62, Chicago, IL 60611, USA
15
Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023; 85:102757. [PMID: 36706637 DOI: 10.1016/j.media.2023.102757]
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main player in the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016, until the year of 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy including use case, technical methodology for registration and tracking, data sources, visualization as well as validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, where AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interactions, usability and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, nevertheless only few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, to pave the way for novel, innovative directions and translation into the medical routine.
Affiliation(s)
- Christina Gsaxner: Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jianning Li: Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe: Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin: Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek: Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg: Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger: Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
16
Costa M, Pierre C, Vivanco-Suarez J, Baldoncini M, Tymchak Z, Patel A, Monteith SJ. Head-Mounted Augmented Reality in the Planning of Cerebrovascular Neurosurgical Procedures: A Single-Center Initial Experience. World Neurosurg 2023; 171:e693-e706. [PMID: 36566980 DOI: 10.1016/j.wneu.2022.12.086]
Abstract
BACKGROUND Augmented reality (AR) technology has played an increasing role in cerebrovascular neurosurgery over the last 2 decades. Hence, we aim to evaluate the technical and educational value of head-mounted AR in cerebrovascular procedures. METHODS This is a single-center retrospective study of patients who underwent open surgery for cranial and spinal cerebrovascular lesions between April and August 2022. In all cases, the Medivis Surgical AR platform and HoloLens 2 were used for preoperative and intraoperative (preincision) planning. Surgical plan adjustment due to the use of head-mounted AR and subjective educational value of the tool were recorded. RESULTS A total of 33 patients and 35 cerebrovascular neurosurgical procedures were analyzed. Procedures included 12 intracranial aneurysm clippings, 6 brain and 1 spinal arteriovenous malformation resections, 2 cranial dural arteriovenous fistula obliterations, 3 carotid endarterectomies, 2 extracranial-intracranial direct bypasses, 2 encephaloduroangiosynostoses for Moyamoya disease, 1 biopsy of the superficial temporal artery, 2 microvascular decompressions, 2 cavernoma resections, 1 combined intracranial aneurysm clipping and encephaloduroangiosynostosis for Moyamoya disease, and 1 percutaneous feeder catheterization for arteriovenous malformation embolization. Minor changes in the surgical plan were recorded in 16 of 35 procedures (45.7%). Subjective educational value was scored as "very helpful" for cranial and spinal arteriovenous malformations and carotid endarterectomies; "helpful" for intracranial aneurysm, dural arteriovenous fistulas, direct bypass, encephaloduroangiosynostosis, and superficial temporal artery biopsy; and "not helpful" for cavernoma resection and microvascular decompression. CONCLUSIONS Head-mounted AR can be used in cerebrovascular neurosurgery as an adjunctive tool that might influence surgical strategy, enable 3-dimensional understanding of complex anatomy, and provide great educational value in selected cases.
Affiliation(s)
- Matias Costa: Cerebrovascular Neurosurgery, Swedish Neuroscience Institute, Swedish Medical Center, Seattle, Washington, USA
- Clifford Pierre: Cerebrovascular Neurosurgery, Swedish Neuroscience Institute, Swedish Medical Center, Seattle, Washington, USA
- Juan Vivanco-Suarez: Cerebrovascular Neurosurgery, Swedish Neuroscience Institute, Swedish Medical Center, Seattle, Washington, USA
- Matias Baldoncini: Department of Neurological Surgery, Hospital San Fernando, Argentina
- Zane Tymchak: Cerebrovascular Neurosurgery, Swedish Neuroscience Institute, Swedish Medical Center, Seattle, Washington, USA
- Akshal Patel: Cerebrovascular Neurosurgery, Swedish Neuroscience Institute, Swedish Medical Center, Seattle, Washington, USA
- Stephen J Monteith: Cerebrovascular Neurosurgery, Swedish Neuroscience Institute, Swedish Medical Center, Seattle, Washington, USA
17
Dinh A, Tseng E, Yin AL, Estrin D, Greenwald P, Fortenko A. Perceptions of Augmented Reality in Remote Medical Care: Interview Study of Emergency Telemedicine Providers. JMIR Form Res 2023; 7:e45211. [PMID: 36976628 PMCID: PMC10131657 DOI: 10.2196/45211]
Abstract
BACKGROUND Augmented reality (AR) and virtual reality (VR) have increasingly appeared in the medical literature in the past decade, with AR recently being studied for its potential role in remote health care delivery and communication. Recent literature describes AR's implementation in real-time telemedicine contexts across multiple specialties and settings, with remote emergency services in particular using AR to enhance disaster support and simulation education. Despite the introduction of AR in the medical literature and its potential to shape the future of remote medical services, studies have yet to investigate the perspectives of telemedicine providers regarding this novel technology. OBJECTIVE This study aimed to understand the applications and challenges of AR in telemedicine anticipated by emergency medicine providers with a range of experiences in using telemedicine and AR or VR technology. METHODS Across 10 academic medical institutions, 21 emergency medicine providers with variable exposures to telemedicine and AR or VR technology were recruited for semistructured interviews via snowball sampling. The interview questions focused on various potential uses of AR, anticipated obstacles that prevent its implementation in the telemedicine area, and how providers and patients might respond to its introduction. We included video demonstrations of a prototype using AR during the interviews to elicit more informed and complete insights regarding AR's potential in remote health care. Interviews were transcribed and analyzed via thematic coding. RESULTS Our study identified 2 major areas of use for AR in telemedicine. First, AR is perceived to facilitate information gathering by enhancing observational tasks such as visual examination and granting simultaneous access to data and remote experts. Second, AR is anticipated to supplement distance learning of both minor and major procedures and nonprocedural skills such as cue recognition and empathy for patients and trainees. AR may also supplement long-distance education programs and thereby support less specialized medical facilities. However, the addition of AR may exacerbate the preexisting financial, structural, and literacy barriers to telemedicine. Providers seek value demonstrated by extensive research on the clinical outcome, satisfaction, and financial benefits of AR. They also seek institutional support and early training before adopting novel tools such as AR. Although an overall mixed reception is anticipated, consumer adoption and awareness are key components in AR's adoption. CONCLUSIONS AR has the potential to enhance the ability to gather observational and medical information, which would serve a diverse set of applications in remote health care delivery and education. However, AR faces obstacles similar to those faced by the current telemedicine technology, such as lack of access, infrastructure, and familiarity. This paper discusses the potential areas of investigation that would inform future studies and approaches to implementing AR in telemedicine.
Affiliation(s)
- Alana Dinh: Medical College, Weill Cornell Medicine, New York, NY, United States
- Emily Tseng: Department of Information Science, Cornell Tech, New York, NY, United States
- Andrew Lukas Yin: Department of Medicine, Weill Cornell Medicine, New York, NY, United States
- Deborah Estrin: Department of Computer Science, Cornell Tech, New York, NY, United States
- Peter Greenwald: Emergency Medicine, NewYork-Presbyterian Hospital, New York, NY, United States
- Alexander Fortenko: Emergency Medicine, NewYork-Presbyterian Hospital, New York, NY, United States
18
Long Y, Li C, Dou Q. Robotic surgery remote mentoring via AR with 3D scene streaming and hand interaction. Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization 2022. [DOI: 10.1080/21681163.2022.2145498]
Affiliation(s)
- Yonghao Long: Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong SAR, China
- Chengkun Li: Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong SAR, China
- Qi Dou: Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong SAR, China
19
The intraoperative use of augmented and mixed reality technology to improve surgical outcomes: A systematic review. Int J Med Robot 2022; 18:e2450. [DOI: 10.1002/rcs.2450] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/14/2022] [Revised: 07/23/2022] [Accepted: 07/27/2022] [Indexed: 11/07/2022]
20
Mandal P, Ambade R. Surgery Training and Simulation Using Virtual and Augmented Reality for Knee Arthroplasty. Cureus 2022; 14:e28823. [PMID: 36225417 PMCID: PMC9535617 DOI: 10.7759/cureus.28823] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/09/2022] [Accepted: 09/06/2022] [Indexed: 11/28/2022] Open
Abstract
A range of extended reality technologies, including immersive virtual reality (IVR), augmented reality (AR), and mixed reality, has recently gained favour in orthopaedics. This review examines the use of extended reality technology in knee arthroplasty. Virtual reality (VR) and AR are commonly used together in orthopaedic surgical training, as immersive training outside the operating theatre is recognized as a valuable surgical training tool. The consequences of this technology for orthopaedic surgeons and their patients, along with its ethical and practical issues, are also covered. Head-mounted displays (HMDs) are a promising addition aimed at improving surgical precision and instruction. Although the hardware is cutting-edge, substantial work remains to develop software that enables seamless, trustworthy integration into clinical practice and training. Remote virtual rehabilitation has drawn increasing attention in recent years, and its significance has grown in light of the COVID-19 pandemic. Numerous medical sectors have demonstrated the benefits of telerehabilitation, gamification, VR, and AR; given the rising demand for orthopaedic therapy and its rising costs, this is a necessity. Using AR technology, a remote surgeon can impart knowledge without being present by virtually placing his or her hands in the visual field of a local surgeon. With this innovation, orthopaedic surgery appears to have joined the telemedicine revolution, and the technology may also shape how surgeons collaborate and how orthopaedic residents train in the future. Volatility in the HMD market, however, will probably stall improvements in surgical education.
21
Spinczyk D. Measuring Respiratory Motion for Supporting the Minimally Invasive Destruction of Liver Tumors. SENSORS (BASEL, SWITZERLAND) 2022; 22:6446. [PMID: 36080904 PMCID: PMC9460029 DOI: 10.3390/s22176446] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 07/28/2022] [Revised: 08/19/2022] [Accepted: 08/25/2022] [Indexed: 06/15/2023]
Abstract
OBJECTIVE Destroying liver tumors is a challenge for contemporary interventional radiology. The aim of this work is to compare different techniques used for the measurement of respiratory motion, as this is the main hurdle to the effective implementation of this therapy. METHODS Laparoscopic stereoscopic reconstruction of point displacements on the surface of the liver, observation of breathing using external markers placed on the surface of the abdominal cavity, and methods for registration of the surface of the abdominal cavity during breathing were implemented and evaluated. RESULTS The following accuracies were obtained: above 4 mm, 0.5 mm, and below 8 mm for the laparoscopic, skin-marker, and skin-surface registration methods, respectively. CONCLUSIONS The clinical techniques and accompanying imaging modalities employed to destroy liver tumors, as well as the advantages and limitations of the proposed methods, are presented. Further directions for their development are also indicated.
Affiliation(s)
- Dominik Spinczyk: Faculty of Biomedical Engineering, Silesian University of Technology, 40 Roosevelta, 41-800 Zabrze, Poland
22
Spijkerboer KG, Fitski M, Siepel FJ, van de Ven CP, van der Steeg AF. Augmented reality-guided localization of a chest wall tumor in a pediatric patient. Eur J Cancer 2022; 170:103-105. [DOI: 10.1016/j.ejca.2022.04.023] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2022] [Revised: 04/10/2022] [Accepted: 04/14/2022] [Indexed: 11/26/2022]
23
Pose-Díez-de-la-Lastra A, Moreta-Martinez R, García-Sevilla M, García-Mato D, Calvo-Haro JA, Mediavilla-Santos L, Pérez-Mañanes R, von Haxthausen F, Pascau J. HoloLens 1 vs. HoloLens 2: Improvements in the New Model for Orthopedic Oncological Interventions. SENSORS 2022; 22:s22134915. [PMID: 35808407 PMCID: PMC9269857 DOI: 10.3390/s22134915] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 04/27/2022] [Revised: 06/20/2022] [Accepted: 06/27/2022] [Indexed: 11/16/2022]
Abstract
This work analyzed the use of Microsoft HoloLens 2 in orthopedic oncological surgeries and compares it to its predecessor (Microsoft HoloLens 1). Specifically, we developed two equivalent applications, one for each device, and evaluated the augmented reality (AR) projection accuracy in an experimental scenario using phantoms based on two patients. We achieved automatic registration between virtual and real worlds using patient-specific surgical guides on each phantom. They contained a small adaptor for a 3D-printed AR marker, the characteristic patterns of which were easily recognized using both Microsoft HoloLens devices. The newest model improved the AR projection accuracy by almost 25%, and both of them yielded an RMSE below 3 mm. After ascertaining the enhancement of the second model in this aspect, we went a step further with Microsoft HoloLens 2 and tested it during the surgical intervention of one of the patients. During this experience, we collected the surgeons’ feedback in terms of comfortability, usability, and ergonomics. Our goal was to estimate whether the improved technical features of the newest model facilitate its implementation in actual surgical scenarios. All of the results point to Microsoft HoloLens 2 being better in all the aspects affecting surgical interventions and support its use in future experiences.
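The projection accuracy in this study is reported as an RMSE between corresponding virtual and real points after registration. As illustrative background only (not the authors' implementation, which used patient-specific guides and AR markers), a minimal sketch of how paired-point rigid registration and its RMSE are conventionally computed, using the standard SVD-based (Kabsch) solution:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst (Kabsch/SVD)."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)   # 3x3 cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

def registration_rmse(src, dst, R, t):
    """Root-mean-square distance between transformed src points and dst points."""
    residuals = dst - (src @ R.T + t)
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))
```

With noise-free correspondences this RMSE is numerically zero; clinically reported values such as the sub-3 mm RMSE above additionally absorb tracking, display, and marker-placement errors.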
Affiliation(s)
- Alicia Pose-Díez-de-la-Lastra: Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Rafael Moreta-Martinez: Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Mónica García-Sevilla: Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- David García-Mato: Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- José Antonio Calvo-Haro: Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain; Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, 28007 Madrid, Spain; Departamento de Cirugía, Facultad de Medicina, Universidad Complutense de Madrid, 28040 Madrid, Spain
- Lydia Mediavilla-Santos: Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain; Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, 28007 Madrid, Spain; Departamento de Cirugía, Facultad de Medicina, Universidad Complutense de Madrid, 28040 Madrid, Spain
- Rubén Pérez-Mañanes: Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain; Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, 28007 Madrid, Spain; Departamento de Cirugía, Facultad de Medicina, Universidad Complutense de Madrid, 28040 Madrid, Spain
- Felix von Haxthausen: Institute for Robotics and Cognitive Systems, University of Lübeck, 23562 Lübeck, Germany
- Javier Pascau: Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain (corresponding author; Tel.: +34-91-624-8196)
24
A Dedicated Tool for Presurgical Mapping of Brain Tumors and Mixed-Reality Navigation During Neurosurgery. J Digit Imaging 2022; 35:704-713. [PMID: 35230562 PMCID: PMC9156583 DOI: 10.1007/s10278-022-00609-8] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/13/2021] [Revised: 02/03/2022] [Accepted: 02/05/2022] [Indexed: 12/15/2022] Open
Abstract
Brain tumor surgery requires a delicate tradeoff between complete removal of neoplastic tissue while minimizing loss of brain function. Functional magnetic resonance imaging (fMRI) and diffusion tensor imaging (DTI) have emerged as valuable tools for non-invasive assessment of human brain function and are now used to determine brain regions that should be spared to prevent functional impairment after surgery. However, image analysis requires different software packages, mainly developed for research purposes and often difficult to use in a clinical setting, preventing large-scale diffusion of presurgical mapping. We developed a specialized software able to implement an automatic analysis of multimodal MRI presurgical mapping in a single application and to transfer the results to the neuronavigator. Moreover, the imaging results are integrated in a commercially available wearable device using an optimized mixed-reality approach, automatically anchoring 3-dimensional holograms obtained from MRI with the physical head of the patient. This will allow the surgeon to virtually explore deeper tissue layers highlighting critical brain structures that need to be preserved, while retaining the natural oculo-manual coordination. The enhanced ergonomics of this procedure will significantly improve accuracy and safety of the surgery, with large expected benefits for health care systems and related industrial investors.
25
Montemurro N, Condino S, Carbone M, Cattari N, D’Amato R, Cutolo F, Ferrari V. Brain Tumor and Augmented Reality: New Technologies for the Future. INTERNATIONAL JOURNAL OF ENVIRONMENTAL RESEARCH AND PUBLIC HEALTH 2022; 19:6347. [PMID: 35627884 PMCID: PMC9141435 DOI: 10.3390/ijerph19106347] [Citation(s) in RCA: 18] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Subscribe] [Scholar Register] [Received: 05/19/2022] [Accepted: 05/22/2022] [Indexed: 12/26/2022]
Abstract
In recent years, huge progress has been made in the management of brain tumors, due to the availability of imaging devices, which provide fundamental anatomical and pathological information not only for diagnostic purposes [...].
Affiliation(s)
- Nicola Montemurro: Department of Neurosurgery, Azienda Ospedaliera Universitaria Pisana (AOUP), University of Pisa, 56100 Pisa, Italy
- Sara Condino: Department of Information Engineering, University of Pisa, 56100 Pisa, Italy; EndoCAS Center for Computer-Assisted Surgery, 56100 Pisa, Italy
- Marina Carbone: EndoCAS Center for Computer-Assisted Surgery, 56100 Pisa, Italy
- Nadia Cattari: EndoCAS Center for Computer-Assisted Surgery, 56100 Pisa, Italy; Department of Translational Research, University of Pisa, 56100 Pisa, Italy
- Renzo D’Amato: Department of Information Engineering, University of Pisa, 56100 Pisa, Italy; EndoCAS Center for Computer-Assisted Surgery, 56100 Pisa, Italy
- Fabrizio Cutolo: Department of Information Engineering, University of Pisa, 56100 Pisa, Italy; EndoCAS Center for Computer-Assisted Surgery, 56100 Pisa, Italy
- Vincenzo Ferrari: Department of Information Engineering, University of Pisa, 56100 Pisa, Italy; EndoCAS Center for Computer-Assisted Surgery, 56100 Pisa, Italy
26
Kalaiarasan K, Prathap L, Ayyadurai M, Subhashini P, Tamilselvi T, Avudaiappan T, Infant Raj I, Alemayehu Mamo S, Mezni A. Clinical Application of Augmented Reality in Computerized Skull Base Surgery. EVIDENCE-BASED COMPLEMENTARY AND ALTERNATIVE MEDICINE : ECAM 2022; 2022:1335820. [PMID: 35600956 PMCID: PMC9117015 DOI: 10.1155/2022/1335820] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/28/2022] [Accepted: 04/19/2022] [Indexed: 12/02/2022]
Abstract
Skull base surgery involves the manipulation of small, complex structures across the fields of otology, rhinology, neurosurgery, and maxillofacial surgery, with critical nerves and vessels lying in close proximity to these structures. Augmented reality is an emerging technology that could transform skull base approaches by presenting essential anatomical and navigational information in a single display. However, awareness and acceptance of what augmented reality systems can offer in the skull base region remain low. This article examines the usefulness of augmented reality systems in skull base surgery and highlights the obstacles facing the current technology and the adaptations it may require. A technical perspective on the distinct strategies used in developing an augmented reality framework is also offered. Recent developments reflect growing interest in augmented reality systems that may enable safer and more practical procedures. However, several concerns must be addressed before such systems can be broadly incorporated into routine practice.
Affiliation(s)
- K. Kalaiarasan: Department of Information Technology, M. Kumarasamy College of Engineering, Karur, India
- Lavanya Prathap: Department of Anatomy, Saveetha Dental College and Hospital, Saveetha Institute of Medical and Technical Sciences, Chennai, Tamil Nadu 600077, India
- M. Ayyadurai: SG, Institute of ECE, Saveetha School of Engineering, SIMATS, Chennai, Tamil Nadu 600077, India
- P. Subhashini: Department of Computer Science and Engineering, J.N.N Institute of Engineering, Kannigaipair, Tamil Nadu 601102, India
- T. Tamilselvi: Department of Computer Science and Engineering, Panimalar Institute of Technology, Varadarajapuram, Tamil Nadu 600123, India
- T. Avudaiappan: Computer Science and Engineering, K. Ramakrishnan College of Technology, Trichy 621112, India
- I. Infant Raj: Department of Computer Science and Engineering, K. Ramakrishnan College of Engineering, Trichy, India
- Samson Alemayehu Mamo: Department of Electrical and Computer Engineering, Faculty of Electrical and Biomedical Engineering, Institute of Technology, Hawassa University, Awasa, Ethiopia
- Amine Mezni: Department of Chemistry, College of Science, Taif University, P.O. Box 11099, Taif 21944, Saudi Arabia
27
Thabit A, Benmahdjoub M, van Veelen MLC, Niessen WJ, Wolvius EB, van Walsum T. Augmented reality navigation for minimally invasive craniosynostosis surgery: a phantom study. Int J Comput Assist Radiol Surg 2022; 17:1453-1460. [PMID: 35507209 PMCID: PMC9307543 DOI: 10.1007/s11548-022-02634-y] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/22/2022] [Accepted: 04/05/2022] [Indexed: 11/26/2022]
Abstract
Purpose In minimally invasive spring-assisted craniectomy, surgeons plan the surgery by manually locating the cranial sutures. However, this approach is prone to error. Augmented reality (AR) could be used to visualize the cranial sutures and assist in the surgery planning. The purpose of our work is to develop an AR-based system to visualize cranial sutures, and to assess the accuracy and usability of AR-based navigation for surgical guidance in minimally invasive spring-assisted craniectomy. Methods An AR system was developed that consists of an electromagnetic tracking system linked with a Microsoft HoloLens. The system was used to conduct a study with two skull phantoms. For each phantom, five sutures were annotated and visualized on the skull surface. Twelve participants assessed the system. For each participant, model alignment using six anatomical landmarks was performed, followed by the participant's delineation of the visualized sutures. At the end, the participants filled in a system usability scale (SUS) questionnaire. For evaluation, an independent optical tracking system was used and the delineated sutures were digitized and compared to the CT-annotated sutures. Results For a total of 120 delineated sutures, the distance of the annotated sutures to the planning reference was 2.4 ± 1.2 mm. The average delineation time per suture was 13 ± 5 s. For the system usability questionnaire, an average SUS score of 73 was obtained. Conclusion The developed AR system has good accuracy (average distance of 2.4 mm) and could be used in the OR. The system can assist in the pre-planning of minimally invasive craniosynostosis surgeries to locate cranial sutures accurately instead of the traditional approach of manual palpation. Although the conducted phantom study was designed to closely reflect the clinical setup in the OR, further clinical validation of the developed system is needed and will be addressed in future work. Supplementary Information The online version contains supplementary material available at 10.1007/s11548-022-02634-y.
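The usability figure quoted above comes from the System Usability Scale, which maps ten alternating positively and negatively worded 5-point Likert items onto a 0-100 score. As background, a minimal sketch of the conventional SUS scoring rule (Brooke's original scheme; illustrative only, not the study's analysis code):

```python
def sus_score(responses):
    """Score one SUS questionnaire: ten Likert responses, each in 1..5.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The summed contributions (0..40) are scaled by 2.5 to give 0..100.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses on a 1-5 scale")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5
```

A mean of 73, as reported above, sits above the commonly cited benchmark of 68 for average usability.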
Affiliation(s)
- Abdullah Thabit: Department of Radiology and Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands; Department of Oral and Maxillofacial Surgery, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- Mohamed Benmahdjoub: Department of Radiology and Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands; Department of Oral and Maxillofacial Surgery, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- Marie-Lise C. van Veelen: Department of Neurosurgery, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- Wiro J. Niessen: Department of Radiology and Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands; Department of Imaging Physics, Faculty of Applied Sciences, Delft University of Technology, Delft, The Netherlands
- Eppo B. Wolvius: Department of Oral and Maxillofacial Surgery, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- Theo van Walsum: Department of Radiology and Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
28
Gregory T, Gregory J, Dacheux C, Hurst SA. Surgeon experience of mixed reality headset technology during the COVID-19 pandemic: a multicenter international case series in orthopedic surgery. BMJ SURGERY, INTERVENTIONS, & HEALTH TECHNOLOGIES 2022; 4:e000127. [PMID: 35637758 PMCID: PMC9130665 DOI: 10.1136/bmjsit-2021-000127] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/19/2021] [Accepted: 01/10/2022] [Indexed: 11/16/2022] Open
Affiliation(s)
- Thomas Gregory: Universite Sorbonne Paris Nord - Campus de Bobigny, Paris, Île-de-France, France
- Jules Gregory: Beaujon Hospital Department of Medical Imaging, Clichy, Île-de-France, France
- Charles Dacheux: Universite Sorbonne Paris Nord - Campus de Bobigny, Paris, Île-de-France, France
- Simon Alexander Hurst: Universite Sorbonne Paris Nord - Campus de Bobigny, Paris, Île-de-France, France; Department of Trauma & Orthopaedic Surgery, St Mary's Hospital Campus, Imperial College, London, UK
29
Vandevoorde K, Vollenkemper L, Schwan C, Kohlhase M, Schenck W. Using Artificial Intelligence for Assistance Systems to Bring Motor Learning Principles into Real World Motor Tasks. SENSORS 2022; 22:s22072481. [PMID: 35408094 PMCID: PMC9002555 DOI: 10.3390/s22072481] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/18/2022] [Revised: 03/18/2022] [Accepted: 03/20/2022] [Indexed: 11/03/2022]
Abstract
Humans learn movements naturally, but it takes a lot of time and training to achieve expert performance in motor skills. In this review, we show how modern technologies can support people in learning new motor skills. First, we introduce important concepts in motor control, motor learning and motor skill learning. We also give an overview about the rapid expansion of machine learning algorithms and sensor technologies for human motion analysis. The integration between motor learning principles, machine learning algorithms and recent sensor technologies has the potential to develop AI-guided assistance systems for motor skill training. We give our perspective on this integration of different fields to transition from motor learning research in laboratory settings to real world environments and real world motor tasks and propose a stepwise approach to facilitate this transition.
30
Cannizzaro D, Zaed I, Safa A, Jelmoni AJM, Composto A, Bisoglio A, Schmeizer K, Becker AC, Pizzi A, Cardia A, Servadei F. Augmented Reality in Neurosurgery, State of Art and Future Projections. A Systematic Review. Front Surg 2022; 9:864792. [PMID: 35360432 PMCID: PMC8961734 DOI: 10.3389/fsurg.2022.864792] [Citation(s) in RCA: 36] [Impact Index Per Article: 18.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/28/2022] [Accepted: 02/11/2022] [Indexed: 01/13/2023] Open
Abstract
Background The use of augmented reality (AR) is growing in medical education, in particular in radiology and surgery. AR has the potential to become a strategic component of neurosurgical training courses. In fact, over the years, there has been a progressive increase in the application of AR in the various fields of neurosurgery. In this study, the authors aim to define the diffusion of these augmented reality systems in recent years. This study describes future trends in augmented reality for neurosurgeons. Methods A systematic review of the literature was conducted to identify research published from December 1st, 2011 to November 30th, 2021. Electronic databases (PubMed, PubMed Central, and Scopus) were screened. The methodological quality of studies and extracted data were assessed for “augmented reality” and “neurosurgery”. The data analysis focused on the geographical distribution, temporal evolution, and topic of augmented reality in neurosurgery. Results A total of 198 studies were included. The number of augmented reality applications in the neurosurgical field has increased during the last 10 years. The main topics to which it is applied are spine surgery, neuronavigation, and education. The geographical distribution shows extensive use of augmented reality in the USA, Germany, China, and Canada. North America is the continent that uses augmented reality the most in the training and education of medical students, residents, and surgeons, besides making the greatest research contribution in spine surgery, brain oncology, and surgical planning. AR is also extensively used in Asia for intraoperative navigation. Nevertheless, augmented reality is still far from reaching Africa and other countries with limited facilities, as no publications could be retrieved from our search. Conclusions The use of AR has increased significantly over the last 10 years. Nowadays it is mainly used in spine surgery and for neurosurgical education, especially in North America, Europe, and China. Continued growth, also in other aspects of the specialty, is expected in the near future.
Affiliation(s)
- Delia Cannizzaro: Department of Neurosurgery, IRCCS Humanitas Research Hospital, Rozzano, Italy; Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy
- Ismail Zaed: Department of Neurosurgery, IRCCS Humanitas Research Hospital, Rozzano, Italy; Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy (corresponding author)
- Adrian Safa: Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy
- Alice J. M. Jelmoni: Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy
- Antonio Composto: Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy
- Andrea Bisoglio: Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy
- Kyra Schmeizer: Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy
- Ana C. Becker: Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy
- Andrea Pizzi: Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy
- Andrea Cardia: Department of Neurosurgery, IRCCS Humanitas Research Hospital, Rozzano, Italy; Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy
- Franco Servadei: Department of Neurosurgery, IRCCS Humanitas Research Hospital, Rozzano, Italy; Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy
31
Ahmad HS, Yoon JW. Intra-operative wearable visualization in spine surgery: past, present, and future. JOURNAL OF SPINE SURGERY (HONG KONG) 2022; 8:132-138. [PMID: 35441103 PMCID: PMC8990397 DOI: 10.21037/jss-21-95] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/20/2021] [Accepted: 01/27/2022] [Indexed: 04/15/2023]
Abstract
The history of modern surgery has run parallel to the invention and development of intra-operative visualization techniques. The first operating room, built in 1804 at Pennsylvania Hospital, demonstrates this principle: illumination of the surgical field by the Sun through an overhead skylight allowed surgeries to proceed even before the invention of anesthesia or sterile technique. Surgeries were restricted to begin around when the Sun was at its zenith; without adequate light from the Sun and skylight, surgeons were unable to achieve adequate visualization. In the years since, new visualization instruments have expanded the scope and success of surgical intervention. Spine surgery in particular has benefited greatly from improved visualization technologies, given the closely intertwined nervous, vascular, and musculoskeletal structures that surgeons must manipulate. Over time, new technologies have also come to occupy smaller footprints, leading to the rise of wearable tools that surgeons don intra-operatively to better visualize the surgical field. As surgical techniques shift to more minimally invasive methods, reliable, high-fidelity, and ergonomic wearables are of growing importance. Here, we discuss the past and present of wearable visualization tools, from the first surgical loupes to cutting-edge augmented reality (AR) goggles, and comment on how emerging innovations will continue to revolutionize spine surgery.
Affiliation(s)
- Hasan S Ahmad: Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
- Jang W Yoon: Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
32
Gadodia G, Yanof J, Hanlon A, Bustos S, Weunski C, West K, Martin C. Early Clinical Feasibility Evaluation of an Augmented Reality Platform for Guidance and Navigation during Percutaneous Tumor Ablation. J Vasc Interv Radiol 2022; 33:333-338. [PMID: 35221048 DOI: 10.1016/j.jvir.2021.11.014] [Citation(s) in RCA: 7] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/22/2021] [Revised: 09/30/2021] [Accepted: 11/23/2021] [Indexed: 10/19/2022] Open
Abstract
An augmented reality platform with a head-mounted display and electromagnetic tracking of instruments was developed for percutaneous procedural guidance. Earlier work had demonstrated bench and first-in-human feasibility of the platform. This report further evaluated the clinical usability and benefits of this technology. The platform was used in 12 patients who had been referred for percutaneous thermal ablation of abdominal soft tissue tumors. In 10 cases, the intraprocedural holographic guidance agreed with the standard imaging guidance. The evaluation was limited in 2 cases because of anatomic and workflow issues. Overall, this series demonstrated the clinical feasibility of this platform and the potential benefits of its use in percutaneous procedures.
Affiliation(s)
- Gaurav Gadodia: Department of Radiology, Section of Interventional Radiology, Cleveland Clinic, Cleveland, Ohio
- Sara Bustos: Department of Biomedical Engineering, Lerner Research Institute, Cleveland Clinic, Cleveland, Ohio
- Karl West: Department of Biomedical Engineering, Lerner Research Institute, Cleveland Clinic, Cleveland, Ohio
- Charles Martin: Department of Radiology, Section of Interventional Radiology, Cleveland Clinic, Cleveland, Ohio; Department of Biomedical Engineering, Lerner Research Institute, Cleveland Clinic, Cleveland, Ohio. https://twitter.com/chuckmartin3md
|
33
|
Aguilar-Salinas P, Gutierrez-Aguirre SF, Avila MJ, Nakaji P. Current status of augmented reality in cerebrovascular surgery: a systematic review. Neurosurg Rev 2022; 45:1951-1964. [PMID: 35149900 DOI: 10.1007/s10143-022-01733-3]
Abstract
Augmented reality (AR) is an adjuvant tool in neuronavigation that improves spatial and anatomic understanding. The present review aims to describe the current status of intraoperative AR for the treatment of cerebrovascular pathology. A systematic review was conducted following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The following databases were searched up to December 2020: PubMed, Science Direct, Web of Science, and EMBASE. The search strategy consisted of "augmented reality," "AR," "cerebrovascular," "navigation," "neurovascular," "neurosurgery," and "endovascular" in both AND and OR combinations. Studies included were original research articles with intraoperative application. The manuscripts were thoroughly examined for study design, outcomes, and results. Sixteen studies were identified describing the use of intraoperative AR in the treatment of cerebrovascular pathology. A total of 172 patients were treated for 190 cerebrovascular lesions using intraoperative AR. The most commonly treated pathology was intracranial aneurysm. Most studies were case series; only one was a case-control study. A head-up display system in the microscope was the most common AR display. AR was found to be useful for tailoring the craniotomy, dural opening, and proper identification of donor and recipient vessels in vascular bypass. Most AR systems were unable to account for tissue deformation. This systematic review suggests that intraoperative AR is becoming a promising and feasible adjunct in the treatment of cerebrovascular pathology. It has been found to be a useful tool in preoperative planning and intraoperative guidance, but its clinical benefits remain to be seen.
Affiliation(s)
- Pedro Aguilar-Salinas
- Department of Neurosurgery, Banner University Medical Center, University of Arizona, Tucson, AZ, USA
- Mauricio J Avila
- Department of Neurosurgery, Banner University Medical Center, University of Arizona, Tucson, AZ, USA
- Peter Nakaji
- Department of Neurosurgery, Banner University Medical Center, University of Arizona, 755 E. McDowell Rd, Phoenix, AZ, 85006, USA

34
Okachi S, Ito T, Sato K, Iwano S, Shinohara Y, Itoigawa H, Hashimoto N. Virtual Bronchoscopy-Guided Transbronchial Biopsy Simulation Using a Head-Mounted Display: A New Style of Flexible Bronchoscopy. Surg Innov 2022; 29:811-813. [PMID: 35000513 DOI: 10.1177/15533506211068928]
Abstract
Background/need. The increase in reference images and information during bronchoscopy using virtual bronchoscopic navigation (VBN) and fluoroscopy has created a potential need for support from a head-mounted display (HMD), because bronchoscopists find it difficult to see displays positioned at a distance and must turn their head and body in various directions. Methodology and device description. The binocular see-through Moverio BT-35E Smart Glasses can be connected via a high-definition multimedia interface and have a 720p high-definition display. We developed a system that converts fluoroscopic (live and reference), VBN, and bronchoscopic image signals through a converter and displays them on the Moverio BT-35E. Preliminary results. We performed a virtual bronchoscopy-guided transbronchial biopsy simulation using the system. Four experienced pulmonologists each performed a simulated bronchoscopy of 5 cases with the Moverio BT-35E glasses, using a bronchoscopy training model. For all procedures, the bronchoscope was advanced successfully into the target bronchus according to the VBN image. None of the operators reported eye or body fatigue during or after the procedure. Current status. This small-scale simulation study suggests the feasibility of using an HMD during bronchoscopy. For clinical use, the safety and usefulness of the system must be evaluated in larger clinical trials.
Affiliation(s)
- Shotaro Okachi
- Department of Respiratory Medicine, Nagoya University Graduate School of Medicine, Nagoya, Japan
- Takayasu Ito
- Department of Respiratory Medicine, Nagoya University Graduate School of Medicine, Nagoya, Japan
- Kazuhide Sato
- Department of Respiratory Medicine, Nagoya University Graduate School of Medicine, Nagoya, Japan
- Advanced Analytical and Diagnostic Imaging Center (AADIC)/Medical Engineering Unit (MEU), B3 Unit, Nagoya University Institute for Advanced Research, Nagoya, Japan
- Shingo Iwano
- Department of Radiology, Nagoya University Graduate School of Medicine, Nagoya, Japan
- Yuka Shinohara
- Department of Respiratory Medicine, Nagoya University Graduate School of Medicine, Nagoya, Japan
- Hideyuki Itoigawa
- Department of Respiratory Medicine, Nagoya University Graduate School of Medicine, Nagoya, Japan
- Naozumi Hashimoto
- Department of Respiratory Medicine, Nagoya University Graduate School of Medicine, Nagoya, Japan

35
Sparwasser P, Haack M, Frey L, Haferkamp A, Borgmann H. [Virtual and augmented reality in urology]. Urologe A 2021; 61:133-141. [PMID: 34935997 PMCID: PMC8693158 DOI: 10.1007/s00120-021-01734-y]
Abstract
Technological advances have always optimized medical care through its constant evolution, yet until now they have remained largely tangible to the user. Driven by immense financial investment, innovative products and technical solutions have emerged that are transforming everyday medical practice and will add a new dimension to it in the future: virtual and augmented reality. This review article summarizes current scientific projects and the future benefits of virtual and augmented reality in the field of urology.
Affiliation(s)
- P Sparwasser
- Department of Urology, University Medical Center, Johannes Gutenberg University, Langenbeckstr. 1, 55131, Mainz, Germany
- M Haack
- Department of Urology, University Medical Center, Johannes Gutenberg University, Langenbeckstr. 1, 55131, Mainz, Germany
- L Frey
- Department of Urology, University Medical Center, Johannes Gutenberg University, Langenbeckstr. 1, 55131, Mainz, Germany
- A Haferkamp
- Department of Urology, University Medical Center, Johannes Gutenberg University, Langenbeckstr. 1, 55131, Mainz, Germany
- H Borgmann
- Department of Urology, University Medical Center, Johannes Gutenberg University, Langenbeckstr. 1, 55131, Mainz, Germany

36
Simone M, Galati R, Barile G, Grasso E, De Luca R, Cartanese C, Lomonaco R, Ruggieri E, Albano A, Rucci A, Grassi G. Remote mentoring in laparotomic and laparoscopic cancer surgery during Covid-19 pandemic: an experimental setup based on mixed reality. Med Educ Online 2021; 26:1996923. [PMID: 34713779 PMCID: PMC8567891 DOI: 10.1080/10872981.2021.1996923]
Abstract
In this paper, Mixed Reality (MR) has been exploited in the operating room during laparoscopic and open surgery with the aim of providing remote mentoring to physicians in training during the Covid-19 pandemic. The employed architecture, which combines MR smartglasses, a Digital Imaging Player, and a Mixed Reality Toolkit, has been used for cancer surgery at the IRCCS Hospital 'Giovanni Paolo II' in southern Italy. The feasibility of using the platform for real-time remote mentoring was assessed on the basis of surveys distributed to the trainees after each surgery.
Affiliation(s)
- Michele Simone
- Department of Oncology Surgery, IRCCS Istituto Tumori “Giovanni Paolo II”, Bari, Italy
- Rocco Galati
- Department of Oncology Surgery, IRCCS Istituto Tumori “Giovanni Paolo II”, Bari, Italy
- Graziana Barile
- Department of Oncology Surgery, IRCCS Istituto Tumori “Giovanni Paolo II”, Bari, Italy
- Emanuele Grasso
- Department of Oncology Surgery, IRCCS Istituto Tumori “Giovanni Paolo II”, Bari, Italy
- Raffaele De Luca
- Department of Oncology Surgery, IRCCS Istituto Tumori “Giovanni Paolo II”, Bari, Italy
- Carmine Cartanese
- Department of Oncology Surgery, IRCCS Istituto Tumori “Giovanni Paolo II”, Bari, Italy
- Rocco Lomonaco
- Department of Oncology Surgery, IRCCS Istituto Tumori “Giovanni Paolo II”, Bari, Italy
- Eustachio Ruggieri
- Department of Oncology Surgery, IRCCS Istituto Tumori “Giovanni Paolo II”, Bari, Italy
- Anna Albano
- Department of Oncology Surgery, IRCCS Istituto Tumori “Giovanni Paolo II”, Bari, Italy
- Antonello Rucci
- Department of Oncology Surgery, IRCCS Istituto Tumori “Giovanni Paolo II”, Bari, Italy
- Giuseppe Grassi
- Dipartimento Ingegneria Innovazione, Università del Salento, Lecce, Italy

37
Bori E, Pancani S, Vigliotta S, Innocenti B. Validation and accuracy evaluation of automatic segmentation for knee joint pre-planning. Knee 2021; 33:275-281. [PMID: 34739958 DOI: 10.1016/j.knee.2021.10.016]
Abstract
BACKGROUND Proper use of three-dimensional (3D) models generated from medical imaging data in clinical preoperative planning, training, and consultation depends on first proving that the models accurately replicate the patient's anatomy. This study therefore investigated the dimensional accuracy of 3D reconstructions of the knee joint generated from computed tomography scans via automatic segmentation by comparing them with 3D models generated through manual segmentation. METHODS Three unpaired, fresh-frozen right legs were investigated. Three-dimensional models of the femur and the tibia of each leg were manually segmented using commercial software and compared in terms of geometrical accuracy with the 3D models automatically segmented using proprietary software. Bony landmarks were identified and used to calculate clinically relevant distances: femoral epicondylar distance; posterior femoral epicondylar distance; femoral trochlear groove length; and tibial knee center tubercle distance (TKCTD). Pearson's correlation coefficient and Bland-Altman plots were used to evaluate the level of agreement between measured distances. RESULTS Differences between parameters measured on manually and automatically segmented 3D models were below 1 mm (range: -0.06 to 0.72 mm), except for TKCTD (between 1.00 and 1.40 mm in two specimens). In addition, there was a significant strong correlation between measurements. CONCLUSIONS The results are comparable to those of previous studies investigating the accuracy of 3D bone reconstruction. Automatic segmentation techniques can be used to quickly reconstruct reliable 3D models of bone anatomy, and these results may help spread this technology in preoperative and operative settings, where it has shown considerable potential.
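The agreement analysis named in this abstract (Pearson's correlation coefficient plus Bland-Altman bias and limits of agreement) can be sketched in a few lines; the paired measurements below are hypothetical placeholders, not the study's data:

```python
from math import sqrt

def agreement_stats(manual, auto):
    """Pearson's r plus Bland-Altman bias and 95% limits of agreement
    for paired manual vs. automatic measurements of the same distance."""
    n = len(manual)
    mean_m, mean_a = sum(manual) / n, sum(auto) / n
    cov = sum((m - mean_m) * (a - mean_a) for m, a in zip(manual, auto)) / (n - 1)
    sd_m = sqrt(sum((m - mean_m) ** 2 for m in manual) / (n - 1))
    sd_a = sqrt(sum((a - mean_a) ** 2 for a in auto) / (n - 1))
    r = cov / (sd_m * sd_a)                      # Pearson's correlation
    diffs = [m - a for m, a in zip(manual, auto)]
    bias = sum(diffs) / n                        # mean difference (bias)
    sd_d = sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    loa = (bias - 1.96 * sd_d, bias + 1.96 * sd_d)  # 95% limits of agreement
    return r, bias, loa

# Hypothetical paired femoral epicondylar distances in mm (not study data)
manual = [75.2, 78.1, 80.4, 76.9, 79.3]
auto = [75.5, 78.6, 80.9, 77.2, 79.9]
r, bias, loa = agreement_stats(manual, auto)
```

A bias near zero with narrow limits of agreement indicates that the automatic segmentation tracks the manual reference closely.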
Affiliation(s)
- Edoardo Bori
- BEAMS Department, Université Libre de Bruxelles, Bruxelles, Belgium

38
Amantayeva A, Adilzhanova N, Issatayeva A, Blanc W, Molardi C, Tosi D. Fiber Optic Distributed Sensing Network for Shape Sensing-Assisted Epidural Needle Guidance. Biosensors 2021; 11:446. [PMID: 34821662 PMCID: PMC8615863 DOI: 10.3390/bios11110446]
Abstract
Epidural anesthesia is a pain management procedure that requires the insertion of a miniature needle into the epidural space located within the lumbar vertebrae. The use of a guidance system for manual insertion can reduce failure rates and increase the efficiency of the process. In this work, we present and experimentally assess a guidance system based on a network of fiber optic distributed sensors. The fibers are mounted externally to the needle, without blocking its inner channel, and through a strain-to-shape detection method they reconstruct the silhouette of the epidural device in real time (1 s). We experimentally assessed the shape sensing methods over 25 experiments performed in a phantom and observed that the sensing system correctly identified bending patterns typical of epidural insertions, characterized by the different stiffness of the tissues. By studying metrics related to the curvatures and their temporal changes, we provide identifiers that can potentially serve to distinguish (in)correct identification of the epidural space and support the operator through the insertion process by recognizing the bending patterns.
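The strain-to-shape idea behind this abstract can be illustrated with a minimal 2D sketch (not the authors' algorithm): two fibers mounted on opposite sides of the needle report distributed strain, the differential strain gives local curvature, and integrating curvature along the arc length recovers the needle silhouette. Geometry and strain values below are invented for illustration:

```python
import math

def reconstruct_shape(strain_a, strain_b, d, ds):
    """2D strain-to-shape reconstruction: differential strain between two
    fibers separated by distance d gives local curvature; integrating the
    curvature over arc-length steps ds yields the needle silhouette."""
    xs, ys = [0.0], [0.0]
    theta = 0.0  # tangent angle; needle starts along +x
    for ea, eb in zip(strain_a, strain_b):
        kappa = (ea - eb) / d   # curvature from differential strain
        theta += kappa * ds     # accumulate bend angle over one segment
        xs.append(xs[-1] + ds * math.cos(theta))
        ys.append(ys[-1] + ds * math.sin(theta))
    return xs, ys

# Hypothetical profile: a straight run, then bending in stiffer tissue
strain_a = [0.0] * 10 + [2.0e-4] * 10
strain_b = [0.0] * 10 + [-2.0e-4] * 10
xs, ys = reconstruct_shape(strain_a, strain_b, d=1.0e-3, ds=2.0e-3)
```

The onset of curvature in the reconstructed silhouette is exactly the kind of bending-pattern change the authors propose as an identifier of tissue transitions.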
Affiliation(s)
- Aida Amantayeva
- School of Engineering and Digital Sciences, Nazarbayev University, Nur-Sultan 010000, Kazakhstan
- Nargiz Adilzhanova
- School of Engineering and Digital Sciences, Nazarbayev University, Nur-Sultan 010000, Kazakhstan
- Aizhan Issatayeva
- Department of Engineering and Architecture, University of Parma, Parco Area delle Scienze 181/A, I-43124 Parma, Italy
- Wilfried Blanc
- Université Côte d’Azur, INPHYNI, CNRS UMR7010, Avenue Joseph Vallot, 06108 Nice, France
- Carlo Molardi
- School of Engineering and Digital Sciences, Nazarbayev University, Nur-Sultan 010000, Kazakhstan
- Daniele Tosi
- School of Engineering and Digital Sciences, Nazarbayev University, Nur-Sultan 010000, Kazakhstan
- National Laboratory Astana, Laboratory of Biosensors and Bioinstruments, Nur-Sultan 010000, Kazakhstan

39
Nicklaus KM, Wang H, Bordes MC, Zaharan A, Sampathkumar U, Cheong AL, Reece GP, Hanson SE, Merchant FA, Markey MK. Potential of Intraoperative 3D Photography and 3D Visualization in Breast Reconstruction. Plast Reconstr Surg Glob Open 2021; 9:e3845. [PMID: 34646718 PMCID: PMC8500585 DOI: 10.1097/gox.0000000000003845]
Abstract
Although pre- and postoperative three-dimensional (3D) photography is well established in breast reconstruction, intraoperative 3D photography is not. We demonstrate the process of intraoperative acquisition and visualization of 3D photographs for breast reconstruction and present clinicians' opinions about intraoperative visualization tools. METHODS Mastectomy specimens were scanned with a handheld 3D scanner during breast surgery. The 3D photographs were processed to compute morphological measurements of the specimen. Three visualization modalities (screen-based viewing, augmented reality viewing, and 3D-printed models) were created to show different representations of the 3D photographs to plastic surgeons. We interviewed seven surgeons about the usefulness of the visualization methods. RESULTS The average time for intraoperative acquisition of 3D photographs of the mastectomy specimen was 4 minutes, 8 seconds ± 44 seconds. The average time for image processing to compute morphological measurements of the specimen was 54.26 ± 40.39 seconds. All of the interviewed surgeons would be more inclined to use intraoperative visualization if it displayed information that they are currently missing (eg, the target shape of the reconstructed breast mound). Additionally, the surgeons preferred high-fidelity visualization tools (such as 3D printing) that are easy to use and minimally disrupt their current workflow. CONCLUSIONS This study demonstrates that 3D photographs can be collected intraoperatively within acceptable time limits and that quantitative measurements can be computed quickly enough to be utilized within the same procedure. We also report surgeons' comments on the usability of the visualization methods and the measurements of the mastectomy specimen, which can guide future surgical practice.
Affiliation(s)
- Krista M Nicklaus
- Department of Biomedical Engineering, The University of Texas at Austin, Austin, Tex
- Department of Plastic Surgery, The University of Texas MD Anderson Cancer Center, Houston, Tex
- Haoqi Wang
- Department of Biomedical Engineering, The University of Texas at Austin, Austin, Tex
- Department of Plastic Surgery, The University of Texas MD Anderson Cancer Center, Houston, Tex
- Mary Catherine Bordes
- Department of Plastic Surgery, The University of Texas MD Anderson Cancer Center, Houston, Tex
- Alex Zaharan
- Department of Electrical and Computer Engineering, University of Pittsburgh, Pittsburgh, Pa
- Audrey L Cheong
- Department of Electrical and Computer Engineering, University of Houston, Houston, Tex
- Gregory P Reece
- Department of Plastic Surgery, The University of Texas MD Anderson Cancer Center, Houston, Tex
- Summer E Hanson
- Section of Plastic and Reconstructive Surgery, University of Chicago Medicine and Biological Sciences, Chicago, Ill
- Fatima A Merchant
- Department of Biomedical Engineering, The University of Texas at Austin, Austin, Tex
- Department of Computer Science, University of Houston, Houston, Tex
- Department of Electrical and Computer Engineering, University of Houston, Houston, Tex
- Department of Engineering Technology, University of Houston, Houston, Tex
- Mia K Markey
- Department of Biomedical Engineering, The University of Texas at Austin, Austin, Tex
- Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Tex

40
Colaguori F, Marin-Mera M, McDonnell M, Martínez J, Valero-Moreno F, Damon A, Domingo RA, Clifton W, Fox WC, Chaichana K, Middlebrooks EH, Sabsevitz D, Forry R, Quiñones-Hinojosa A. Three-Dimensionally Printed Surgical Simulation Tool for Brain Mapping Training and Preoperative Planning. Oper Neurosurg (Hagerstown) 2021; 21:523-532. [PMID: 34561704 PMCID: PMC8637789 DOI: 10.1093/ons/opab331]
Abstract
BACKGROUND Brain mapping is the most reliable intraoperative tool for identifying surrounding functional cortical and subcortical brain parenchyma. Brain mapping procedures are nuanced and require a multidisciplinary team and a well-trained neurosurgeon. Current training methodology involves real-time observation and operation, without widely available surgical simulation. OBJECTIVE To develop a patient-specific, anatomically accurate, and electrically responsive biomimetic 3D-printed model for simulating brain mapping. METHODS Imaging data were converted into a 2-piece inverse 3D-rendered polyvinyl acetate shell forming an anatomically accurate brain mold. Functional and diffusion tensor imaging data were used to guide wire placement to approximate the projection fibers from the arm and leg areas in the motor homunculus. Electrical parameters were generated, and data were collected and processed to differentiate between the 2 tracts. For validation, the relationship between the electrical signal and the distance between the probe and the tract was quantified. Neurosurgeons and trainees were interviewed to assess the validity of the model. RESULTS Material testing of the brain component showed an elastic modulus of 55 kPa (compared with 140 kPa for cadaveric brain), closely resembling the tactile feedback of a live brain. The simulator's electrical properties approximated those of a live brain, with a voltage-to-distance correlation coefficient of r² = 0.86. Following 32 neurosurgeon interviews, ∼96% considered the model useful for training. CONCLUSION The realistic neural properties of the simulator greatly improve representation of a live surgical environment. This proof-of-concept model can be further developed to include more complicated tractography, blood and cerebrospinal fluid circulation, and more in-depth feedback mechanisms.
Affiliation(s)
- Aaron Damon
- Department of Neurologic Surgery, Mayo Clinic, Jacksonville, Florida, USA
- Ricardo A Domingo
- Department of Neurologic Surgery, Mayo Clinic, Jacksonville, Florida, USA
- William Clifton
- Department of Neurologic Surgery, Mayo Clinic, Jacksonville, Florida, USA
- W Christopher Fox
- Department of Neurologic Surgery, Mayo Clinic, Jacksonville, Florida, USA
- Kaisorn Chaichana
- Department of Neurologic Surgery, Mayo Clinic, Jacksonville, Florida, USA
- David Sabsevitz
- Department of Psychiatry and Psychology, Mayo Clinic, Jacksonville, Florida, USA
- Rebecca Forry
- Wallace H. Coulter Department of Biomedical Engineering, Georgia Institute of Technology, Atlanta, Georgia, USA
- Alfredo Quiñones-Hinojosa
- Correspondence: Alfredo Quiñones-Hinojosa, MD, Brain Tumor Stem Cell Laboratory, Department of Neurologic Surgery, Mayo Clinic, Florida, 4500 San Pablo Rd. S, Jacksonville, FL 32224, USA. Twitter: @DoctorQMd

41
Yoon JW, Spadola M, Blue R, Saylany A, Sharma N, Ahmad HS, Buch V, Madhavan K, Chen HI, Steinmetz MP, Welch WC, Malhotra NR. Do-It-Yourself Augmented Reality Heads-Up Display (DIY AR-HUD): A Technical Note. Int J Spine Surg 2021; 15:826-833. [PMID: 34266938 DOI: 10.14444/8106]
Abstract
BACKGROUND We present a "Do-It-Yourself" method to build an affordable augmented reality heads-up display system (AR-HUD) capable of displaying intraoperative images. All components are commercially available products, which surgeons may use in their own practice for educational and research purposes. METHODS Moverio BT-35E smart glasses were connected to operating room imaging modalities (ie, fluoroscopy and 3D navigation platforms) via a high-definition multimedia interface (HDMI) converter, allowing for continuous high-definition video transmission. The addition of an HDMI transmitter-receiver makes the AR-HUD system wireless. RESULTS We used our AR-HUD system in 3 patients undergoing instrumented spinal fusion. AR-HUD projected fluoroscopy images onto the surgical field, eliminating shifts in surgeon focus and interruptions to the procedure, with only a 40- to 100-ms transmission delay, which was not clinically impactful. CONCLUSIONS An affordable AR-HUD capable of displaying real-time information in the surgeon's view can be easily designed, built, and tested in surgical practice. As wearable heads-up display technology continues to evolve rapidly, the individual components presented here may be substituted to improve functionality and usability. Surgeons are in a unique position to conduct clinical testing in the operating room environment to optimize the augmented reality system for surgical use.
Affiliation(s)
- Jang W Yoon
- Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania
- Michael Spadola
- Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania
- Rachel Blue
- Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania
- Anissa Saylany
- Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania
- Nikhil Sharma
- Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania
- Hasan S Ahmad
- Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania
- Vivek Buch
- Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania
- H Isaac Chen
- Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania
- Michael P Steinmetz
- Department of Neurosurgery, Cleveland Clinic Lerner College of Medicine, Cleveland, Ohio
- William C Welch
- Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania
- Neil R Malhotra
- Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania

42
Godzik J, Farber SH, Urakov T, Steinberger J, Knipscher LJ, Ehredt RB, Tumialán LM, Uribe JS. "Disruptive Technology" in Spine Surgery and Education: Virtual and Augmented Reality. Oper Neurosurg (Hagerstown) 2021; 21:S85-S93. [PMID: 34128065 DOI: 10.1093/ons/opab114]
Abstract
BACKGROUND Technological advancements are the drivers of modern-day spine care. With the growing pressure to deliver faster and better care, surgical-assist technology is needed to harness computing power and enable the surgeon to improve outcomes. Virtual reality (VR) and augmented reality (AR) represent the pinnacle of emerging technology, not only to deliver higher quality education through simulated care, but also to provide valuable intraoperative information to assist in more efficient and more precise surgeries. OBJECTIVE To describe how the disruptive technologies of VR and AR interface in spine surgery and education. METHODS We review the relevance of VR and AR technologies in spine care, and describe the feasibility and limitations of the technologies. RESULTS We discuss potential future applications, and provide a case study demonstrating the feasibility of a VR program for neurosurgical spine education. CONCLUSION Initial experiences with VR and AR technologies demonstrate their applicability and ease of implementation. However, further prospective studies through multi-institutional and industry-academic partnerships are necessary to solidify the future of VR and AR in spine surgery education and clinical practice.
Affiliation(s)
- Jakub Godzik
- Department of Neurosurgery, Barrow Neurological Institute, St. Joseph's Hospital and Medical Center, Phoenix, Arizona, USA
- S Harrison Farber
- Department of Neurosurgery, Barrow Neurological Institute, St. Joseph's Hospital and Medical Center, Phoenix, Arizona, USA
- Timur Urakov
- Department of Neurosurgery, University of Miami, Miami, Florida, USA
- Jeremy Steinberger
- Department of Neurosurgery, Mount Sinai Health System, New York, New York, USA
- Liza J Knipscher
- Neuroscience Publications, Barrow Neurological Institute, St. Joseph's Hospital and Medical Center, Phoenix, Arizona, USA
- Ryan B Ehredt
- Neuroscience Publications, Barrow Neurological Institute, St. Joseph's Hospital and Medical Center, Phoenix, Arizona, USA
- Luis M Tumialán
- Department of Neurosurgery, Barrow Neurological Institute, St. Joseph's Hospital and Medical Center, Phoenix, Arizona, USA
- Juan S Uribe
- Department of Neurosurgery, Barrow Neurological Institute, St. Joseph's Hospital and Medical Center, Phoenix, Arizona, USA

43
Iqbal H, Tatti F, Rodriguez Y Baena F. Augmented reality in robotic assisted orthopaedic surgery: A pilot study. J Biomed Inform 2021; 120:103841. [PMID: 34146717 DOI: 10.1016/j.jbi.2021.103841]
Abstract
BACKGROUND The research and development of augmented reality (AR) technologies in surgical applications has seen an evolution of the traditional user interfaces (UIs) utilised by clinicians when conducting robot-assisted orthopaedic surgeries. The typical UI for such systems relies on surgeons managing 3D medical imaging data in the 2D space of a touchscreen monitor located away from the operating site. Conversely, AR can provide a composite view overlaying the real surgical scene with co-located virtual holographic representations of medical data, leading to a more immersive and intuitive operator experience. MATERIALS AND METHODS This work explores the integration of AR within an orthopaedic setting by capturing and replicating the UI of an existing surgical robot within an AR head-mounted display worn by the clinician. The resulting mixed-reality workflow enabled users to simultaneously view the operating site and real-time holographic operating informatics while carrying out a robot-assisted patellofemoral arthroplasty (PFA). Ten surgeons were recruited to test the impact of the AR system on procedure completion time and operating surface roughness. RESULTS AND DISCUSSION The integration of AR did not appear to require subjects to significantly alter their surgical techniques, as demonstrated by non-significant changes in the study's clinical metrics: a mean increase in operating time (+0.778 s, p = 0.488) and a change in mean surface roughness (p = 0.274), neither statistically significant. Additionally, a post-operative survey indicated a positive consensus on the usability of the AR system without incurring noticeable physical distress such as eyestrain or fatigue.
CONCLUSIONS Overall, these study results demonstrated a successful integration of AR technologies within the framework of an existing robot-assisted surgical platform with no significant negative effects in two quantitative metrics of surgical performance, and a positive outcome relating to user-centric and ergonomic evaluation criteria.
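The headline numbers in this abstract (+0.778 s, p = 0.488; p = 0.274) come from paired comparisons across the ten recruited surgeons. As a sketch of the underlying arithmetic only, with invented timing data rather than the study's measurements, a paired t-statistic can be computed as:

```python
from math import sqrt

def paired_t_statistic(before, after):
    """Paired t-statistic for a within-subject comparison, e.g. procedure
    time for the same surgeon without vs. with the AR head-mounted display."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd_d = sqrt(sum((x - mean_d) ** 2 for x in diffs) / (n - 1))
    return mean_d / (sd_d / sqrt(n)), n - 1  # (t statistic, degrees of freedom)

# Hypothetical completion times in seconds for ten surgeons (not study data)
no_ar = [612.0, 598.5, 640.2, 575.8, 630.1, 602.3, 588.9, 615.4, 599.7, 621.0]
with_ar = [612.5, 600.1, 639.8, 577.0, 631.3, 602.0, 590.2, 616.1, 600.5, 621.9]
t_stat, dof = paired_t_statistic(no_ar, with_ar)
```

The t-statistic and its degrees of freedom are then referred to the t-distribution (e.g. via a statistics package) to obtain a p-value of the kind the study reports.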
Affiliation(s)
- Hisham Iqbal, Mechatronics in Medicine Laboratory, Imperial College London, London, UK
- Fabio Tatti, Mechatronics in Medicine Laboratory, Imperial College London, London, UK
44
Ha J, Parekh P, Gamble D, Masters J, Jun P, Hester T, Daniels T, Halai M. Opportunities and challenges of using augmented reality and heads-up display in orthopaedic surgery: A narrative review. J Clin Orthop Trauma 2021; 18:209-215. [PMID: 34026489 PMCID: PMC8131920 DOI: 10.1016/j.jcot.2021.04.031] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/08/2021] [Revised: 03/28/2021] [Accepted: 04/29/2021] [Indexed: 12/16/2022] Open
Abstract
BACKGROUND & AIM The use of augmented reality (AR) and heads-up displays (HUD) to aid orthopaedic surgery has the potential to benefit surgeons and patients alike through improved accuracy, safety, and educational value. With the COVID-19 pandemic, the case for adopting novel technology is all the more relevant. The aims are to assess the technology available, to understand the current evidence regarding its benefit, and to consider challenges to implementation in clinical practice. METHODS & RESULTS PRISMA guidelines were used to filter the literature. Of 1004 articles returned, the following exclusion criteria were applied: 1) reviews/commentaries; 2) unrelated to orthopaedic surgery; 3) use of AR wearables other than visual aids; leaving 42 papers for review. This review illustrates benefits including enhanced accuracy, reduced operating time, reduced radiation exposure, and educational value. CONCLUSION Whilst there are obstacles to overcome, there are already reports of the technology in use. As with all novel technologies, a greater understanding of the learning curve is crucial, as is shielding patients from that learning curve. Improving usability and implementing surgeons' specific needs should increase uptake.
Affiliation(s)
- Joon Ha, Queen Elizabeth Hospital, London, UK (corresponding author)
- James Masters, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), UK
- Peter Jun, University of Alberta, Edmonton, Canada
- Mansur Halai, St Michael's Hospital, University of Toronto, Canada
45
Lareyre F, Chaudhuri A, Adam C, Carrier M, Mialhe C, Raffort J. Applications of Head-Mounted Displays and Smart Glasses in Vascular Surgery. Ann Vasc Surg 2021; 75:497-512. [PMID: 33823254 DOI: 10.1016/j.avsg.2021.02.033] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/26/2021] [Revised: 02/22/2021] [Accepted: 02/25/2021] [Indexed: 12/11/2022]
Abstract
OBJECTIVES Advances in virtual, augmented and mixed reality have led to the development of wearable technologies including head-mounted displays (HMD) and smart glasses. While there is growing interest in their potential applications in health, only a few studies have so far addressed their use in vascular surgery. The aim of this review was to summarize the fundamental notions associated with these technologies and to discuss potential applications and current limits of their use in vascular surgery. METHODS A comprehensive literature review was performed to introduce the fundamental concepts and provide an overview of applications of HMD and smart glasses in surgery. RESULTS HMD and smart glasses demonstrated potential value for the education of surgeons, including anatomical teaching, surgical training, and telementoring. Applications for pre-surgical planning have been developed in general and cardiac surgery and could be transposed for use in vascular surgery. The use of wearable technologies in the operating room has also been investigated in both general and cardiovascular surgery, demonstrating potential value for image-guided surgery and data collection. CONCLUSION Studies performed so far represent a proof of concept of the value of HMD and smart glasses in vascular surgery, both for the education of surgeons and for surgical practice. Although these technologies have shown encouraging results for applications in vascular surgery, technical improvements and further clinical research in large series are required before they can be used in daily clinical practice.
Affiliation(s)
- Fabien Lareyre, Department of Vascular Surgery, Hospital of Antibes-Juan-les-Pins, France; Université Côte d'Azur, CHU, Inserm U1065, C3M, Nice, France
- Arindam Chaudhuri, Bedfordshire-Milton Keynes Vascular Centre, Bedfordshire Hospitals NHS Foundation Trust, Bedford, UK
- Cédric Adam, Laboratory of Applied Mathematics and Computer Science (MICS), CentraleSupélec, Université Paris-Saclay, France
- Marion Carrier, Laboratory of Applied Mathematics and Computer Science (MICS), CentraleSupélec, Université Paris-Saclay, France
- Claude Mialhe, Cardiovascular Surgery Unit, Cardio Thoracic Centre of Monaco, Monaco
- Juliette Raffort, Université Côte d'Azur, CHU, Inserm U1065, C3M, Nice, France; Clinical Chemistry Laboratory, University Hospital of Nice, France
46
Guidance for Acupuncture Robot with Potentially Utilizing Medical Robotic Technologies. EVIDENCE-BASED COMPLEMENTARY AND ALTERNATIVE MEDICINE 2021; 2021:8883598. [PMID: 33859714 PMCID: PMC8026281 DOI: 10.1155/2021/8883598] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 09/18/2020] [Revised: 03/20/2021] [Accepted: 03/23/2021] [Indexed: 11/18/2022]
Abstract
Acupuncture is gaining increasing attention and recognition all over the world. However, it demands considerable physical labor from acupuncturists, so it is natural to turn to a robot that can improve both the accuracy and the efficacy of therapy. Several teams have independently developed acupuncture robots or related technologies; some have reached the clinical-trial stage and achieved commercial success. A clinically practical acupuncture robot is within reach through the combination of existing mature medical robotic technologies. This review proposes a hand-eye-brain coordination framework to integrate the candidate technologies, including force feedback, binocular vision, and automatic prescription. Acupuncture prescription with artificial intelligence, together with future development trends, should be taken into account to make feasible choices in the development of modern acupuncture.
47
Scherl C, Stratemeier J, Rotter N, Hesser J, Schönberg SO, Servais JJ, Männle D, Lammert A. Augmented Reality with HoloLens® in Parotid Tumor Surgery: A Prospective Feasibility Study. ORL J Otorhinolaryngol Relat Spec 2021; 83:439-448. [PMID: 33784686 DOI: 10.1159/000514640] [Citation(s) in RCA: 13] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/13/2020] [Accepted: 01/02/2021] [Indexed: 11/19/2022]
Abstract
INTRODUCTION Augmented reality can improve the planning and execution of surgical procedures. Head-mounted devices such as the HoloLens® (Microsoft, Redmond, WA, USA) are particularly suitable for these aims because they are controlled by hand gestures, enabling contactless handling in a sterile environment. OBJECTIVES So far, these systems have not found their way into the operating room for surgery of the parotid gland. This study explored the feasibility and accuracy of augmented reality-assisted parotid surgery. METHODS 2D MRI holographic images were created, and 3D holograms were reconstructed from MRI DICOM files and displayed via the HoloLens. Using hand gestures alone, 2D MRI slices were scrolled through, 3D images were rotated, and 3D structures were shown and hidden. The 3D model and the patient were aligned manually. RESULTS The use of augmented reality with the HoloLens in parotid surgery was feasible. Gestures were recognized correctly. The mean accuracy of superimposition of the holographic model onto the patient's anatomy was 1.3 cm. Highly significant differences in registration position error were seen between central and peripheral structures (p = 0.0059), with the smallest deviation centrally (10.9 mm) and the largest peripherally (19.6 mm). CONCLUSION This pilot study offers a first proof of concept of the clinical feasibility of the HoloLens for parotid tumor surgery. Workflow is not affected, and additional information is provided. Surgical performance could become safer through the navigation-like application of reality-fused 3D holograms, which also improves ergonomics without compromising sterility. Superimposition of the 3D holograms onto the surgical field was possible, but further refinement is necessary to improve accuracy.
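This study aligned hologram and patient manually; the automated alternative in image-guided surgery is landmark-based rigid registration. A minimal sketch (not this paper's method) of the standard Kabsch/Procrustes solution, together with the RMS error metric of the kind behind deviation figures such as the 10.9 to 19.6 mm reported above:

```python
import numpy as np

def rigid_register(source, target):
    """Least-squares rotation R and translation t mapping source -> target
    (Kabsch algorithm). source and target are matched (N, 3) point arrays."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Sign correction keeps R a proper rotation (det = +1), not a reflection.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = tgt_c - R @ src_c
    return R, t

def registration_error(source, target, R, t):
    """Root-mean-square distance between mapped source points and target."""
    mapped = source @ R.T + t
    return float(np.sqrt(np.mean(np.sum((mapped - target) ** 2, axis=1))))
```

On noiseless landmarks the recovered transform is exact; with real anatomical landmarks the residual RMS is the registration error surgeons see as overlay deviation.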
Affiliation(s)
- Claudia Scherl, Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Johanna Stratemeier, Institute of Experimental Radiation Oncology, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Nicole Rotter, Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Jürgen Hesser, Institute of Experimental Radiation Oncology, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Stefan O Schönberg, Department of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Jérôme J Servais, Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- David Männle, Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Anne Lammert, Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
48
Evaluation of Three-Dimensional Heads up Ophthalmic Surgery Demonstration From the Perspective of Surgeons and Postgraduate Trainees. J Craniofac Surg 2021; 32:2285-2291. [PMID: 33770023 DOI: 10.1097/scs.0000000000007645] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022] Open
Abstract
BACKGROUND To evaluate the satisfaction of surgeons and trainees with three-dimensional (3D) ophthalmic surgery during a demonstration, compared with traditional surgery. METHODS This validated questionnaire-based study was conducted over 1 month, during which Ngenuity 3D surgery was demonstrated. All surgeons and trainees exposed were recruited to complete a questionnaire covering visualization, physical comfort, ease of use, teaching and learning, and overall satisfaction. RESULTS All 7 surgeons and 33 postgraduate students responded. Surgeons reported no significant difference except overall (P = 0.047, paired t-test). Postgraduate trainees reported a significantly better experience with 3D for illumination (P = 0.008), manoeuvrability (P = 0.01), glare (P = 0.037), eye strain (P = 0.008), neck and upper back strain (P < 0.001), lower back pain (P = 0.019), communication (P = 0.002), comfortable environment (P = 0.001), sharing of knowledge (P < 0.001), and overall (P = 0.009). CONCLUSIONS During this early experience, surgeons and trainees reported better overall satisfaction with 3D. Trainees had better satisfaction with 3D in various subcomponents of visualization, physical comfort, ease of use, and education.
49
Marzullo A, Moccia S, Calimeri F, De Momi E. AIM in Endoscopy Procedures. Artif Intell Med 2021. [DOI: 10.1007/978-3-030-58080-3_164-1] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/07/2022]
50
Yuk FJ, Maragkos GA, Sato K, Steinberger J. Current innovation in virtual and augmented reality in spine surgery. ANNALS OF TRANSLATIONAL MEDICINE 2021; 9:94. [PMID: 33553387 PMCID: PMC7859743 DOI: 10.21037/atm-20-1132] [Citation(s) in RCA: 23] [Impact Index Per Article: 7.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Subscribe] [Scholar Register] [Indexed: 12/27/2022]
Abstract
In spinal surgery, outcomes are directly related to patient and procedure selection, as well as to the accuracy and precision of the instrumentation placed. Poorly placed instrumentation can lead to spinal cord, nerve root or vascular injury. Traditionally, spine surgery was performed by open methods, with instrumentation placed under direct visualization. However, minimally invasive surgery (MIS) has seen substantial advances in the spine, with an ever-increasing range of indications and procedures. For these reasons, novel methods to visualize anatomy and precisely guide surgery, such as intraoperative navigation, are extremely useful in this field. In this review, we present recent advances and innovations utilizing simulation methods in spine surgery. The application of these techniques is still relatively new, but they are quickly being integrated in and outside the operating room. They include virtual reality (VR) (where the entire simulation is virtual), mixed reality (MR) (a combination of virtual and physical components), and augmented reality (AR) (the superimposition of a virtual component onto physical reality). VR and MR have primarily found applications in a teaching and preparatory role, while AR is mainly applied in hands-on surgical settings. The present review attempts to provide an overview of the latest advances and applications of these methods in the neurosurgical spine setting.
Affiliation(s)
- Frank J Yuk, Department of Neurosurgery, Mount Sinai Hospital, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Georgios A Maragkos, Department of Neurosurgery, Mount Sinai Hospital, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Kosuke Sato, Hospital for Special Surgery, New York, NY, USA
- Jeremy Steinberger, Department of Neurosurgery, Mount Sinai Hospital, Icahn School of Medicine at Mount Sinai, New York, NY, USA