1. Asadi Z, Asadi M, Kazemipour N, Léger É, Kersten-Oertel M. A decade of progress: bringing mixed reality image-guided surgery systems in the operating room. Comput Assist Surg (Abingdon) 2024; 29:2355897. [PMID: 38794834] [DOI: 10.1080/24699322.2024.2355897]
Abstract
Advancements in mixed reality (MR) have led to innovative approaches in image-guided surgery (IGS). In this paper, we provide a comprehensive analysis of the current state of MR in image-guided procedures across various surgical domains. Using the Data Visualization View (DVV) Taxonomy, we analyze the progress made since a 2013 literature review paper on MR IGS systems. In addition to examining the current surgical domains using MR systems, we explore trends in types of MR hardware used, type of data visualized, visualizations of virtual elements, and interaction methods in use. Our analysis also covers the metrics used to evaluate these systems in the operating room (OR), both qualitative and quantitative assessments, and clinical studies that have demonstrated the potential of MR technologies to enhance surgical workflows and outcomes. We also address current challenges and future directions that would further establish the use of MR in IGS.
Affiliation(s)
- Zahra Asadi
- Department of Computer Science and Software Engineering, Concordia University, Montréal, Canada
- Mehrdad Asadi
- Department of Computer Science and Software Engineering, Concordia University, Montréal, Canada
- Negar Kazemipour
- Department of Computer Science and Software Engineering, Concordia University, Montréal, Canada
- Étienne Léger
- Montréal Neurological Institute & Hospital (MNI/H), Montréal, Canada
- McGill University, Montréal, Canada
- Marta Kersten-Oertel
- Department of Computer Science and Software Engineering, Concordia University, Montréal, Canada
2. Bracale U, Iacone B, Tedesco A, Gargiulo A, Di Nuzzo MM, Sannino D, Tramontano S, Corcione F. The use of mixed reality in the preoperative planning of colorectal surgery: Preliminary experience with a narrative review. Cir Esp 2024; 102 Suppl 1:S36-S44. [PMID: 38307256] [DOI: 10.1016/j.cireng.2024.01.006]
Abstract
New advanced technologies have recently been developed and preliminarily applied to surgery, including virtual reality (VR), augmented reality (AR), and mixed reality (MR). We retrospectively reviewed all colorectal cases in which we used holographic 3D reconstruction from February 2020 to December 2022. This innovative approach was used to identify vascular anomalies, pinpoint tumor locations, evaluate infiltration into neighboring organs, and devise surgical plans, as well as to train and educate trainee assistants. We also provide a state-of-the-art analysis briefly summarizing the scientific literature to date. VR facilitates training and anatomical assessments, while AR enhances training and laparoscopic performance evaluations. MR, powered by HoloLens, enriches anatomic recognition, navigation, and visualization. Successful implementation was observed in 10 colorectal cancer cases, showcasing the effectiveness of MR in improving preoperative planning and its intraoperative application. This technology holds significant promise for advancing colorectal surgery by raising safety and reliability standards.
Affiliation(s)
- Umberto Bracale
- Department of Medicine, Surgery and Dentistry, University of Salerno, 84084 Salerno, Italy
- Biancamaria Iacone
- Department of Public Health, University of Naples Federico II, 80131 Naples, Italy
- Anna Tedesco
- Department of Public Health, University of Naples Federico II, 80131 Naples, Italy
- Antonio Gargiulo
- Department of Public Health, University of Naples Federico II, 80131 Naples, Italy
- Daniele Sannino
- Department of Public Health, University of Naples Federico II, 80131 Naples, Italy
- Salvatore Tramontano
- Department of Medicine, Surgery and Dentistry, University of Salerno, 84084 Salerno, Italy
- Francesco Corcione
- Department of Public Health, University of Naples Federico II, 80131 Naples, Italy
3. Wise PA, Preukschas AA, Özmen E, Bellemann N, Norajitra T, Sommer CM, Stock C, Mehrabi A, Müller-Stich BP, Kenngott HG, Nickel F. Intraoperative liver deformation and organ motion caused by ventilation, laparotomy, and pneumoperitoneum in a porcine model for image-guided liver surgery. Surg Endosc 2024; 38:1379-1389. [PMID: 38148403] [PMCID: PMC10881715] [DOI: 10.1007/s00464-023-10612-x]
Abstract
BACKGROUND Image guidance promises to make complex situations in liver interventions safer. Clinical success is limited by intraoperative organ motion due to ventilation and surgical manipulation. The aim was to assess the influence of different ventilatory and operative states on liver motion in an experimental model. METHODS Liver motion due to ventilation (expiration, middle, and full inspiration) and operative state (native, laparotomy, and pneumoperitoneum) was assessed in a live porcine model (n = 10). Computed tomography (CT) scans were taken for each pig for each possible combination of factors. After image segmentation, liver motion was measured as the vectors between predefined landmarks along the hepatic vein tree across CT scans. RESULTS Liver position changed significantly with ventilation. Peripheral regions of the liver showed significantly greater motion (maximal Euclidean motion 17.9 ± 2.7 mm) than central regions (maximal Euclidean motion 12.6 ± 2.1 mm, p < 0.001) across all operative states. The total average motion measured 11.6 ± 0.7 mm (p < 0.001). Between the operative states, the position of the liver changed most from the native state to pneumoperitoneum (14.6 ± 0.9 mm, p < 0.001); from the native state to laparotomy, by comparison, the displacement averaged 9.8 ± 1.2 mm (p < 0.001). With pneumoperitoneum, breath-dependent liver motion was significantly reduced compared with the other operative states: liver motion due to ventilation was 7.7 ± 0.6 mm during pneumoperitoneum, 13.9 ± 1.1 mm with laparotomy, and 13.5 ± 1.4 mm in the native state (p < 0.001 in all cases). CONCLUSIONS Ventilation and application of pneumoperitoneum caused significant changes in liver position. Liver motion was reduced but clearly measurable during pneumoperitoneum. Intraoperative guidance/navigation systems should therefore account for ventilation and intraoperative changes of liver position and peripheral deformation.
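The landmark-based motion metric described in this abstract (Euclidean vectors between corresponding hepatic-vein landmarks segmented in two CT scans) can be sketched as follows. The function name and all coordinates are illustrative assumptions for this sketch, not data from the study:

```python
import numpy as np

def landmark_displacements(landmarks_a: np.ndarray, landmarks_b: np.ndarray) -> np.ndarray:
    """Per-landmark Euclidean motion between two scans.

    Both arrays have shape (n_landmarks, 3), expressed in the same
    patient coordinate system (e.g., millimetres in CT space).
    """
    return np.linalg.norm(landmarks_b - landmarks_a, axis=1)

# Corresponding landmarks in expiration vs. full inspiration (fabricated, mm)
expiration = np.array([[10.0, 20.0, 30.0], [40.0, 50.0, 60.0]])
inspiration = np.array([[10.0, 20.0, 42.0], [40.0, 55.0, 60.0]])

motion = landmark_displacements(expiration, inspiration)
print(motion)        # per-landmark motion: [12.  5.]
print(motion.mean()) # average motion per state, as reported in the study: 8.5
```

Averaging such per-landmark displacements separately for peripheral and central landmark subsets would yield the regional comparison the authors report.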
Affiliation(s)
- Philipp A Wise
- Department of General, Visceral and Transplantation Surgery, Heidelberg University, Im Neuenheimer Feld 420, 69120, Heidelberg, Germany
- Anas A Preukschas
- Department of General, Visceral and Transplantation Surgery, Heidelberg University, Im Neuenheimer Feld 420, 69120, Heidelberg, Germany
- Department of General, Visceral and Thoracic Surgery, University Medical Center Hamburg-Eppendorf, Martinistraße 52, 20246, Hamburg, Germany
- Emre Özmen
- Department of General, Visceral and Transplantation Surgery, Heidelberg University, Im Neuenheimer Feld 420, 69120, Heidelberg, Germany
- Nadine Bellemann
- Department of Diagnostic and Interventional Radiology, Heidelberg University, Im Neuenheimer Feld 420, 69120, Heidelberg, Germany
- Tobias Norajitra
- Division of Medical and Biological Informatics, German Cancer Research Center, Im Neuenheimer Feld 280, 69120, Heidelberg, Germany
- Christof M Sommer
- Department of Diagnostic and Interventional Radiology, Heidelberg University, Im Neuenheimer Feld 420, 69120, Heidelberg, Germany
- Christian Stock
- Institute for Medical Biometry and Informatics, Heidelberg University, Im Neuenheimer Feld 305, 69120, Heidelberg, Germany
- Arianeb Mehrabi
- Department of General, Visceral and Transplantation Surgery, Heidelberg University, Im Neuenheimer Feld 420, 69120, Heidelberg, Germany
- Beat P Müller-Stich
- Division of Abdominal Surgery, Clarunis-Academic Centre of Gastrointestinal Diseases, St. Clara and University Hospital of Basel, Petersgraben 4, 4051, Basel, Switzerland
- Hannes G Kenngott
- Department of General, Visceral and Transplantation Surgery, Heidelberg University, Im Neuenheimer Feld 420, 69120, Heidelberg, Germany
- Felix Nickel
- Department of General, Visceral and Transplantation Surgery, Heidelberg University, Im Neuenheimer Feld 420, 69120, Heidelberg, Germany
- Department of General, Visceral and Thoracic Surgery, University Medical Center Hamburg-Eppendorf, Martinistraße 52, 20246, Hamburg, Germany
4. Feodorovici P, Schnorr P, Bedetti B, Zalepugas D, Schmidt J, Arensmeyer JC. Collaborative Virtual Reality Real-Time 3D Image Editing for Chest Wall Resections and Reconstruction Planning. Innovations (Phila) 2023; 18:525-530. [PMID: 38073259] [DOI: 10.1177/15569845231217072]
Abstract
The integration of extended reality (XR) technologies into health care procedures presents transformative opportunities, particularly in surgical processes. This study delves into the utilization of virtual reality (VR) for preoperative planning related to chest wall resections in thoracic surgery. Leveraging the capabilities of 3-dimensional (3D) imaging, real-time visualization, and collaborative VR environments, surgeons gain enhanced anatomical insights and can develop predictive surgical strategies. Two clinical cases highlighted the effectiveness of this approach, showcasing the potential for personalized and intricate surgical planning. The setup provides an immersive, dynamic representation of real patient data, enabling collaboration among teams from separate locations. While VR offers enhanced interactive and visualization capabilities, preliminary evidence suggests it may support more refined preoperative strategies, potentially influence postoperative outcomes, and optimize resource management. However, its comparative advantage over traditional methods needs further empirical validation. Emphasizing the potential of XR, this exploration suggests its broad implications in thoracic surgery, especially when dealing with complex cases requiring multidisciplinary collaboration in the immersive virtual space, often referred to as the metaverse. This innovative approach necessitates further examination, marking a shift toward future surgical preparations. In this article, we sought to demonstrate the technique of an immersive real-time volume-rendered collaborative VR-planning tool using exemplary case studies in chest wall surgery.
Affiliation(s)
- Philipp Feodorovici
- Division of Thoracic Surgery, Department of General, Visceral, Thoracic and Vascular Surgery, University Hospital Bonn, Germany
- Philipp Schnorr
- Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, Germany
- Benedetta Bedetti
- Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, Germany
- Donatas Zalepugas
- Division of Thoracic Surgery, Department of General, Visceral, Thoracic and Vascular Surgery, University Hospital Bonn, Germany
- Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, Germany
- Joachim Schmidt
- Division of Thoracic Surgery, Department of General, Visceral, Thoracic and Vascular Surgery, University Hospital Bonn, Germany
- Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, Germany
- Jan C Arensmeyer
- Division of Thoracic Surgery, Department of General, Visceral, Thoracic and Vascular Surgery, University Hospital Bonn, Germany
5. Shi X, Guo H, Zhu C, Qiu G, Liang T, Lian J, Ma Y, Wang S, Li X. Mixed reality in primary retroperitoneal tumour surgery: Evaluation of preoperative and intraoperative application value. Int J Med Robot 2023:e2584. [PMID: 37792998] [DOI: 10.1002/rcs.2584]
Abstract
OBJECTIVE To evaluate the feasibility and application value of mixed reality (MR) technology in primary retroperitoneal tumour (PRT) surgery. METHODS From 276 patients who underwent PRT resection at the First Affiliated Hospital of Xi'an Jiaotong University, we screened 46 patients who underwent MR-assisted retroperitoneal tumour resection and 46 patients who underwent tumour resection without MR assistance. Intraoperative course and postoperative recovery were compared between the two groups, and the reliability and validity of the MR application were further examined using a Likert scale. RESULTS Mean intraoperative blood loss differed significantly between the two groups and was lower in the MR group. Likert-scale scores were higher in the MR group than in the non-MR group. CONCLUSIONS MR can be used to assist PRT resection and has great potential to improve the rate of complete retroperitoneal tumour resection.
Affiliation(s)
- Xiaoqiang Shi
- Department of General Surgery, The First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, Shaanxi, China
- Hainan Guo
- Department of General Surgery, The First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, Shaanxi, China
- Chao Zhu
- Department of General Surgery, The First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, Shaanxi, China
- Department of General Surgery, The People's Hospital of Suide County, Suide, Shaanxi, China
- Guanglin Qiu
- Department of General Surgery, The First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, Shaanxi, China
- Ting Liang
- Department of Radiology, The First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, Shaanxi, China
- Department of Biomedical Engineering, Key Laboratory of Biomedical Information Engineering of the Ministry of Education, School of Life Science and Technology, Xi'an Jiaotong University, Xi'an, Shaanxi, China
- Jie Lian
- Department of Pathology, The First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, Shaanxi, China
- Yanfei Ma
- Department of General Surgery, The First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, Shaanxi, China
- Second Department of General Surgery, The Suide Campus, The First Hospital of Yulin, Yulin, Shaanxi, China
- Shufeng Wang
- Department of General Surgery, The First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, Shaanxi, China
- Xuqi Li
- Department of General Surgery, The First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, Shaanxi, China
6. Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023; 85:102757. [PMID: 36706637] [DOI: 10.1016/j.media.2023.102757]
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main player in the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016 through 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy including use case, technical methodology for registration and tracking, data sources, visualization, and validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, where AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interaction, usability and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, yet only a few propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, to pave the way for novel, innovative directions and translation into routine medical practice.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
7. Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023; 68. [PMID: 36580681] [DOI: 10.1088/1361-6560/acaf23]
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these AR systems in different surgical fields. AR visualization falls into two categories, in situ visualization and non-in situ visualization, and the content rendered varies widely. The registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. The tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in surgical fields. However, most AR applications have been evaluated through model and animal experiments, with relatively few clinical trials, indicating that current AR navigation methods are still at an early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as future development trends. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
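Of the registration families listed in this abstract, point-based registration is the most compact to illustrate: given paired fiducials in image space and patient space, the optimal rigid transform follows from the classic Kabsch/SVD solution. This is a generic textbook sketch with fabricated fiducial coordinates, not code from the review:

```python
import numpy as np

def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (R, t) mapping src points onto dst.

    Kabsch/Umeyama solution via SVD; both arrays have shape (n, 3).
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Fiducials in image space and the same points digitized on the patient
# (fabricated: a 30° rotation about z plus a translation, no noise)
image_pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
patient_pts = image_pts @ R_true.T + np.array([5.0, -2.0, 1.0])

R, t = rigid_register(image_pts, patient_pts)
fre = np.linalg.norm(image_pts @ R.T + t - patient_pts, axis=1).mean()
print(fre)  # fiducial registration error, ~0 for noise-free points
```

Surface-, marker-, and calibration-based registration build on the same least-squares machinery but differ in how correspondences are obtained.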
Affiliation(s)
- Longfei Ma
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Tianqi Huang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Jie Wang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
8. Jiang J, Zhang J, Sun J, Wu D, Xu S. User's image perception improved strategy and application of augmented reality systems in smart medical care: A review. Int J Med Robot 2023; 19:e2497. [PMID: 36629798] [DOI: 10.1002/rcs.2497]
Abstract
BACKGROUND Augmented reality (AR) is a new human-computer interaction technology that combines virtual reality, computer vision, and computer networks. With the rapid advancement of the medical field towards intelligence and data visualisation, AR systems are becoming increasingly popular in medicine because they can provide doctors with sufficiently clear medical images and accurate image navigation in practical applications. However, different display types of AR systems affect doctors' perception of the image after virtual-real fusion in different ways during actual medical application. If doctors cannot correctly perceive the image, they may be unable to correctly match the virtual information to the real world, which significantly impairs their ability to recognise complex structures. METHODS This paper uses CiteSpace, a literature analysis tool, to visualise and analyse the research hotspots of AR systems used in the medical field. RESULTS A visual analysis of the 1163 articles retrieved from the Web of Science Core Collection database reveals that display technology and visualisation technology are currently the key research directions for AR systems. CONCLUSION This paper categorises AR systems by their display principles, reviews current image perception optimisation schemes for each type of system, and analyses and compares the display types based on their practical applications in smart medical care, so that doctors can select the appropriate display type for a given application scenario. Finally, the future development of AR display technology is anticipated so that AR can be applied more effectively in smart medical care.
The advancement of display technology for AR systems is critical for their use in the medical field, and the advantages and disadvantages of the various display types should be weighed in each application scenario to select the best AR system.
Affiliation(s)
- Jingang Jiang
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China; Robotics & Its Engineering Research Center, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Jiawei Zhang
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Jianpeng Sun
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Dianhao Wu
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Shuainan Xu
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
9. Zary N, Eysenbach G, Van Doormaal TPC, Ruurda JP, Van der Kaaij NP, De Heer LM. Mixed Reality in Modern Surgical and Interventional Practice: Narrative Review of the Literature. JMIR Serious Games 2023; 11:e41297. [PMID: 36607711] [PMCID: PMC9947976] [DOI: 10.2196/41297]
Abstract
BACKGROUND Mixed reality (MR) and its potential applications have gained increasing interest within the medical community in recent years. The ability to integrate virtual objects into a real-world environment within a single video-see-through display is a topic that sparks imagination. Given these characteristics, MR could facilitate preoperative and preinterventional planning, provide intraoperative and intrainterventional guidance, and aid in education and training, thereby improving the skills and merits of surgeons and residents alike. OBJECTIVE In this narrative review, we provide a broad overview of the different applications of MR within the entire spectrum of surgical and interventional practice and elucidate potential future directions. METHODS A targeted literature search within the PubMed, Embase, and Cochrane databases was performed regarding the application of MR within surgical and interventional practice. Studies were included if they met the criteria for technological readiness level 5 and, as such, had been validated in a relevant environment. RESULTS A total of 57 studies were included and divided into studies regarding preoperative and interventional planning, intraoperative and interventional guidance, and training and education. CONCLUSIONS The overall experience with MR is positive. The main benefits of MR seem to be related to improved efficiency; limitations primarily seem to be related to constraints of current head-mounted displays. Future directions should be aimed at improving head-mounted display technology, incorporating MR into surgical microscopes and robots, and designing trials to demonstrate superiority.
Affiliation(s)
- Tristan P C Van Doormaal
- University Medical Center Utrecht, Utrecht, Netherlands; University Hospital Zurich, Zurich, Switzerland
10. Shahbaz M, Miao H, Farhaj Z, Gong X, Weikai S, Dong W, Jun N, Shuwei L, Yu D. Mixed reality navigation training system for liver surgery based on a high-definition human cross-sectional anatomy data set. Cancer Med 2023; 12:7992-8004. [PMID: 36607128] [PMCID: PMC10134360] [DOI: 10.1002/cam4.5583]
Abstract
OBJECTIVES This study aims to use a three-dimensional (3D) mixed-reality model of the liver, encompassing its complex intrahepatic systems, to study the anatomical structures in depth and to support training, diagnosis, and treatment of liver diseases. METHODS Vascular-perfused human specimens were used for thin-layer frozen milling to obtain liver cross-sections. A 104-megapixel high-definition cross-sectional data set was established and registered to achieve structure identification and manual segmentation. The digital model was reconstructed, and the data were used to print a 3D hepatic model. The model was combined with HoloLens mixed reality technology to reflect the complex relationships of the intrahepatic systems. We simulated 3D patient-specific anatomy for identification and preoperative planning, conducted a questionnaire survey, and evaluated the results. RESULTS The 3D digital model and the 1:1 transparent, colored liver model faithfully reflected the intrahepatic vessels and their complex relationships. The reconstructed model imported into HoloLens could be accurately matched with the 3D model. Only 7.7% of participants could identify accessory hepatic veins, while 92% found the depth and spatial relationships of intrahepatic structures easier to understand. Of the participants, 100%, 84.6%, 69%, and 84% believed the 3D models were useful for planning, safer surgical paths, reducing intraoperative complications, and training young surgeons, respectively. CONCLUSIONS A detailed 3D model can be reconstructed from a high-quality cross-sectional anatomical data set. Combined with 3D printing and HoloLens technology, it yields a novel hybrid-reality navigation and training system for liver surgery. Based on the questionnaire evaluation, mixed reality training is a worthwhile way to provide 3D information to clinicians, with possible applications in surgery.
Surgeons with extensive experience in surgical operations indicated in the questionnaire that this technology might be useful in liver surgery and could help with precise preoperative planning, accurate intraoperative identification, and reduction of hepatic injury.
Affiliation(s)
- Muhammad Shahbaz
- Department of Radiology, Qilu Hospital of Shandong University, Jinan, Shandong, China
- Research Center for Sectional and Imaging Anatomy, Digital Human Institute, School of Basic Medical Science, Shandong University, Jinan, Shandong, China
- Department of General Surgery, Qilu Hospital of Shandong University, Jinan, Shandong, China
- Huachun Miao
- Department of Anatomy, Wannan Medical College, Wuhu, Anhui, China
- Zeeshan Farhaj
- Department of Cardiovascular Surgery, Shandong Qianfoshan Hospital, Cheeloo College of Medicine, Shandong University, Jinan, Shandong, China
- Xin Gong
- Department of Anatomy, Wannan Medical College, Wuhu, Anhui, China
- Sun Weikai
- Department of Radiology, Qilu Hospital of Shandong University, Jinan, Shandong, China
- Wenqing Dong
- Department of Anatomy, Wannan Medical College, Wuhu, Anhui, China
- Niu Jun
- Department of General Surgery, Qilu Hospital of Shandong University, Jinan, Shandong, China
- Liu Shuwei
- Research Center for Sectional and Imaging Anatomy, Digital Human Institute, School of Basic Medical Science, Shandong University, Jinan, Shandong, China
- Dexin Yu
- Department of Radiology, Qilu Hospital of Shandong University, Jinan, Shandong, China
11. Šlosar L, Voelcker-Rehage C, Paravlić AH, Abazovic E, de Bruin ED, Marusic U. Combining physical and virtual worlds for motor-cognitive training interventions: Position paper with guidelines on technology classification in movement-related research. Front Psychol 2022; 13:1009052. [PMID: 36591050] [PMCID: PMC9797127] [DOI: 10.3389/fpsyg.2022.1009052]
Abstract
Efficient movements require intact motor and cognitive function. There is a growing literature on motor-cognitive interventions to improve the overall quality of life of healthy or diseased older people. For such interventions, novel technological advances are crucial not only in terms of motivation but also to improve the user experience in a multi-stimuli world, usually offered as a mixture of real and virtual environments. This article provides a classification system for movement-related research dealing with motor-cognitive interventions performed in different extents of a virtual environment. The classification is divided into three categories: (a) type of digital device, with the associated degree of immersiveness provided; (b) presence or absence of a human-computer interaction; and (c) activity engagement during training, defined as activity >1.5 Metabolic Equivalents of Task (METs). Since the term virtual reality (VR) often groups different technologies under the same label, we propose a taxonomy of digital devices ranging from computer monitors and projectors to head-mounted VR technology. All immersive technologies that have developed rapidly in recent years are grouped under the umbrella term Extended Reality (XR). These include augmented reality (AR), mixed reality (MR), and VR, as well as technologies that have yet to be developed. This technology has potential not only for gaming and entertainment but also for research, motor-cognitive training programs, rehabilitation, telemedicine, and more. This position paper provides definitions, recommendations, and guidelines for future movement-related interventions based on digital devices, human-computer interactions, and physical engagement, so that terms are used more consistently, contributing to a clearer understanding of their implications.
Affiliation(s)
- Luka Šlosar
- Science and Research Centre Koper, Institute for Kinesiology Research, Koper, Slovenia
- Department of Health Sciences, Alma Mater Europaea – ECM, Maribor, Slovenia
| | - Claudia Voelcker-Rehage
- Neuromotor Behavior and Exercise, Institute of Sport and Exercise Sciences, University of Münster, Münster, Germany
| | - Armin H. Paravlić
- Science and Research Centre Koper, Institute for Kinesiology Research, Koper, Slovenia
- Faculty of Sport, University of Ljubljana, Ljubljana, Slovenia
- Faculty of Sports Studies, Masaryk University, Brno, Czechia
| | - Ensar Abazovic
- Faculty of Sport and Physical Education, University of Sarajevo, Sarajevo, Bosnia and Herzegovina
| | - Eling D. de Bruin
- Department of Neurobiology, Care Sciences and Society, Karolinska Institute, Stockholm, Sweden
- Department of Health, OST – Eastern Swiss University of Applied Sciences, St. Gallen, Switzerland
- Department of Health Sciences and Technology, Institute of Human Movement Sciences and Sport, ETH Zurich, Zurich, Switzerland
| | - Uros Marusic
- Science and Research Centre Koper, Institute for Kinesiology Research, Koper, Slovenia
- Department of Health Sciences, Alma Mater Europaea – ECM, Maribor, Slovenia
- *Correspondence: Uros Marusic
|
12
|
Ceccariglia F, Cercenelli L, Badiali G, Marcelli E, Tarsitano A. Application of Augmented Reality to Maxillary Resections: A Three-Dimensional Approach to Maxillofacial Oncologic Surgery. J Pers Med 2022; 12:jpm12122047. [PMID: 36556268 PMCID: PMC9785494 DOI: 10.3390/jpm12122047] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/07/2022] [Revised: 12/03/2022] [Accepted: 12/07/2022] [Indexed: 12/14/2022] Open
Abstract
In the relevant global context, although virtual reality, augmented reality, and mixed reality have been emerging methodologies for several years, only now have technological and scientific advances made them suitable for revolutionizing clinical care and medical settings through the provision of advanced features and improved healthcare services. Over the past fifteen years, tools and applications using augmented reality (AR) have been designed and tested in the context of various surgical and medical disciplines, including maxillofacial surgery. The purpose of this paper is to show how a marker-less AR guidance system using the Microsoft® HoloLens 2 can be applied in mandible and maxillary demolition surgery to guide maxillary osteotomies. We describe three mandibular and maxillary oncologic resections performed during 2021 using AR support. In these three patients, we applied a marker-less tracking method based on recognition of the patient's facial profile. The surgeon, using HoloLens 2 smart glasses, could see the virtual surgical planning superimposed on the patient's anatomy. We showed that performing osteotomies under AR guidance is feasible and viable, as demonstrated by comparison with osteotomies performed using CAD-CAM cutting guides. This technology has advantages and disadvantages. However, further research is needed to improve the stability and robustness of the marker-less tracking method applied to patient face recognition.
Affiliation(s)
- Francesco Ceccariglia
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Correspondence; Tel.: +39-051-2144197
| | - Laura Cercenelli
- eDimes Lab-Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
| | - Giovanni Badiali
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
| | - Emanuela Marcelli
- eDimes Lab-Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
| | - Achille Tarsitano
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
|
13
|
Selecting the Best Approach for the Treatment of Multiple Non-Metastatic Hepatocellular Carcinoma. Cancers (Basel) 2022; 14:cancers14235997. [PMID: 36497478 PMCID: PMC9737585 DOI: 10.3390/cancers14235997] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/14/2022] [Revised: 11/29/2022] [Accepted: 12/03/2022] [Indexed: 12/12/2022] Open
Abstract
According to the Barcelona Clinic Liver Cancer (BCLC) staging system, the optimal strategy for patients with multiple HCC within the Milan criteria is liver transplantation (LT). However, LT cannot be offered to all patients due to organ shortages and long waiting lists, as well as because of advanced disease carrying a high risk of poor outcomes. For early stages, liver resection (LR) or thermal ablation (TA) can be proposed, while trans-arterial chemoembolization (TACE) remains the treatment of choice for intermediate stages (BCLC-B). Asian guidelines and the National Comprehensive Cancer Network suggest LR for resectable multinodular HCCs, even beyond the Milan criteria. In this scenario, a growing body of evidence shows better outcomes after surgical resection compared with TACE. Trans-arterial radioembolization (TARE) and stereotactic body radiation therapy (SBRT) can also play an important role in this setting. Furthermore, the role of minimally invasive liver surgery (MILS) specifically for patients with multiple HCC is still not clear. This review aims to summarize current knowledge about the best therapeutic strategy for multiple HCC, focusing on the role of minimally invasive surgery and the most attractive future perspectives.
|
14
|
Innovation, disruptive Technologien und Transformation in der Gefäßchirurgie [Innovation, disruptive technologies and transformation in vascular surgery]. GEFÄSSCHIRURGIE 2022. [DOI: 10.1007/s00772-022-00943-9] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
|
15
|
The intraoperative use of augmented and mixed reality technology to improve surgical outcomes: A systematic review. Int J Med Robot 2022; 18:e2450. [DOI: 10.1002/rcs.2450] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/14/2022] [Revised: 07/23/2022] [Accepted: 07/27/2022] [Indexed: 11/07/2022]
|
16
|
Huber T, Huettl F, Hanke LI, Vradelis L, Heinrich S, Hansen C, Boedecker C, Lang H. Leberchirurgie 4.0 - OP-Planung, Volumetrie, Navigation und Virtuelle Realität [Liver surgery 4.0 - surgical planning, volumetry, navigation and virtual reality]. Zentralbl Chir 2022; 147:361-368. [DOI: 10.1055/a-1844-0549] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/07/2023]
Abstract
Summary: Optimized conservative treatment, improved imaging modalities, and advances in operative techniques have markedly changed both the operative spectrum and the criteria for resectability in liver surgery over recent decades. Thanks to numerous technical developments, in particular three-dimensional segmentation, preoperative planning and intraoperative orientation, above all in complex procedures, can nowadays be facilitated while taking the patient-specific anatomy into account. New technologies such as 3D printing and virtual and augmented reality offer additional ways of visualizing the individual anatomy. Various intraoperative navigation options are intended to make the preoperative planning available in the operating room and thus increase patient safety. This review article provides an overview of the current state of the available technologies as well as an outlook into the operating room of the future.
Affiliation(s)
- Tobias Huber
- Klinik für Allgemein-, Viszeral- und Transplantationschirurgie, Universitätsmedizin Mainz, Mainz, Germany
| | - Florentine Huettl
- Klinik für Allgemein-, Viszeral- und Transplantationschirurgie, Universitätsmedizin Mainz, Mainz, Germany
| | - Laura Isabel Hanke
- Klinik für Allgemein-, Viszeral- und Transplantationschirurgie, Universitätsmedizin Mainz, Mainz, Germany
| | - Lukas Vradelis
- Klinik für Allgemein-, Viszeral- und Transplantationschirurgie, Universitätsmedizin Mainz, Mainz, Germany
| | - Stefan Heinrich
- Klinik für Allgemein-, Viszeral- und Transplantationschirurgie, Universitätsmedizin Mainz, Mainz, Germany
| | - Christian Hansen
- Fakultät für Informatik, Otto von Guericke Universität Magdeburg, Magdeburg, Germany
| | - Christian Boedecker
- Klinik für Allgemein-, Viszeral- und Transplantationschirurgie, Universitätsmedizin Mainz, Mainz, Germany
| | - Hauke Lang
- Klinik für Allgemein-, Viszeral- und Transplantationschirurgie, Universitätsmedizin Mainz, Mainz, Germany
|
17
|
Jia T, Qiao B, Ren Y, Xing L, Ding B, Yuan F, Luo Q, Li H. Case Report: Application of Mixed Reality Combined With A Surgical Template for Precise Periapical Surgery. Front Surg 2022; 9:923299. [PMID: 36034400 PMCID: PMC9407037 DOI: 10.3389/fsurg.2022.923299] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/19/2022] [Accepted: 05/23/2022] [Indexed: 11/13/2022] Open
Abstract
Objective: The etiology of apical diseases is diverse, and most are due to incomplete root canal therapy. The common clinical manifestations include gingival abscess, fistula, and bone destruction. A current limitation of such procedures is that surgeons cannot visually evaluate the surgical area. We sought to combine mixed reality (MR) technology with a 3-dimensional (3D) printed surgical template to achieve visualization in apical surgery. Notably, no reports have described this application. Methods: We created visual 3D (V3D) files and transferred them into the HoloLens system. We explained the surgical therapy plan to the patient using a mixed reality head-mounted display (MR-HMD). Then, the 3D information was preliminarily matched with the operative area, and the optimal surgical approach was determined by combining this information with 3D surgical guide plate technology. Results: We developed a suitable surgical workflow and confirmed that the optimal surgical approach was from the buccal side. We completely exposed the apical lesion and removed the inflammatory granulation tissue. Conclusion: To our knowledge, we are the first group to use the MR technique in apical surgery. We integrated the MR technique with a 3D surgical template to successfully accomplish the surgery, and desirable outcomes were achieved with minimally invasive therapy.
Affiliation(s)
- Tingting Jia
- Department of Stomatology, The First Medical Centre, Chinese PLA General Hospital, Beijing, China
| | - Bo Qiao
- Department of Stomatology, The First Medical Centre, Chinese PLA General Hospital, Beijing, China
| | - Yipeng Ren
- Department of Stomatology, The First Medical Centre, Chinese PLA General Hospital, Beijing, China
| | - Lejun Xing
- Department of Stomatology, The First Medical Centre, Chinese PLA General Hospital, Beijing, China
| | - Baichen Ding
- Department of Stomatology, The First Medical Centre, Chinese PLA General Hospital, Beijing, China
| | - Fang Yuan
- Department of Oncology, The Fifth Medical Centre, Chinese PLA General Hospital, Beijing, China
- Correspondence: Hongbo Li, Qiang Luo, Fang Yuan
| | - Qiang Luo
- Department of Stomatology, The First Medical Centre, Chinese PLA General Hospital, Beijing, China
| | - Hongbo Li
- Department of Stomatology, The First Medical Centre, Chinese PLA General Hospital, Beijing, China
|
18
|
Islam MS, Lim S. Vibrotactile feedback in virtual motor learning: A systematic review. APPLIED ERGONOMICS 2022; 101:103694. [PMID: 35086007 DOI: 10.1016/j.apergo.2022.103694] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/12/2021] [Revised: 01/14/2022] [Accepted: 01/15/2022] [Indexed: 06/14/2023]
Abstract
Vibrotactile feedback can be effectively applied to motor (physical) learning in virtual environments, as it can provide task-intrinsic and augmented feedback to users, assisting them in enhancing their motor performance. This review investigates current uses of vibrotactile feedback systems in motor learning applications built upon virtual environments by systematically synthesizing 24 peer-reviewed studies. We aim to understand: (1) the current state of the science of using real-time vibrotactile feedback in virtual environments for aiding the acquisition (or improvement) of motor skills, (2) the effectiveness of using vibrotactile feedback in such applications, and (3) research gaps and opportunities in current technology. We used the Sensing-Analysis-Assessment-Intervention framework to assess the scientific literature in our review. The review identifies several research gaps in current studies, as well as potential design considerations that can improve vibrotactile feedback systems in virtual motor learning applications, including the selection and placement of feedback devices and feedback designs.
Affiliation(s)
- Md Shafiqul Islam
- Department of Industrial and Systems Engineering, Virginia Tech, Blacksburg, VA, 24061, USA
| | - Sol Lim
- Department of Industrial and Systems Engineering, Virginia Tech, Blacksburg, VA, 24061, USA.
|
19
|
Su S, Lei P, Wang C, Gao F, Zhong D, Hu Y. Mixed Reality Technology in Total Knee Arthroplasty: An Updated Review With a Preliminary Case Report. Front Surg 2022; 9:804029. [PMID: 35495740 PMCID: PMC9053587 DOI: 10.3389/fsurg.2022.804029] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2021] [Accepted: 03/16/2022] [Indexed: 11/13/2022] Open
Abstract
Background: Augmented reality and mixed reality have been used to help surgeons perform complex surgeries. With the development of technology, mixed reality (MR) technology has been used to improve the success rate of complex hip arthroplasty due to its unique advantages. At present, there are few reports on the application of MR technology in total knee arthroplasty. We present a case of total knee arthroplasty performed with the help of mixed reality technology. Case Presentation: We present the case of a 71-year-old woman who was diagnosed with bilateral knee osteoarthritis with varus deformity, more severe on the right side. After admission, right total knee arthroplasty was performed with the assistance of MR technology. Before the operation, a three-dimensional virtual model of the patient's knee joint was reconstructed for condition analysis, surgical planning, and operation simulation. During the operation, the three-dimensional virtual images of the femur and tibia were superimposed on the patient's real anatomy, showing the preoperatively designed osteotomy plane, which accurately guided the osteotomy and prosthesis implantation. Conclusions: To our knowledge, this is the first report of total knee arthroplasty performed under the guidance of mixed reality technology.
Affiliation(s)
- Shilong Su
- Department of Orthopedics, Xiangya Hospital, Central South University, Changsha, China
- Department of Orthopedics, The First Hospital of Changsha, Changsha, China
| | - Pengfei Lei
- Department of Orthopedics, Xiangya Hospital, Central South University, Changsha, China
- Department of Orthopedics, The First Affiliated Hospital, College of Medicine, Zhejiang University, Hangzhou, China
| | - Chenggong Wang
- Department of Orthopedics, Xiangya Hospital, Central South University, Changsha, China
| | - Fawei Gao
- Department of Orthopedics, Xiangya Hospital, Central South University, Changsha, China
| | - Da Zhong
- Department of Orthopedics, Xiangya Hospital, Central South University, Changsha, China
- *Correspondence: Da Zhong
| | - Yihe Hu
- Department of Orthopedics, Xiangya Hospital, Central South University, Changsha, China
- Department of Orthopedics, The First Affiliated Hospital, College of Medicine, Zhejiang University, Hangzhou, China
|
20
|
Leaping the Boundaries in Laparoscopic Liver Surgery for Hepatocellular Carcinoma. Cancers (Basel) 2022; 14:cancers14082012. [PMID: 35454921 PMCID: PMC9028003 DOI: 10.3390/cancers14082012] [Citation(s) in RCA: 9] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2022] [Revised: 04/07/2022] [Accepted: 04/12/2022] [Indexed: 02/08/2023] Open
Abstract
Simple Summary: Recent advances in surgical techniques and perioperative management have led to a redefinition of the actual frontiers of Laparoscopic Liver Resection (LLR) by including patients with more advanced disease. Nonetheless, because of both underlying liver conditions and technical difficulty, LLR for Hepatocellular Carcinoma (HCC) is still considered a challenging procedure. Specific concerns exist about LLR in cirrhotic patients, posterosuperior segments, giant and multiple tumors, as well as repeat resections. This review focuses on the specific limits of this approach in HCC patients in order to put into practice all the pre- and intraoperative precautions needed to overcome these boundaries, making this technique the standard of care within high-volume hepatobiliary centers. Abstract: The minimally invasive approach for hepatocellular carcinoma (HCC) had a slower diffusion compared to other surgical fields, mainly due to inherent peculiarities regarding the risks of uncontrollable bleeding, oncological inadequacy, and the need for both laparoscopic and major liver skills. Recently, laparoscopic liver resection (LLR) has been associated with an improved postoperative course, including reduced postoperative decompensation, intraoperative blood loss, and length of hospitalization, with unaltered oncological outcomes, leading to its adoption within international guidelines. However, LLR for HCC still faces several limitations, mainly linked to the impaired function of the underlying parenchyma, tumor size and number, and difficult tumor position. The aim of this review is to highlight the state of the art and future perspectives of LLR for HCC, focusing on key points for overcoming current limitations and pushing the boundaries of minimally invasive liver surgery (MILS).
|
21
|
Birlo M, Edwards PJE, Clarkson M, Stoyanov D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med Image Anal 2022; 77:102361. [PMID: 35168103 PMCID: PMC10466024 DOI: 10.1016/j.media.2022.102361] [Citation(s) in RCA: 19] [Impact Index Per Article: 9.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/23/2021] [Revised: 11/17/2021] [Accepted: 01/10/2022] [Indexed: 12/11/2022]
Abstract
This article presents a systematic review of optical see-through head mounted display (OST-HMD) usage in augmented reality (AR) surgery applications from 2013 to 2020. Articles were categorised by: OST-HMD device, surgical speciality, surgical application context, visualisation content, experimental design and evaluation, accuracy and human factors of human-computer interaction. 91 articles fulfilled all inclusion criteria. Some clear trends emerge. The Microsoft HoloLens increasingly dominates the field, with orthopaedic surgery being the most popular application (28.6%). By far the most common surgical context is surgical guidance (n=58) and segmented preoperative models dominate visualisation (n=40). Experiments mainly involve phantoms (n=43) or system setup (n=21), with patient case studies ranking third (n=19), reflecting the comparative infancy of the field. Experiments cover issues from registration to perception with very different accuracy results. Human factors emerge as significant to OST-HMD utility. Some factors are addressed by the systems proposed, such as attention shift away from the surgical site and mental mapping of 2D images to 3D patient anatomy. Other persistent human factors remain or are caused by OST-HMD solutions, including ease of use, comfort and spatial perception issues. The significant upward trend in published articles is clear, but such devices are not yet established in the operating room and clinical studies showing benefit are lacking. A focused effort addressing technical registration and perceptual factors in the lab coupled with design that incorporates human factors considerations to solve clear clinical problems should ensure that the significant current research efforts will succeed.
Affiliation(s)
- Manuel Birlo
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK.
| | - P J Eddie Edwards
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
| | - Matthew Clarkson
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
| | - Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
|
22
|
Reinschluessel AV, Muender T, Salzmann D, Döring T, Malaka R, Weyhe D. Virtual Reality for Surgical Planning – Evaluation Based on Two Liver Tumor Resections. Front Surg 2022; 9:821060. [PMID: 35296126 PMCID: PMC8919284 DOI: 10.3389/fsurg.2022.821060] [Citation(s) in RCA: 7] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/23/2021] [Accepted: 01/24/2022] [Indexed: 11/29/2022] Open
Abstract
Purpose: For complex cases, preoperative surgical planning is a standard procedure to ensure patient safety and keep the surgery time to a minimum. Based on the available information, such as MRI or CT images, and prior anatomical knowledge, surgeons create their own mental 3D model of the organ of interest. This is challenging, requires years of training, and an inherent uncertainty remains even for experienced surgeons. Goal: Virtual reality (VR) is by nature excellent at showing spatial relationships through its stereoscopic displays. It is therefore well suited to support the understanding of the individual anatomy of patient-specific 3D organ models generated from MRI or CT data. Utilizing this potential, we developed a VR surgical planning tool that provides a 3D view of the medical data for better spatial understanding and natural interaction with the data in 3D space. Following a user-centered design process, in this first user study we focus on usability, usefulness, and target-audience feedback. Thereby, we also investigate the impact the tool and the 3D presentation of the organ have on the surgical team's understanding of the 3D structures. Methods: We employed the VR prototype for surgical planning, using a standard VR setup, in two real cases of patients with liver tumors who were scheduled for surgery at a University Hospital for Visceral Surgery. Surgeons (N = 4) used the VR prototype before the surgery to plan the procedure in addition to their regular planning process. We used semi-structured interviews before and after the surgery to explore the benefits and pitfalls of VR surgical planning. Results: The participants took on average 14.3 min (SD = 3.59) to plan the cases in VR. The reported usability was good. Results from the interviews and observations suggest that planning in VR can be very beneficial for surgeons. They reported an improved spatial understanding of the individual anatomical structures and better identification of anatomical variants. Additionally, as the surgeons mentioned improved recall of the information and better identification of surgically relevant structures, the VR tool has the potential to improve the surgery and patient safety.
Affiliation(s)
- Anke V. Reinschluessel
- Digital Media Lab, University of Bremen, Bremen, Germany
- *Correspondence: Anke V. Reinschluessel
| | - Thomas Muender
- Digital Media Lab, University of Bremen, Bremen, Germany
| | - Daniela Salzmann
- University Hospital for Visceral Surgery, Pius-Hospital Oldenburg, Carl Von Ossietzky University Oldenburg, Oldenburg, Germany
| | - Tanja Döring
- Digital Media Lab, University of Bremen, Bremen, Germany
| | - Rainer Malaka
- Digital Media Lab, University of Bremen, Bremen, Germany
| | - Dirk Weyhe
- University Hospital for Visceral Surgery, Pius-Hospital Oldenburg, Carl Von Ossietzky University Oldenburg, Oldenburg, Germany
|
23
|
Trends in the Use of Augmented Reality, Virtual Reality, and Mixed Reality in Surgical Research: a Global Bibliometric and Visualized Analysis. Indian J Surg 2022; 84:52-69. [PMID: 35228782 PMCID: PMC8866921 DOI: 10.1007/s12262-021-03243-w] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/14/2021] [Accepted: 12/11/2021] [Indexed: 11/15/2022] Open
Abstract
There have been many major developments in the use of augmented reality (AR), virtual reality (VR), and mixed reality (MR) technologies in the context of global surgical research, yet few reports on the trends in this field have been published to date. This study was therefore designed to explore these worldwide trends in this clinically important field. Relevant studies published from 1 January 2009 through 13 October 2020 were retrieved from the Science Citation Index-Expanded (SCI-E) tool of the Web of Science database. Bibliometric techniques were then used to analyze the resultant data, with visual bibliographic coupling, co-authorship, co-citation, co-occurrence, and publication trend analyses subsequently being conducted with GraphPad Prism 8 and with the visualization of similarities (VOS) software tool. No patients or members of the public were involved in this study. In total, 6221 relevant studies were incorporated into this analysis. At a high level, clear global annual increases in the number of publications in this field were observed. The USA made the greatest contributions to this field over the studied period, with the highest H-index value, the most citations, and the greatest total link strength for analyzed publications. The country with the highest number of average citations per publication was Scotland. The Surgical Endoscopy And Other Interventional Techniques journal contributed the greatest number of publications in this field. The University of London was the institution that produced the greatest volume of research in this field. Overall, studies could be broadly classified into five clusters: Neurological Research, Surgical Techniques, Technological Products, Rehabilitative Medicine, and Clinical Therapy. The trends detected in the present analysis suggest that the number of global publications pertaining to the use of AR, VR, and MR techniques in surgical research is likely to increase in the coming years. Particular attention should be paid to emerging trends in related fields, including MR, extended reality, head-mounted displays, navigation, and holographic images.
|
24
|
Lu L, Wang H, Liu P, Liu R, Zhang J, Xie Y, Liu S, Huo T, Xie M, Wu X, Ye Z. Applications of Mixed Reality Technology in Orthopedics Surgery: A Pilot Study. Front Bioeng Biotechnol 2022; 10:740507. [PMID: 35273954 PMCID: PMC8902164 DOI: 10.3389/fbioe.2022.740507] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/13/2021] [Accepted: 01/21/2022] [Indexed: 12/28/2022] Open
Abstract
Objective: The aim of this study is to explore the potential of mixed reality (MR) technology in the visualization of orthopedic surgery. Methods: The visualization system with MR technology is widely used in orthopedic surgery. The system is composed of a 3D imaging workstation, a cloud platform, and an MR space station. An intelligent segmentation algorithm is adopted on the 3D imaging workstation to create a 3D anatomical model with zooming and rotation effects. This model is then exploited for efficient 3D reconstruction of data for computerized tomography (CT) and magnetic resonance imaging (MRI). Additionally, the model can be uploaded to the cloud platform for physical parameter tuning, model positioning, rendering and high-dimensional display. Using Microsoft’s HoloLens glasses in combination with the MR system, we project and view 3D holograms in real time under different clinical scenarios. After each procedure, nine surgeons completed a Likert-scale questionnaire on communication and understanding, spatial awareness and effectiveness of MR technology use. In addition to that, the National Aeronautics and Space Administration Task Load Index (NASA-TLX) is also used to evaluate the workload of MR hologram support. 
Results: 1) MR holograms can clearly show the 3D structures of bone fractures, improving the understanding of different fracture types and the design of treatment plans; 2) holograms with lifelike three-dimensional dynamic features provide an intuitive communication tool among doctors and between doctors and patients; 3) during surgery, a full lesion hologram can be obtained and blended in real time with a patient’s virtual 3D digital model to give surgeons superior visual guidance through novel high-dimensional “perspectives” of the surgical area; 4) hologram-based magnetic navigation improves the accuracy and safety of screw placement in orthopaedic surgery; 5) the combination of a mixed reality cloud platform and a 5G-based telemedicine system provides a new technology platform for telesurgery collaboration. The results of the qualitative study support the use of MR technology in orthopaedic surgery. Analysis of the Likert-scale questionnaire shows that MR adds significant value to understanding and communication, spatial awareness, learning, and effectiveness. Based on the NASA-TLX questionnaire results, mixed reality scored significantly lower in the “mental,” “temporal,” “performance,” and “frustration” categories compared with conventional 2D visualization. Conclusion: The integration of MR technology in orthopaedic surgery reduces dependence on surgeons’ experience and provides personalized 3D visualization models for accurate diagnosis and treatment of orthopaedic abnormalities. This integration is clearly one of the prominent future development directions in medical surgery.
Affiliation(s)
- Lin Lu
- Department of Orthopaedics Surgery, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Intelligent Medical Laboratory, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
| | - Honglin Wang
- Department of Orthopaedics Surgery, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Intelligent Medical Laboratory, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
| | - Pengran Liu
- Department of Orthopaedics Surgery, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Intelligent Medical Laboratory, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
| | - Rong Liu
- Department of Orthopaedic Surgery, Puren Hospital of Wuhan, Wuhan University of Science and Technology, Wuhan, China
| | - Jiayao Zhang
- Department of Orthopaedics Surgery, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Intelligent Medical Laboratory, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
| | - Yi Xie
- Department of Orthopaedics Surgery, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Intelligent Medical Laboratory, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
| | - Songxiang Liu
- Department of Orthopaedics Surgery, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Intelligent Medical Laboratory, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
| | - Tongtong Huo
- Intelligent Medical Laboratory, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
| | - Mao Xie
- Department of Orthopaedics Surgery, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
| | - Xinghuo Wu
- Department of Orthopaedics Surgery, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Intelligent Medical Laboratory, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- *Correspondence: Xinghuo Wu; Zhewei Ye
| | - Zhewei Ye
- Department of Orthopaedics Surgery, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Intelligent Medical Laboratory, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- *Correspondence: Xinghuo Wu; Zhewei Ye
| |
|
25
|
Tang ZN, Hu LH, Soh HY, Yu Y, Zhang WB, Peng X. Accuracy of Mixed Reality Combined With Surgical Navigation Assisted Oral and Maxillofacial Tumor Resection. Front Oncol 2022; 11:715484. [PMID: 35096559 PMCID: PMC8795771 DOI: 10.3389/fonc.2021.715484] [Citation(s) in RCA: 11] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/27/2021] [Accepted: 12/20/2021] [Indexed: 11/13/2022] Open
Abstract
OBJECTIVE To evaluate the feasibility and accuracy of mixed reality combined with surgical navigation in oral and maxillofacial tumor surgery. METHODS We retrospectively analyzed the data of seven patients with oral and maxillofacial tumors who underwent surgery between January 2019 and January 2021 using a combination of mixed reality and surgical navigation. Virtual surgical planning and the navigation plan were based on preoperative CT datasets. Through an IGT-Link port, the mixed reality workstation was synchronized with the surgical navigation system, and the surgical planning data were transferred to the mixed reality workstation. Osteotomy lines were marked with the aid of both surgical navigation and mixed reality images visualized through HoloLens. Frozen section examination was used to ensure negative surgical margins. Postoperative CT datasets were obtained 1 week after surgery, and chromatographic analysis of virtual and actual osteotomies was carried out. Patients received standard oncological postoperative follow-up. RESULTS Of the seven patients, four had maxillary tumors and three had mandibular tumors, with a total of 13 osteotomy planes. The mean deviation between the planned and actual osteotomy planes was 1.68 ± 0.92 mm; the maximum deviation was 3.46 mm. Chromatographic analysis showed an error of ≤3 mm for 80.16% of the points. The mean deviations of the maxillary and mandibular osteotomy lines were comparable (1.60 ± 0.93 mm vs. 1.86 ± 0.93 mm). Five patients had benign tumors and two had malignant tumors; the mean deviations of the osteotomy lines were comparable between these groups (1.48 ± 0.74 mm vs. 2.18 ± 0.77 mm). Intraoperative frozen pathology confirmed negative resection margins in all cases. No tumor recurrence or complications occurred during a mean follow-up of 15.7 months (range, 6-26 months). 
CONCLUSION The combination of mixed reality technology and surgical navigation appears to be feasible, safe, and effective for tumor resection in the oral and maxillofacial region.
Affiliation(s)
- Zu-Nan Tang
- Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology & National Center of Stomatology & National Clinical Research Center for Oral Diseases & National Engineering Research Center of Oral Biomaterials and Digital Medical Devices, Beijing, China
| | - Lei-Hao Hu
- Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology & National Center of Stomatology & National Clinical Research Center for Oral Diseases & National Engineering Research Center of Oral Biomaterials and Digital Medical Devices, Beijing, China
| | - Hui Yuh Soh
- Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology & National Center of Stomatology & National Clinical Research Center for Oral Diseases & National Engineering Research Center of Oral Biomaterials and Digital Medical Devices, Beijing, China; Department of Oral and Maxillofacial Surgery, Faculty of Dentistry, Universiti Kebangsaan Malaysia, Kuala Lumpur, Malaysia
| | - Yao Yu
- Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology & National Center of Stomatology & National Clinical Research Center for Oral Diseases & National Engineering Research Center of Oral Biomaterials and Digital Medical Devices, Beijing, China
| | - Wen-Bo Zhang
- Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology & National Center of Stomatology & National Clinical Research Center for Oral Diseases & National Engineering Research Center of Oral Biomaterials and Digital Medical Devices, Beijing, China
| | - Xin Peng
- Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology & National Center of Stomatology & National Clinical Research Center for Oral Diseases & National Engineering Research Center of Oral Biomaterials and Digital Medical Devices, Beijing, China
| |
|
26
|
Chen Z, Zhang Y, Yan Z, Dong J, Cai W, Ma Y, Jiang J, Dai K, Liang H, He J. Artificial intelligence assisted display in thoracic surgery: development and possibilities. J Thorac Dis 2022; 13:6994-7005. [PMID: 35070382 PMCID: PMC8743398 DOI: 10.21037/jtd-21-1240] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/29/2021] [Accepted: 11/02/2021] [Indexed: 12/24/2022]
Abstract
In this golden age of rapid development of artificial intelligence (AI), researchers and surgeons have realized that AI can contribute to healthcare in all aspects, especially in surgery. The popularity of low-dose computed tomography (LDCT) and the improvement of video-assisted thoracoscopic surgery (VATS) bring not only opportunities for thoracic surgery but also challenges on the way forward. Precisely localizing lung nodules preoperatively, accurately identifying anatomical structures intraoperatively, and avoiding complications require a visual display of an individual’s specific anatomy for surgical simulation and assistance. With the advance of AI-assisted display technologies, including 3D reconstruction/3D printing, virtual reality (VR), augmented reality (AR), and mixed reality (MR), computed tomography (CT) imaging in thoracic surgery has been fully utilized to transform 2D images into 3D models, which facilitates surgical teaching, planning, and simulation. AI-assisted display based on surgical videos is a new surgical application that is still in its infancy. Notably, it has potential applications in thoracic surgery education, surgical quality evaluation, intraoperative assistance, and postoperative analysis. In this review, we illustrate the current AI-assisted display applications based on CT in thoracic surgery, focus on emerging AI applications based on surgical videos by reviewing the relevant research in other surgical fields, and anticipate their potential development in thoracic surgery.
Affiliation(s)
- Zhuxing Chen
- Department of Thoracic Surgery and Oncology, the First Affiliated Hospital of Guangzhou Medical University, National Center for Respiratory Medicine, State Key Laboratory of Respiratory Disease, National Clinical Research Center for Respiratory Disease, Guangzhou Institute of Respiratory Health, Guangzhou, China
| | - Yudong Zhang
- Department of Thoracic Surgery, the First Affiliated Hospital of Sun Yat-sen University, Guangzhou, China
| | - Zeping Yan
- Department of Thoracic Surgery and Oncology, the First Affiliated Hospital of Guangzhou Medical University, National Center for Respiratory Medicine, State Key Laboratory of Respiratory Disease, National Clinical Research Center for Respiratory Disease, Guangzhou Institute of Respiratory Health, Guangzhou, China; Guangdong Association of Thoracic Diseases, Guangzhou, China
| | - Junguo Dong
- Department of Thoracic Surgery and Oncology, the First Affiliated Hospital of Guangzhou Medical University, National Center for Respiratory Medicine, State Key Laboratory of Respiratory Disease, National Clinical Research Center for Respiratory Disease, Guangzhou Institute of Respiratory Health, Guangzhou, China
| | - Weipeng Cai
- Department of Thoracic Surgery and Oncology, the First Affiliated Hospital of Guangzhou Medical University, National Center for Respiratory Medicine, State Key Laboratory of Respiratory Disease, National Clinical Research Center for Respiratory Disease, Guangzhou Institute of Respiratory Health, Guangzhou, China
| | - Yongfu Ma
- Department of Thoracic Surgery, the First Medical Centre, Chinese PLA General Hospital, Beijing, China
| | - Jipeng Jiang
- Department of Thoracic Surgery, the First Medical Centre, Chinese PLA General Hospital, Beijing, China
| | - Keyao Dai
- Department of Cardiothoracic Surgery, The Affiliated Hospital of Guangdong Medical University, Zhanjiang, China
| | - Hengrui Liang
- Department of Thoracic Surgery and Oncology, the First Affiliated Hospital of Guangzhou Medical University, National Center for Respiratory Medicine, State Key Laboratory of Respiratory Disease, National Clinical Research Center for Respiratory Disease, Guangzhou Institute of Respiratory Health, Guangzhou, China
| | - Jianxing He
- Department of Thoracic Surgery and Oncology, the First Affiliated Hospital of Guangzhou Medical University, National Center for Respiratory Medicine, State Key Laboratory of Respiratory Disease, National Clinical Research Center for Respiratory Disease, Guangzhou Institute of Respiratory Health, Guangzhou, China
| |
|
27
|
Tukra S, Lidströmer N, Ashrafian H, Giannarou S. AI in Surgical Robotics. Artif Intell Med 2022. [DOI: 10.1007/978-3-030-64573-1_323] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
|
28
|
Comment on: "Intraoperative 3D Hologram Support With Mixed Reality Techniques in Liver Surgery". Ann Surg 2021; 274:e761-e762. [PMID: 32649464 DOI: 10.1097/sla.0000000000004157] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/08/2023]
|
29
|
Huettl F, Saalfeld P, Hansen C, Preim B, Poplawski A, Kneist W, Lang H, Huber T. Virtual reality and 3D printing improve preoperative visualization of 3D liver reconstructions-results from a preclinical comparison of presentation modalities and user's preference. Ann Transl Med 2021; 9:1074. [PMID: 34422986 PMCID: PMC8339861 DOI: 10.21037/atm-21-512] [Citation(s) in RCA: 21] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/01/2021] [Accepted: 03/17/2021] [Indexed: 12/20/2022]
Abstract
Background Preoperative three-dimensional (3D) reconstructions for liver surgery planning have been shown to be effective in reducing blood loss and operation time. However, the role of the presentation modality is not well investigated. We present the first study to compare 3D PDFs, 3D-printed models (PR), and virtual reality (VR) 3D models with regard to anatomical orientation and personal preference in a high-volume liver surgery center. Methods Thirty participants (10 medical students, 10 residents, 5 fellows, and 5 hepatopancreatobiliary (HPB) experts) assigned the tumor-bearing segments of 20 different patient-specific liver reconstructions. Liver models were presented in all modalities in random order, and the time needed to specify the tumor location was recorded. In addition, a score was calculated factoring in correct, wrong, and missing segment assignments. Furthermore, standardized tests/questionnaires on spatial thinking and seeing, vegetative side effects, and usability were completed. Results Participants named significantly more correct segments in VR (P=0.040) or PR (P=0.036) than with PDF. Tumor assignment was significantly faster with 3D PR models than with 3D PDF (P<0.001) or the VR application (P<0.001). Regardless of the modality, HPB experts were significantly faster (24±8 vs. 35±11 sec; P=0.014) and more often correct (0.87±0.12 vs. 0.83±0.15; P<0.001) than medical students. Test results for spatial thinking and seeing influenced not the time required but the correctness of tumor assignment. Regarding usability and user experience, the VR application achieved the highest scores without causing significant vegetative symptoms and was also the most preferred modality (n=22, 73.3%) because of its multiple functions, such as scaling and changing transparency. Ninety percent of participants (n=27) stated that this application can positively influence operation planning. 
Conclusions 3D PR models and 3D VR models enable better and partially faster anatomical orientation than reconstructions presented as 3D PDFs. Users preferred the VR application over the PR models and PDFs. A prospective trial is needed to evaluate the different presentation modalities with regard to intra- and postoperative outcomes.
Affiliation(s)
- Florentine Huettl
- Department of General, Visceral and Transplant Surgery, University Medical Center of the Johannes Gutenberg-University Mainz, Mainz, Germany
| | - Patrick Saalfeld
- Institute of Simulation and Graphics, Faculty of Computer Science, Otto-von-Guericke University Magdeburg, Magdeburg, Germany
| | - Christian Hansen
- Institute of Simulation and Graphics, Faculty of Computer Science, Otto-von-Guericke University Magdeburg, Magdeburg, Germany
| | - Bernhard Preim
- Institute of Simulation and Graphics, Faculty of Computer Science, Otto-von-Guericke University Magdeburg, Magdeburg, Germany
| | - Alicia Poplawski
- Institute of Medical Biostatistics, Epidemiology and Informatics (IMBEI), University Medical Center of the Johannes Gutenberg-University Mainz, Mainz, Germany
| | - Werner Kneist
- Department of General, Visceral and Transplant Surgery, University Medical Center of the Johannes Gutenberg-University Mainz, Mainz, Germany; Department of General and Visceral Surgery, St. Georg Hospital, Eisenach, Germany
| | - Hauke Lang
- Department of General, Visceral and Transplant Surgery, University Medical Center of the Johannes Gutenberg-University Mainz, Mainz, Germany
| | - Tobias Huber
- Department of General, Visceral and Transplant Surgery, University Medical Center of the Johannes Gutenberg-University Mainz, Mainz, Germany
| |
|
30
|
López-Ojeda W, Hurley RA. Extended-Reality Technologies: An Overview of Emerging Applications in Medical Education and Clinical Care. J Neuropsychiatry Clin Neurosci 2021; 33:A4-177. [PMID: 34289698 DOI: 10.1176/appi.neuropsych.21030067] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
Affiliation(s)
- Wilfredo López-Ojeda
- Veterans Affairs Mid-Atlantic Mental Illness Research, Education, and Clinical Center, and Research and Academic Affairs Service Line, W.G. Hefner Veterans Affairs Medical Center, Salisbury, N.C. (López-Ojeda, Hurley); Department of Psychiatry and Behavioral Medicine, Wake Forest School of Medicine, Winston-Salem, N.C. (López-Ojeda); Departments of Psychiatry and Radiology, Wake Forest School of Medicine, Winston-Salem, N.C. (Hurley); and Menninger Department of Psychiatry and Behavioral Sciences, Baylor College of Medicine, Houston (Hurley)
| | - Robin A Hurley
- Veterans Affairs Mid-Atlantic Mental Illness Research, Education, and Clinical Center, and Research and Academic Affairs Service Line, W.G. Hefner Veterans Affairs Medical Center, Salisbury, N.C. (López-Ojeda, Hurley); Department of Psychiatry and Behavioral Medicine, Wake Forest School of Medicine, Winston-Salem, N.C. (López-Ojeda); Departments of Psychiatry and Radiology, Wake Forest School of Medicine, Winston-Salem, N.C. (Hurley); and Menninger Department of Psychiatry and Behavioral Sciences, Baylor College of Medicine, Houston (Hurley)
| |
|
31
|
Hilt AD, Hierck BP, Eijkenduijn J, Wesselius FJ, Albayrak A, Melles M, Schalij MJ, Scherptong RWC. Development of a patient-oriented Hololens application to illustrate the function of medication after myocardial infarction. Eur Heart J Digit Health 2021; 2:511-520. [PMID: 36713611 PMCID: PMC9707881 DOI: 10.1093/ehjdh/ztab053] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 01/25/2021] [Revised: 04/25/2021] [Accepted: 06/08/2021] [Indexed: 02/07/2023]
Abstract
Aims Statin treatment is one of the hallmarks of secondary prevention after myocardial infarction. Adherence to statins tends to be difficult and can be improved by patient education. Novel technologies such as mixed reality (MR) expand the possibilities to support this process. The aim was to assess whether an MR medication application supports patient education focused on the function of statins after myocardial infarction. Methods and results A human-centred design approach was used to develop an MR statin tool for Microsoft HoloLens™. Twenty-two myocardial infarction patients were enrolled; 12 tested the application and 10 served as controls. Clinical, demographic, and qualitative data were obtained, and all patients performed a test on statin knowledge. To test whether a higher tendency to become involved in virtual environments affected the test outcome in the intervention group, the validated Presence Questionnaire (PQ) and Immersive Tendencies Questionnaire (ITQ) were used. Twenty-two myocardial infarction patients (ST-elevation myocardial infarction, 18/22, 82%) completed the study. Ten of the 12 patients (83%) in the intervention group improved their statin knowledge by using the MR application (median 8 points, IQR 8). Test improvement was mainly the result of increased understanding of statin mechanisms in the body and of secondary preventive effects. A high tendency to become involved and focused in virtual environments was moderately positively correlated with greater test improvement (r = 0.57, P < 0.05). The median post-test score in the control group was poor (median 6 points, IQR 4). Conclusions An MR statin education application can be applied effectively in myocardial infarction patients to explain statin function and importance.
Affiliation(s)
- Alexander D Hilt
- Department of Cardiology, Leiden University Medical Center, PO Box 9600, 2300 RC Leiden, The Netherlands
| | - Beerend P Hierck
- Leiden University Medical Center, Center for Innovation of Medical Education, Albinusdreef 2, 2333 ZA Leiden, The Netherlands; Leiden University, Teachers Academy, Albinusdreef 2, 2333 ZA Leiden, The Netherlands
| | - Joep Eijkenduijn
- Faculty of Technical Medicine, Delft University of Technology, Landbergstraat 15, 2628 CE Delft, The Netherlands
| | - Fons J Wesselius
- Faculty of Technical Medicine, Delft University of Technology, Landbergstraat 15, 2628 CE Delft, The Netherlands
| | - Armagan Albayrak
- Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628 CE Delft, The Netherlands
| | - Marijke Melles
- Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628 CE Delft, The Netherlands; Department of Public and Occupational Health, Amsterdam University Medical Centre, Vrije Universiteit Amsterdam, De Boelelaan 1117, 1118, 1081 HV Amsterdam, The Netherlands
| | - Martin J Schalij
- Department of Cardiology, Leiden University Medical Center, PO Box 9600, 2300 RC Leiden, The Netherlands
| | - Roderick W C Scherptong
- Department of Cardiology, Leiden University Medical Center, PO Box 9600, 2300 RC Leiden, The Netherlands; Corresponding author. Tel: +31 71 5262020,
| |
|
32
|
Velazco-Garcia JD, Navkar NV, Balakrishnan S, Younes G, Abi-Nahed J, Al-Rumaihi K, Darweesh A, Elakkad MSM, Al-Ansari A, Christoforou EG, Karkoub M, Leiss EL, Tsiamyrtzis P, Tsekos NV. Evaluation of how users interface with holographic augmented reality surgical scenes: Interactive planning MR-Guided prostate biopsies. Int J Med Robot 2021; 17:e2290. [PMID: 34060214 DOI: 10.1002/rcs.2290] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/19/2020] [Revised: 05/04/2021] [Accepted: 05/27/2021] [Indexed: 12/15/2022]
Abstract
BACKGROUND User interfaces play a vital role in the planning and execution of an interventional procedure. The objective of this study is to investigate the effect of using different user interfaces for planning transrectal robot-assisted MR-guided prostate biopsy (MRgPBx) in an augmented reality (AR) environment. METHOD End-user studies were conducted by simulating an MRgPBx system with end- and side-firing modes. The information from the system to the operator was rendered on HoloLens as the output interface. A joystick, mouse/keyboard, and holographic menus were used as input interfaces to the system. RESULTS The studies indicated that using a joystick improved the interactive capacity and enabled the operator to plan MRgPBx in less time; it efficiently captures the operator's commands to manipulate the augmented environment representing the state of the MRgPBx system. CONCLUSIONS The study demonstrates an alternative to conventional input interfaces for interacting with and manipulating an AR environment within the context of MRgPBx planning.
Affiliation(s)
| | - Nikhil V Navkar
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
| | | | - Georges Younes
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
| | | | | | - Adham Darweesh
- Department of Clinical Imaging, Hamad Medical Corporation, Doha, Qatar
| | | | | | | | - Mansour Karkoub
- Department of Mechanical Engineering, Texas A&M University-Qatar, Doha, Qatar
| | - Ernst L Leiss
- Department of Computer Science, University of Houston, Houston, Texas, USA
| | | | - Nikolaos V Tsekos
- Department of Computer Science, University of Houston, Houston, Texas, USA
| |
|
33
|
Ito K, Sugimoto M, Tsunoyama T, Nagao T, Kondo H, Nakazawa K, Tomonaga A, Miyake Y, Sakamoto T. A trauma patient care simulation using extended reality technology in the hybrid emergency room system. J Trauma Acute Care Surg 2021; 90:e108-e112. [PMID: 33797500 DOI: 10.1097/ta.0000000000003086] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
Affiliation(s)
- Kaori Ito
- From the Division of Acute Care Surgery, Department of Emergency Medicine (K.I., T.T., T.N., K.N., A.T., Y.M., T.S.), Teikyo University School of Medicine; Okinaga Research Institution (M.S.), Teikyo University; and Department of Radiology (M.S.), Teikyo University School of Medicine, Tokyo, Japan
|
34
|
Fu R, Zhang C, Zhang T, Chu XP, Tang WF, Yang XN, Huang MP, Zhuang J, Wu YL, Zhong WZ. A three-dimensional printing navigational template combined with mixed reality technique for localizing pulmonary nodules. Interact Cardiovasc Thorac Surg 2021; 32:552-559. [PMID: 33751118 PMCID: PMC8923295 DOI: 10.1093/icvts/ivaa300] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/23/2020] [Revised: 10/20/2020] [Accepted: 10/27/2020] [Indexed: 02/05/2023] Open
Abstract
OBJECTIVES Localizing non-palpable pulmonary nodules is challenging for thoracic surgeons. Here, we investigated the accuracy of three-dimensional (3D) printing technology combined with mixed reality (MR) for localizing ground-glass opacity-dominant pulmonary nodules. METHODS In this single-arm study, we prospectively enrolled patients with small pulmonary nodules (<2 cm) that required accurate localization. A 3D-printed physical navigational template was designed based on the reconstruction of computed tomography images, and a 3D model was generated through the MR glasses. We set the deviation distance as the primary end point for efficacy evaluation. Clinicopathological and surgical data were obtained for further analysis. RESULTS Sixteen patients with 17 non-palpable pulmonary nodules were enrolled in this study. Sixteen nodules were localized successfully (16/17; 94.1%) using this novel approach, with a median deviation of 9 mm. The mean time required for localization was 25 ± 5.2 min. For nodules in the upper/middle lobes and the lower lobes, the median deviations were 6 mm (range, 0-12.0) and 16 mm (range, 15.0-20.0), respectively; the difference between the groups was significant (Z = -2.957, P = 0.003). The pathological evaluation of resection margins was negative. CONCLUSIONS The 3D-printed navigational template combined with MR can be a feasible approach for localizing pulmonary nodules.
Affiliation(s)
- Rui Fu
- Guangdong Lung Cancer Institute, Guangdong Provincial People’s Hospital, Guangdong Academy of Medical Sciences, Guangzhou, China
- Shantou University Medical College, Shantou, China
| | - Chao Zhang
- Guangdong Lung Cancer Institute, Guangdong Provincial People’s Hospital, Guangdong Academy of Medical Sciences, Guangzhou, China
| | - Tao Zhang
- Guangdong Lung Cancer Institute, Guangdong Provincial People’s Hospital, Guangdong Academy of Medical Sciences, Guangzhou, China
- Shantou University Medical College, Shantou, China
| | - Xiang-Peng Chu
- Guangdong Lung Cancer Institute, Guangdong Provincial People’s Hospital, Guangdong Academy of Medical Sciences, Guangzhou, China
| | - Wen-Fang Tang
- Guangdong Lung Cancer Institute, Guangdong Provincial People’s Hospital, Guangdong Academy of Medical Sciences, Guangzhou, China
- Shantou University Medical College, Shantou, China
| | - Xue-Ning Yang
- Guangdong Lung Cancer Institute, Guangdong Provincial People’s Hospital, Guangdong Academy of Medical Sciences, Guangzhou, China
| | - Mei-Ping Huang
- Department of Catheterization Lab, Guangdong Cardiovascular Institute, Guangdong Provincial Key Laboratory of South China Structural Heart Disease, Guangdong Provincial People's Hospital, Guangdong Academy of Medical Sciences, Guangzhou, China
| | - Jian Zhuang
- Department of Cardiac Surgery, Guangdong Cardiovascular Institute, Guangdong Provincial Key Laboratory of South China Structural Heart Disease, Guangdong Provincial People's Hospital, Guangdong Academy of Medical Sciences, School of Medicine, South China University of Technology, Guangzhou, China
| | - Yi-Long Wu
- Guangdong Lung Cancer Institute, Guangdong Provincial People’s Hospital, Guangdong Academy of Medical Sciences, Guangzhou, China
| | - Wen-Zhao Zhong
- Guangdong Lung Cancer Institute, Guangdong Provincial People’s Hospital, Guangdong Academy of Medical Sciences, Guangzhou, China
- Corresponding author. Guangdong Lung Cancer Institute, Guangdong Provincial People’s Hospital, Guangdong Academy of Medical Sciences, Guangzhou 510080, China. Tel: +86-20-83877855; fax: +86-20-83844620; e-mail: (W.-Z. Zhong)
| |
|
35
|
ZHU H, LI Y, GONG G, ZHAO MX, LIU L, YAO SY, WANG C, LI X, CHEN YD. A world's first attempt of mixed-reality system guided inferior vena cava filter implantation under remote guidance of 5G communication. J Geriatr Cardiol 2021; 18:233-237. [PMID: 33907553 PMCID: PMC8047186 DOI: 10.11909/j.issn.1671-5411.2021.03.008] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 05/07/2023] Open
Affiliation(s)
- Hang ZHU
- Department of Cardiology, Chinese PLA General Hospital, Beijing, China
| | - Yao LI
- Medical School of Chinese PLA, Chinese PLA General Hospital, Beijing, China
| | - Guang GONG
- Department of Vascular Surgery, The No.2 People’s Hospital of Yibin, Sichuan, China
| | - Mao-Xiang ZHAO
- Department of Cardiology, Chinese PLA General Hospital, Beijing, China
- Medical School of Chinese PLA, Chinese PLA General Hospital, Beijing, China
| | - Lin LIU
- Beijing Visual 3D Medical Science and Technology Development, CO. LLC, Beijing, China
| | - Si-Yu YAO
- Department of Cardiology, Chinese PLA General Hospital, Beijing, China
| | - Chi WANG
- Medical School of Chinese PLA, Chinese PLA General Hospital, Beijing, China
| | - Xin LI
- Beijing Visual 3D Medical Science and Technology Development, CO. LLC, Beijing, China
| | - Yun-Dai CHEN
- Department of Cardiology, Chinese PLA General Hospital, Beijing, China
| |
Collapse
|
36
|
Using virtual 3D-models in surgical planning: workflow of an immersive virtual reality application in liver surgery. Langenbecks Arch Surg 2021; 406:911-915. [PMID: 33710462 PMCID: PMC8106601 DOI: 10.1007/s00423-021-02127-7] [Citation(s) in RCA: 19] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/16/2020] [Accepted: 02/08/2021] [Indexed: 12/21/2022]
Abstract
Purpose Three-dimensional (3D) surgical planning is widely accepted in liver surgery. Currently, the 3D reconstructions are usually presented as 3D PDF data on regular monitors. 3D-printed liver models are sometimes used for education and planning. Methods We developed an immersive virtual reality (VR) application that enables the presentation of preoperative 3D models. The 3D reconstructions are exported as STL files and easily imported into the application, which creates the virtual model automatically. The presentation is possible on “OpenVR”-ready VR headsets. VR controllers are used to interact with the 3D liver model. Scaling is possible, as is changing the opacity from invisible through transparent to fully opaque. In addition, the surgeon can draw potential resection lines on the surface of the liver. All these functions can be used in single- or multi-user mode. Results Five highly experienced HPB surgeons of our department evaluated the VR application after using it for the very first time and considered it helpful, with a “System Usability Scale” (SUS) score of 76.6%. The subitem “necessary learning effort” in particular showed that the application is easy to use. Conclusion We introduce an immersive, interactive presentation of medical volume data for preoperative 3D liver surgery planning. The application is easy to use and may have advantages over 3D PDF and 3D print in preoperative liver surgery planning. Prospective trials are needed to evaluate the optimal presentation mode of 3D liver models. Supplementary Information The online version contains supplementary material available at 10.1007/s00423-021-02127-7.
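The workflow above hinges on exchanging 3D reconstructions as STL files. The application described in the paper is not public, but the binary STL layout it relies on is simple enough to illustrate with the Python standard library. A minimal sketch (real pipelines would more likely use a mesh library such as numpy-stl or trimesh):

```python
import struct

def read_binary_stl(path):
    """Read a binary STL file; return a list of triangles, each a tuple
    (normal, v1, v2, v3) where every element is a 3-float tuple."""
    with open(path, "rb") as f:
        f.read(80)                                 # 80-byte header (ignored)
        (n_tri,) = struct.unpack("<I", f.read(4))  # little-endian triangle count
        tris = []
        for _ in range(n_tri):
            # each record: 12 float32 (normal + 3 vertices) + uint16 attribute = 50 bytes
            rec = struct.unpack("<12fH", f.read(50))
            tris.append((rec[0:3], rec[3:6], rec[6:9], rec[9:12]))
    return tris
```

An application like the one described would load such triangles into its rendering engine and build the virtual liver model from them.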
Collapse
|
37
|
Golse N, Petit A, Lewin M, Vibert E, Cotin S. Augmented Reality during Open Liver Surgery Using a Markerless Non-rigid Registration System. J Gastrointest Surg 2021; 25:662-671. [PMID: 32040812 DOI: 10.1007/s11605-020-04519-4] [Citation(s) in RCA: 30] [Impact Index Per Article: 10.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 09/06/2019] [Accepted: 01/10/2020] [Indexed: 01/31/2023]
Abstract
INTRODUCTION Intraoperative navigation during liver resection remains difficult and requires strong radiologic skills because liver anatomy is complex and patient-specific. Augmented reality (AR) during open liver surgery could help guide hepatectomies and optimize resection margins but faces many challenges when large parenchymal deformations take place. We aimed to experiment with a new vision-based AR system to assess its clinical feasibility and anatomical accuracy. PATIENTS AND METHODS Based on preoperative CT scan 3-D segmentations, we applied a non-rigid registration method integrating a physics-based elastic model of the liver, computed in real time using an efficient finite element method. To fit the actual deformations, the model was driven by data provided by a single RGB-D camera. Five livers were considered in this experiment. In vivo AR was performed during hepatectomy (n = 4), with manual handling of the livers resulting in large, realistic deformations. The ex vivo experiment (n = 1) consisted of repeated CT scans of an explanted whole organ carrying internal metallic landmarks, in fixed deformations, and allowed us to analyze the estimated deformations and quantify spatial errors. RESULTS In vivo AR tests were successfully achieved in all patients, with a fast and agile setup installation (< 10 min) and real-time overlay of the virtual anatomy onto the surgical field displayed on an external screen. In addition, the ex vivo quantification demonstrated a 7.9 mm root mean square error for the registration of internal landmarks. CONCLUSION These first experiments with a markerless AR system provided promising results, requiring very little equipment and setup time while providing real-time AR with satisfactory 3D accuracy. These results must be confirmed in a larger prospective study to definitively assess the impact of such minimally invasive technology on pathological margins and oncological outcomes.
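The 7.9 mm figure above is a root mean square (RMS) error over paired internal landmarks. The authors' evaluation code is not public, but the metric itself reduces to a few lines of Python; a sketch assuming paired lists of estimated and measured 3-D landmark positions:

```python
import math

def rms_error(estimated, measured):
    """Root mean square of the Euclidean distances between paired 3-D landmarks.
    Each argument is a sequence of (x, y, z) tuples in the same order."""
    if len(estimated) != len(measured):
        raise ValueError("landmark lists must be paired")
    sq_dists = [sum((a - b) ** 2 for a, b in zip(p, q))
                for p, q in zip(estimated, measured)]
    return math.sqrt(sum(sq_dists) / len(sq_dists))
```

For example, a single landmark displaced by a 3-4-0 offset yields an RMS error of 5.0.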
Collapse
Affiliation(s)
- Nicolas Golse
- Department of Surgery, Paul-Brousse Hospital, Assistance Publique Hôpitaux de Paris, Centre Hépato-Biliaire, 12 Avenue Paul Vaillant Couturier, 94804, Villejuif Cedex, France; DHU Hepatinov, 94800, Villejuif, France; INSERM, Unit 1193, 94800, Villejuif, France; Univ Paris-Sud, UMR-S 1193, 94800, Villejuif, France; Inria, Strasbourg, France
| | | | - Maïté Lewin
- Department of Radiology, Paul-Brousse Hospital, Assistance Publique Hôpitaux de Paris, Centre Hépato-Biliaire, 94800, Villejuif, France
| | - Eric Vibert
- Department of Surgery, Paul-Brousse Hospital, Assistance Publique Hôpitaux de Paris, Centre Hépato-Biliaire, 12 Avenue Paul Vaillant Couturier, 94804, Villejuif Cedex, France; DHU Hepatinov, 94800, Villejuif, France; INSERM, Unit 1193, 94800, Villejuif, France; Univ Paris-Sud, UMR-S 1193, 94800, Villejuif, France
| | | |
Collapse
|
38
|
Abstract
OBJECTIVE We present a series of cases in which we used 3D printing in the planning of complex liver surgery. BACKGROUND In liver surgery, three-dimensional reconstruction of the liver anatomy, in particular of vascular structures, has been shown to be helpful in operation planning. So far, 3D printing has only rarely been used for medical applications. METHODS AND PATIENTS From December 2017 to December 2019, in 10 cases where surgery was assumed to be challenging, operation planning was performed using full-size 3D prints in addition to standard 3-phase CT scans. Models included transparent parenchyma, hepatic veins, vena cava, portal vein, hepatic artery, biliary tree (if requested), and tumors. In 7/10 cases vascular reconstructions were needed during the procedure. Unstructured feedback from the surgical team revealed that the major benefit was visualization of the critical areas of vascular reconstruction, the expected dimensions of tangential vascular infiltration, and the planning of reconstruction. In multifocal tumors, 3D prints were considered helpful for intraoperative orientation to detect metastases and to improve planning of the resection. CONCLUSIONS In complex liver surgery with a potential need for vascular reconstruction, operation planning may be optimized using a 3D-printed liver model. Prospective studies are needed to evaluate the clinical impact of 3D printing in liver surgery compared to other 3D visualizations.
Collapse
|
39
|
Bari H, Wadhwani S, Dasari BVM. Role of artificial intelligence in hepatobiliary and pancreatic surgery. World J Gastrointest Surg 2021; 13:7-18. [PMID: 33552391 PMCID: PMC7830072 DOI: 10.4240/wjgs.v13.i1.7] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 10/28/2020] [Revised: 12/08/2020] [Accepted: 12/17/2020] [Indexed: 02/06/2023] Open
Abstract
Over the past decade, enhanced preoperative imaging and visualization, improved delineation of the complex anatomical structures of the liver and pancreas, and intra-operative technological advances have helped deliver liver and pancreatic surgery with increased safety and better postoperative outcomes. Artificial intelligence (AI) has a major role to play in 3D visualization, virtual simulation, and augmented reality, which help in the training of surgeons and the future delivery of conventional, laparoscopic, and robotic hepatobiliary and pancreatic (HPB) surgery; artificial neural networks and machine learning have the potential to revolutionize individualized patient care during preoperative imaging and postoperative surveillance. In this paper, we review the existing evidence and outline the potential for applying AI in the perioperative care of patients undergoing HPB surgery.
Collapse
Affiliation(s)
- Hassaan Bari
- Department of HPB and Liver Transplantation Surgery, Queen Elizabeth Hospital, Birmingham B15 2TH, United Kingdom
| | - Sharan Wadhwani
- Department of Radiology, Queen Elizabeth Hospital, Birmingham B15 2TH, United Kingdom
| | - Bobby V M Dasari
- Department of HPB and Liver Transplantation Surgery, Queen Elizabeth Hospital, Birmingham B15 2TH, United Kingdom
| |
Collapse
|
40
|
Tukra S, Lidströmer N, Ashrafian H, Giannarou S. AI in Surgical Robotics. Artif Intell Med 2021. [DOI: 10.1007/978-3-030-58080-3_323-1] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/07/2023]
|
41
|
Lungu AJ, Swinkels W, Claesen L, Tu P, Egger J, Chen X. A review on the applications of virtual reality, augmented reality and mixed reality in surgical simulation: an extension to different kinds of surgery. Expert Rev Med Devices 2020; 18:47-62. [PMID: 33283563 DOI: 10.1080/17434440.2021.1860750] [Citation(s) in RCA: 51] [Impact Index Per Article: 12.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/12/2022]
Abstract
Background: Research proves that the apprenticeship model, which is the gold standard for training surgical residents, is obsolete. For that reason, there is a continuing effort toward the development of high-fidelity surgical simulators to replace the apprenticeship model. Applying Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) in surgical simulators increases the fidelity, level of immersion, and overall experience of these simulators. Areas covered: The objective of this review is to provide a comprehensive overview of the application of VR, AR, and MR for distinct surgical disciplines, including maxillofacial surgery and neurosurgery. The current developments in these areas, as well as potential future directions, are discussed. Expert opinion: The key components for incorporating VR into surgical simulators are visual and haptic rendering. These components ensure that the user is completely immersed in the virtual environment and can interact in the same way as in the physical world. The key components for the application of AR and MR in surgical simulators include the tracking system as well as the visual rendering. The advantages of these surgical simulators are the ability to perform user evaluations and to increase the training frequency of surgical residents.
Collapse
Affiliation(s)
- Abel J Lungu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
| | - Wout Swinkels
- Computational Sensing Systems, Department of Engineering Technology, Hasselt University, Diepenbeek, Belgium
| | - Luc Claesen
- Computational Sensing Systems, Department of Engineering Technology, Hasselt University, Diepenbeek, Belgium
| | - Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
| | - Jan Egger
- Graz University of Technology, Institute of Computer Graphics and Vision, Graz, Austria; Department of Oral & Maxillofacial Surgery, Medical University of Graz, Graz, Austria; The Laboratory of Computer Algorithms for Medicine, Medical University of Graz, Graz, Austria
| | - Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
| |
Collapse
|
42
|
Ye W, Zhang X, Li T, Luo C, Yang L. Mixed-reality hologram for diagnosis and surgical planning of double outlet of the right ventricle: a pilot study. Clin Radiol 2020; 76:237.e1-237.e7. [PMID: 33309030 DOI: 10.1016/j.crad.2020.10.017] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/23/2020] [Accepted: 10/30/2020] [Indexed: 10/22/2022]
Abstract
AIM To evaluate the mixed-reality (MR) hologram, a novel technology based on two-dimensional images that simulates three-dimensional (3D) images and provides a dynamic and interactive alternative, for its usefulness in the diagnosis and surgical planning of double outlet of the right ventricle (DORV). MATERIALS AND METHODS Thirty-four patients who were suspected of DORV based on ultrasound findings underwent cardiac computed tomography angiography (CTA). The patients were assigned randomly to the MR holographic guidance (MRHG) group or the control group. For the patients in the MRHG group, the CTA images were converted into stereolithography (STL) files after segmentation, 3D reconstruction, colourisation, and transparentisation, and then exported for MR holographic visualisation. The CTA images of the patients in the control group were analysed using routine 3D reconstruction only. Diagnostic accuracy and surgical planning were compared between the two groups based on visualisation at surgery. RESULTS In the MRHG group, the 3D hologram observation was in concordance with the actual anatomical findings, and the DORV type was classified accurately in all patients. The diagnostic accuracy for the malformation was 95.5% in the MRHG group and 89.7% in the control group, but the difference was not significant (p=0.3). All the procedures were carried out exactly as planned based on the 3D MR holographic model. The surgical planning time was shorter for the MRHG group (51.65 ± 11.11 min) than for the control group (65.71 ± 18.07 min, p<0.05). CONCLUSION MR 3D holograms may provide a clearer and deeper anatomical perception of DORV and improve surgical planning.
Collapse
Affiliation(s)
- W Ye
- Department of Cardiac Surgery, Chinese People's Liberation Army General Hospital, No. 28, Fu Xing Road, Hai Dian District, Beijing, China
| | - X Zhang
- Department of Radiology, Chinese People's Liberation Army General Hospital, No. 28, Fu Xing Road, Hai Dian District, Beijing, China
| | - T Li
- Department of Radiology, Chinese People's Liberation Army General Hospital, No. 28, Fu Xing Road, Hai Dian District, Beijing, China.
| | - C Luo
- Department of Radiology, Chinese People's Liberation Army General Hospital, No. 28, Fu Xing Road, Hai Dian District, Beijing, China
| | - L Yang
- Department of Radiology, Chinese People's Liberation Army General Hospital, No. 28, Fu Xing Road, Hai Dian District, Beijing, China
| |
Collapse
|
43
|
Digital intelligent technology assisted three-dimensional laparoscopic extended left hepatectomy with resection of the middle hepatic vein(Video). Surg Oncol 2020; 35:426-427. [DOI: 10.1016/j.suronc.2020.09.006] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/13/2020] [Revised: 08/25/2020] [Accepted: 09/06/2020] [Indexed: 12/14/2022]
|
44
|
Brun H, Bugge RAB, Suther LKR, Birkeland S, Kumar R, Pelanis E, Elle OJ. Mixed reality holograms for heart surgery planning: first user experience in congenital heart disease. Eur Heart J Cardiovasc Imaging 2020; 20:883-888. [PMID: 30534951 DOI: 10.1093/ehjci/jey184] [Citation(s) in RCA: 59] [Impact Index Per Article: 14.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 09/08/2018] [Accepted: 11/02/2018] [Indexed: 11/12/2022] Open
Abstract
AIMS Proof-of-concept and feasibility study for the preoperative diagnostic use of mixed reality (MR) holograms of individual 3D heart models derived from standard cardiac computed tomography angiography (CTA) images. Optimal repair of complex congenital heart disease poses high demands on 3D anatomical imagination. Three-dimensionally printed heart models are increasingly used for improved morphological understanding during surgical and interventional planning. Holograms are a dynamic and interactive alternative, probably with wider applications. METHODS AND RESULTS A 3D heart model was segmented from CTA images of a patient with double outlet right ventricle and transposition of the great arteries (DORV-TGA). The hologram was visualized on the wearable MR platform HoloLens® for 36 paediatric heart team members, who filled out a diagnostic and quality rating questionnaire. Morphological and diagnostic output from the hologram was assessed and the 3D experience was evaluated. Locally developed app tools such as hologram rotation, scaling, and cutting were rated. Anatomy identification and diagnostic output were high, as was the rating of the 3D experience. Younger and female users rated the app tools higher. CONCLUSION This preliminary study demonstrates that MR holograms as a surgical planning tool for congenital heart disease may have high diagnostic value and contribute to the understanding of complex morphology. The first users' experience of the hologram presentation was very positive, with a preference among female and younger users. There is potential for improvement of the hologram manipulation tools.
Collapse
Affiliation(s)
- H Brun
- The Intervention Centre, Oslo University Hospital, Rikshospitalet, Sognsvannsvn 20, Oslo, Norway; Clinic for Pediatric Cardiology, Oslo University Hospital, Rikshospitalet, Sognsvannsvn 20, Oslo, Norway
| | - R A B Bugge
- The Intervention Centre, Oslo University Hospital, Rikshospitalet, Sognsvannsvn 20, Oslo, Norway; Department of Diagnostic Physics, Oslo University Hospital, Rikshospitalet, Sognsvannsvn 20, Oslo, Norway
| | - L K R Suther
- Department of Pediatric Radiology, Oslo University Hospital, Rikshospitalet, Sognsvannsvn 20, Oslo, Norway
| | - S Birkeland
- Department of Cardiothoracic Surgery, Oslo University Hospital, Rikshospitalet, Sognsvannsvn 20, Oslo, Norway
| | - R Kumar
- The Intervention Centre, Oslo University Hospital, Rikshospitalet, Sognsvannsvn 20, Oslo, Norway
| | - E Pelanis
- The Intervention Centre, Oslo University Hospital, Rikshospitalet, Sognsvannsvn 20, Oslo, Norway
| | - O J Elle
- The Intervention Centre, Oslo University Hospital, Rikshospitalet, Sognsvannsvn 20, Oslo, Norway; Department of Informatics, University of Oslo, Gaustadalleen 23B, Oslo, Norway
| |
Collapse
|
45
|
Shen S, Cao S, Jiang H, Liu S, Liu X, Li Z, Liu D, Zhou Y. The Short-Term Outcomes of Gastric Cancer Patients Based on a Proposal for a Novel Classification of Perigastric Arteries. J Gastrointest Surg 2020; 24:2471-2481. [PMID: 31749096 DOI: 10.1007/s11605-019-04427-2] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 06/11/2019] [Accepted: 09/30/2019] [Indexed: 02/06/2023]
Abstract
PURPOSE To establish a novel classification of the perigastric arteries by computed tomography angiography (CTA) and discuss its influence on patients' short-term clinical outcomes. METHODS Clinical data from 680 gastric cancer patients were analyzed retrospectively. The types of perigastric artery were classified according to the CTA images, and we compared the short-term clinical outcomes. RESULTS The perigastric arteries can be divided into seven categories: type I, trifurcation of the celiac trunk (CT) (294/343, 85.7%); type II, hepatosplenic trunk, with the left gastric artery (LGA) arising from the abdominal aorta (8/343, 2.3%); type III, hepatogastric trunk, with the splenic artery arising from the superior mesenteric artery (SMA) (2/343, 0.6%); type IV, celiacomesenteric trunk (5/343, 1.5%); type V, common hepatic artery (CHA) arising from the SMA, with a gastrosplenic trunk (11/343, 3.2%); type VI, aberrant (accessory or replaced) left hepatic artery arising from the LGA (21/343, 6.1%); and type VII, CHA arising from the LGA (2/343, 0.6%). The number of retrieved LNs in the CTA group was significantly higher than that in the non-CTA group, whereas the operation time, estimated blood loss, intraoperative vascular injury, and medical cost of the CTA group were significantly lower. Of note, in patients with BMI ≥ 25.0, higher LN retrieval and less vascular injury were still present in the CTA group, which is of vital importance in clinical practice. Furthermore, the CTA group had a shorter length of hospital stay (LOS). CONCLUSION We established a new perigastric artery classification. Application of this classification can improve the short-term clinical outcomes of patients.
Collapse
Affiliation(s)
- Shuai Shen
- Department of Gastrointestinal Surgery, Affiliated Hospital of Qingdao University, No. 16 Jiangsu Road, Shinan District, Qingdao City, Shandong Province, China
| | - Shougen Cao
- Department of Gastrointestinal Surgery, Affiliated Hospital of Qingdao University, No. 16 Jiangsu Road, Shinan District, Qingdao City, Shandong Province, China
| | - Haitao Jiang
- Department of Gastrointestinal Surgery, Affiliated Hospital of Qingdao University, No. 16 Jiangsu Road, Shinan District, Qingdao City, Shandong Province, China
| | - Shanglong Liu
- Department of Gastrointestinal Surgery, Affiliated Hospital of Qingdao University, No. 16 Jiangsu Road, Shinan District, Qingdao City, Shandong Province, China
| | - Xiaodong Liu
- Department of Gastrointestinal Surgery, Affiliated Hospital of Qingdao University, No. 16 Jiangsu Road, Shinan District, Qingdao City, Shandong Province, China
| | - Zequn Li
- Department of Gastrointestinal Surgery, Affiliated Hospital of Qingdao University, No. 16 Jiangsu Road, Shinan District, Qingdao City, Shandong Province, China
| | - Dan Liu
- Department of Gastrointestinal Surgery, Affiliated Hospital of Qingdao University, No. 16 Jiangsu Road, Shinan District, Qingdao City, Shandong Province, China
| | - Yanbing Zhou
- Department of Gastrointestinal Surgery, Affiliated Hospital of Qingdao University, No. 16 Jiangsu Road, Shinan District, Qingdao City, Shandong Province, China.
| |
Collapse
|
46
|
Value of the surgeon's sightline on hologram registration and targeting in mixed reality. Int J Comput Assist Radiol Surg 2020; 15:2027-2039. [PMID: 32984934 PMCID: PMC7671978 DOI: 10.1007/s11548-020-02263-3] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/28/2020] [Accepted: 09/14/2020] [Indexed: 12/12/2022]
Abstract
Purpose Mixed reality (MR) is being evaluated as a visual tool for surgical navigation. The current literature presents unclear results on intraoperative accuracy using the Microsoft HoloLens 1®. This study aims to assess the impact of the surgeon’s sightline in an inside-out, marker-based MR navigation system for open surgery. Methods Surgeons at Akershus University Hospital tested this system. A custom-made phantom was used, containing 18 wire target crosses within its inner walls. A CT scan was obtained in order to segment all wire targets into a single 3D model (hologram). An in-house software application (CTrue), developed for the Microsoft HoloLens 1, uploaded 3D models and automatically registered the 3D model with the phantom. Based on the surgeon’s sightline while registering and targeting (free sightline [F] or strictly perpendicular sightline [P]), 4 scenarios were developed (FF, PF, FP, PP). Target error distance (TED) was obtained along three working axes (X, Y, Z).
Results Six surgeons (5 males, age 29–62) were enrolled. A total of 864 measurements were collected in 4 scenarios, twice. Scenario PP showed the smallest TED along the X, Y, and Z axes (mean = 2.98 ± 1.33 mm, 2.28 ± 1.45 mm, and 2.78 ± 1.91 mm, respectively). Scenario FF showed the largest TED along the X, Y, and Z axes (mean = 10.03 ± 3.19 mm, 6.36 ± 3.36 mm, and 16.11 ± 8.91 mm, respectively). Multiple comparison tests, grouped by scenario and axis, showed that the majority of scenario comparisons had significantly different TED values (p < 0.05). The Y-axis always presented the smallest TED regardless of the scenario tested. Conclusion A strictly perpendicular working sightline in relation to the 3D model achieves the best accuracy. Shortcomings of this technology as an intraoperative visual cue can be overcome by sightline correction. Incidentally, this is the preferred working angle for open surgery.
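The per-axis error statistics reported above (mean and SD of the target error distance along X, Y, and Z) are straightforward to compute from paired target and measured positions. A minimal sketch, assuming such pairs are available (the study's own evaluation code is not public):

```python
from statistics import mean, stdev

def per_axis_ted(targets, hits):
    """For each axis (X, Y, Z), return (mean, sample SD) of the absolute
    per-axis error |target - hit| over all paired measurements."""
    stats = {}
    for i, axis in enumerate("XYZ"):
        errs = [abs(t[i] - h[i]) for t, h in zip(targets, hits)]
        stats[axis] = (mean(errs), stdev(errs) if len(errs) > 1 else 0.0)
    return stats
```

Running this once per scenario would reproduce tables of the form "mean ± SD per axis" used in the results.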
Collapse
|
47
|
Abstract
Current developments in the field of extended reality (XR) could prove useful for optimizing surgical workflows, time effectiveness, and postoperative outcomes. Although still primarily a subject of research, XR technologies are rapidly improving and approaching feasibility for broad clinical application. The main surgical fields of application of XR technologies are currently training, preoperative planning, and intraoperative assistance. For all three areas, products already exist (some clinically approved) and technical feasibility studies have been conducted. In teaching, the use of XR can already be assessed as fundamentally practical and meaningful, but it still needs to be evaluated in large multicenter studies. In preoperative planning, XR can also offer advantages, although technical limitations often impede routine use. For intraoperative use, informative evaluation studies are mostly lacking, so that a meaningful assessment is not yet possible. Furthermore, assessments of cost-effectiveness are lacking in all three areas. Despite the lack of high-quality evaluation of its practical and clinical use, XR enables demonstrable advantages in surgical workflows. New concepts for effective interaction with XR media also need to be developed. Further research progress and technical developments in the field can be expected in the future.
Collapse
Affiliation(s)
- Christoph Rüger
- Chirurgische Klinik, Campus Charité Mitte|Campus Virchow-Klinikum, Experimentelle Chirurgie, Charité - Universitätsmedizin Berlin, Augustenburger Platz 1, 13353, Berlin, Germany
| | - Simon Moosburner
- Chirurgische Klinik, Campus Charité Mitte|Campus Virchow-Klinikum, Experimentelle Chirurgie, Charité - Universitätsmedizin Berlin, Augustenburger Platz 1, 13353, Berlin, Germany
| | - Igor M Sauer
- Chirurgische Klinik, Campus Charité Mitte|Campus Virchow-Klinikum, Experimentelle Chirurgie, Charité - Universitätsmedizin Berlin, Augustenburger Platz 1, 13353, Berlin, Germany.
- Matters of Activity. Image Space Material, Berlin, Germany.
| |
Collapse
|
48
|
Abstract
OBJECTIVE To test the feasibility of image-guided Baha Attract implant surgery with mixed reality (MR) in the form of the HoloLens to visualize critical structures and facilitate precise Baha implant placement. METHODS A cadaveric case study of bilateral Baha Attract implant approaches was conducted using Star Atlas MR three-dimensional (3D) medical interaction system guidance at the Otolaryngology Department of PUMCH, Beijing, China. The accuracy of visual surface registration was determined by the target registration error (TRE) between the predefined points on the preoperative 3D holographic Baha Attract implant model and the postoperatively reconstructed 3D model. RESULTS Bilateral Baha Attract implantation was completed successfully for all four cadaveric heads using the Star Atlas MR 3D medical interaction system with the HoloLens. The preoperative 3D digital model characteristics (including bone quality and thickness and avoidance of cranial vessels, air cells, and cranial sutures) corresponded well with the 3D model of the actual implantation reconstructed postoperatively. The median TRE of our system was 2.97 mm (ranging from 1.98 to 4.58 mm) in terms of distance and 2.76 degrees (ranging from 0.59 to 6.4 degrees) in terms of angle. CONCLUSIONS Applying MR technology in the form of the HoloLens in Baha Attract implant surgery is feasible and could improve the accuracy of the surgery. The described MR system for Baha Attract implantation has the potential to improve the surgeon's confidence, as well as the surgical safety, efficiency, and precision.
Collapse
|
49
|
Mapping the intellectual structure of research on surgery with mixed reality: Bibliometric network analysis (2000-2019). J Biomed Inform 2020; 109:103516. [PMID: 32736125 DOI: 10.1016/j.jbi.2020.103516] [Citation(s) in RCA: 14] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/16/2020] [Revised: 06/16/2020] [Accepted: 07/17/2020] [Indexed: 12/27/2022]
Abstract
OBJECTIVE The purpose of this study is to survey research trends in surgery with mixed reality and to present the field's intellectual structure using bibliometric network analysis for the period 2000-2019. METHODS The analysis was implemented in four steps: (1) literature dataset acquisition from article databases (Web of Science, Scopus, PubMed, and the IEEE digital library); (2) dataset pre-processing and refinement; (3) network construction and visualization; and (4) analysis and interpretation. Descriptive analysis, bibliometric network analysis, and in-depth qualitative analysis were conducted. RESULTS The 14,591 keywords of 5897 abstracts were ultimately used to ascertain the intellectual structure of research on surgery with mixed reality. The evolution of keywords in the structure across the four periods is summarized in four aspects: (a) maintaining a predominant use for training; (b) widening the clinical application area; (c) reallocating the continuum of mixed reality; and (d) steering advanced imaging and simulation technology. CONCLUSIONS The results provide valuable insights into technology adoption and research trends of mixed reality in surgery. These findings can help clinicians gain an overview of prospective medical research on surgery using mixed reality. Hospitals can also gauge the maturity of mixed-reality technology in surgery over time; the findings therefore outline an academic landscape that can inform decisions about adopting new technologies in surgery.
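Keyword networks of the kind described above are typically built from co-occurrence counts: two keywords are linked when they appear in the same article's keyword list, and the link weight is the number of such articles. A minimal sketch of that construction step, using only the standard library (the study's own pipeline is not public):

```python
from collections import Counter
from itertools import combinations

def cooccurrence_edges(keyword_lists):
    """Count how often each unordered keyword pair appears together in the
    same article's keyword list. Returns a Counter keyed by sorted pairs,
    i.e. the weighted edge list of the co-occurrence network."""
    edges = Counter()
    for kws in keyword_lists:
        # sort the deduplicated keywords so each pair has a canonical key
        for a, b in combinations(sorted(set(kws)), 2):
            edges[(a, b)] += 1
    return edges
```

The resulting weighted edges can then be handed to a graph library for layout and community detection, which is where visualizations like those in the study come from.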
Collapse
|
50
|
Ultrasound in augmented reality: a mixed-methods evaluation of head-mounted displays in image-guided interventions. Int J Comput Assist Radiol Surg 2020; 15:1895-1905. [PMID: 32725398 PMCID: PMC8332636 DOI: 10.1007/s11548-020-02236-6] [Citation(s) in RCA: 16] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/08/2020] [Accepted: 07/14/2020] [Indexed: 11/24/2022]
Abstract
Purpose Augmented reality (AR) and head-mounted displays (HMDs) in medical practice are current research topics. A commonly proposed use case of AR-HMDs is to display data in image-guided interventions. Although technical feasibility has been thoroughly shown, the effects of AR-HMDs on interventions are not yet well researched, hampering clinical applicability. The goal of this study is therefore to better understand the benefits and limitations of this technology in ultrasound-guided interventions. Methods We used an AR-HMD system (based on the first-generation Microsoft HoloLens) which overlays live ultrasound images spatially correctly at the location of the ultrasound transducer. We chose ultrasound-guided needle placement as a representative task for image-guided interventions. To examine the effects of the AR-HMD, we used mixed methods and conducted two studies in a lab setting: (1) in a randomized crossover study, we asked participants to place needles into a training model and evaluated task duration and accuracy with the AR-HMD compared to the standard procedure without visual overlay, and (2) in a qualitative study, we analyzed the user experience with the AR-HMD using think-aloud protocols during ultrasound examinations and semi-structured interviews after the task. Results Participants (n = 20) placed needles more accurately (mean error of 7.4 mm vs. 4.9 mm, p = 0.022) but not significantly faster (mean task duration of 74.4 s vs. 66.4 s, p = 0.211) with the AR-HMD. All participants in the qualitative study (n = 6) reported limitations of and unfamiliarity with the AR-HMD, yet all but one also clearly noted benefits and/or that they would like to test the technology in practice. Conclusion We present additional, though still preliminary, evidence that AR-HMDs provide benefits in image-guided procedures. Our data also contribute insights into potential causes underlying these benefits, such as improved spatial perception. Still, more comprehensive studies are needed to ascertain benefits for clinical applications and to clarify the mechanisms underlying them. Electronic supplementary material The online version of this article (10.1007/s11548-020-02236-6) contains supplementary material, which is available to authorized users.
Collapse
|