1. Deng Z, Xiang N, Pan J. State of the Art in Immersive Interactive Technologies for Surgery Simulation: A Review and Prospective. Bioengineering (Basel) 2023;10:1346. PMID: 38135937; PMCID: PMC10740891; DOI: 10.3390/bioengineering10121346.
Abstract
Immersive technologies have thrived on a strong foundation of software and hardware, injecting vitality into medical training. This surge has witnessed numerous endeavors incorporating immersive technologies into surgery simulation for surgical skills training, with a growing number of researchers delving into this domain. Relevant experiences and patterns need to be summarized urgently to enable researchers to establish a comprehensive understanding of this field, thus promoting its continuous growth. This study provides a forward-looking perspective by reviewing the latest development of immersive interactive technologies for surgery simulation. The investigation commences from a technological standpoint, delving into the core aspects of virtual reality (VR), augmented reality (AR) and mixed reality (MR) technologies, namely, haptic rendering and tracking. Subsequently, we summarize recent work based on the categorization of minimally invasive surgery (MIS) and open surgery simulations. Finally, the study showcases the impressive performance and expansive potential of immersive technologies in surgical simulation while also discussing the current limitations. We find that the design of interaction and the choice of immersive technology in virtual surgery development should be closely related to the corresponding interactive operations in the real surgical speciality. This alignment facilitates targeted technological adaptations in the direction of greater applicability and fidelity of simulation.
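The review names haptic rendering as one of the two core technical aspects of immersive surgical simulation. Its simplest form is penalty-based rendering: when the tracked tool tip penetrates a virtual surface, a restoring force proportional to the penetration depth is sent back to the device. A minimal sketch against a virtual plane (the plane geometry and the stiffness value are illustrative assumptions, not taken from the review):

```python
import numpy as np

def penalty_force(tip, plane_point, plane_normal, k=500.0):
    """Penalty-based haptic force against a virtual plane.

    tip: (3,) tracked tool-tip position; k: stiffness (N/m).
    Returns the force to send to the haptic device (zero when not in contact).
    """
    n = np.asarray(plane_normal, float)
    n = n / np.linalg.norm(n)
    depth = np.dot(np.asarray(plane_point, float) - np.asarray(tip, float), n)
    if depth <= 0.0:               # tip is above the surface: no contact
        return np.zeros(3)
    return k * depth * n           # push the tip back along the surface normal
```

At typical haptic update rates (around 1 kHz), such a force would be recomputed on every cycle from the freshly tracked tip position.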
Affiliation(s)
- Zihan Deng
- Department of Computing, School of Advanced Technology, Xi’an Jiaotong-Liverpool University, Suzhou 215123, China
- Nan Xiang
- Department of Computing, School of Advanced Technology, Xi’an Jiaotong-Liverpool University, Suzhou 215123, China
- Junjun Pan
- State Key Laboratory of Virtual Reality Technology and Systems, Beihang University, Beijing 100191, China
2. Farshad-Amacker NA, Kubik-Huch RA, Kolling C, Leo C, Goldhahn J. Learning how to perform ultrasound-guided interventions with and without augmented reality visualization: a randomized study. Eur Radiol 2023;33:2927-2934. PMID: 36350392; PMCID: PMC10017581; DOI: 10.1007/s00330-022-09220-5.
Abstract
OBJECTIVES: Augmented reality (AR), which entails overlay of in situ images onto the anatomy, may be a promising technique for assisting image-guided interventions. The purpose of this study was to investigate and compare the learning experience and performance of untrained operators in puncture of soft tissue lesions when using AR ultrasound (AR US) compared with standard US (sUS). METHODS: Forty-four medical students (28 women, 16 men) who had completed a basic US course, but had no experience with AR US, were asked to perform US-guided biopsies with both sUS and AR US, with a randomized selection of the initial modality. The experimental setup aimed to simulate biopsies of superficial soft tissue lesions, such as breast masses in clinical practice, by use of a turkey breast containing olives. Time to puncture (in seconds) and success (yes/no) of the biopsies were documented. All participants completed questionnaires about their coordinative skills and their experience during the training. RESULTS: Despite the participants having no experience with the AR technique, time to puncture did not differ significantly between AR US and sUS (median [range]: 17.0 s [6-60] and 14.5 s [5-41], p = 0.16), nor were there any gender-related differences (p = 0.22 and p = 0.50). AR US was considered by 79.5% of the operators to be the more enjoyable means of learning and performing US-guided biopsies. Further, a more favorable learning curve was achieved using AR US. CONCLUSIONS: Students considered AR US to be the preferable and more enjoyable modality for learning how to obtain soft tissue biopsies; however, they did not perform the biopsies faster than when using sUS. KEY POINTS: • Performance of standard and augmented reality US-guided biopsies was comparable. • A more favorable learning curve was achieved using augmented reality US. • Augmented reality US was the preferred technique and was considered more enjoyable.
Affiliation(s)
- Nadja A Farshad-Amacker
- Radiology, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
- Rahel A Kubik-Huch
- Institute of Radiology, Department of Medical Services, Kantonsspital Baden, Baden, Switzerland
- Christoph Kolling
- Institute of Translational Medicine, Department of Health Sciences and Technology, Eidgenössische Technische Hochschule (ETH), Zurich, Switzerland
- Cornelia Leo
- Department of Gynaecology and Obstetrics, Kantonsspital Baden, Baden, Switzerland
- Jörg Goldhahn
- Institute of Translational Medicine, Department of Health Sciences and Technology, Eidgenössische Technische Hochschule (ETH), Zurich, Switzerland
3. Li Z, Manzionna E, Monizzi G, Mastrangelo A, Mancini ME, Andreini D, Dankelman J, De Momi E. Position-based dynamics simulator of vessel deformations for path planning in robotic endovascular catheterization. Med Eng Phys 2022;110:103920. PMID: 36564143; DOI: 10.1016/j.medengphy.2022.103920.
Abstract
A major challenge during autonomous navigation in endovascular interventions is the complexity of operating with an instrument in a deformable but constrained workspace. Simulating these deformations can provide a cost-effective training platform for path planning. The aim of this study is to develop a realistic, auto-adaptive, and visually plausible simulator to predict vessels' global deformation induced by the robotic catheter's contact and by cyclic heartbeat motion. Based on a Position-based Dynamics (PBD) approach for vessel modeling, a Particle Swarm Optimization (PSO) algorithm is employed for auto-adaptive calibration of the PBD deformation parameters and of the vessel movement due to the heartbeat. In-vitro experiments were conducted and compared with in-silico results. End-user evaluation results were reported through quantitative performance metrics and a 5-point Likert scale questionnaire. Compared with the literature, this simulator has an error of 0.23±0.13% for deformation and 0.30±0.85 mm for the aortic root displacement. In-vitro experiments show an error of 1.35±1.38 mm for deformation prediction. The end-user evaluation shows that novices are more accustomed to using joystick controllers, and that cardiologists are more satisfied with the visual authenticity. The real-time and accurate performance of the simulator makes this framework suitable for creating a dynamic environment for autonomous navigation of robotic catheters.
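The abstract names Position-based Dynamics (PBD) as the vessel-modeling approach but does not detail it. The core of any PBD simulator is iterative projection of geometric constraints onto predicted particle positions; a minimal distance-constraint sketch, independent of the paper's actual implementation and calibration:

```python
import numpy as np

def project_distance_constraint(p, w, i, j, rest, stiffness=1.0):
    """Project one distance constraint (standard PBD formulation).

    p: (n,3) particle positions, w: (n,) inverse masses.
    Moves p[i] and p[j] so that |p[i]-p[j]| approaches `rest`.
    """
    d = p[i] - p[j]
    length = np.linalg.norm(d)
    if length < 1e-9 or w[i] + w[j] == 0.0:
        return
    n = d / length
    c = length - rest                      # constraint violation C(p)
    corr = stiffness * c / (w[i] + w[j])
    p[i] -= w[i] * corr * n                # heavier particles move less
    p[j] += w[j] * corr * n

def pbd_step(p, v, w, constraints, dt=0.01, iters=10):
    """One PBD time step: predict, project constraints, update velocities."""
    p_pred = p + dt * v                    # explicit prediction (no external forces here)
    for _ in range(iters):
        for (i, j, rest) in constraints:
            project_distance_constraint(p_pred, w, i, j, rest)
    v[:] = (p_pred - p) / dt               # velocity update from corrected positions
    p[:] = p_pred
    return p, v
```

In the paper's setting, the PSO step would then tune parameters such as per-constraint stiffness so that the simulated deformation matches the in-vitro measurements.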
Affiliation(s)
- Zhen Li
- Department of Electronics, Information and Bioengineering, Politecnico di Milano, Piazza Leonardo da Vinci 32, Milan 20133, Italy; Department of Biomechanical Engineering, Delft University of Technology, Mekelweg 2, CD Delft 2628, Netherlands
- Enrico Manzionna
- Department of Electronics, Information and Bioengineering, Politecnico di Milano, Piazza Leonardo da Vinci 32, Milan 20133, Italy
- Daniele Andreini
- Centro Cardiologico Monzino, IRCCS, Milan, Italy; Department of Clinical Sciences and Community Health, University of Milan, Milan, Italy
- Jenny Dankelman
- Department of Biomechanical Engineering, Delft University of Technology, Mekelweg 2, CD Delft 2628, Netherlands
- Elena De Momi
- Department of Electronics, Information and Bioengineering, Politecnico di Milano, Piazza Leonardo da Vinci 32, Milan 20133, Italy
4. “Sport and Anatomy”: Teaching, Research, and Assistance at the University of Pisa. Sustainability 2022. DOI: 10.3390/su14138160.
Abstract
Introduction: Over the last decades, the university system has experienced huge growth and faced several challenges. Accordingly, the University of Pisa recognizes the value and opportunities deriving from research and fully supports collaboration with the world of entrepreneurship and industry, as well as local communities. Study programs, teaching methods and technologies, learning environments, quality assurance, programmed student numbers, and research results are key features of the prestige of the scientific community. Aim: In this respect, “Sport and Anatomy”, a brand that includes an academic organization at the University of Pisa, has two main goals: (i) to offer the top level in both educational and professional fields; and (ii) to optimize coordination among all these sections, thus becoming a reference point for sports management. Methods and results: Indispensable links between basic and specialist sciences were created through different Masters’ programs and schools. In addition to didactic activity, research activity, medical assistance, and rehabilitation were coordinated. Two main outcomes emerged from this experience: (i) improved stakeholder performance and (ii) optimized cooperation between the university and local communities. Conclusions: “Sport and Anatomy” plays a key role in supervising and accomplishing, in an innovative way, all three missions of the university (i.e., teaching, research, and dissemination of knowledge), thus strongly fulfilling the aims of the modern university.
5. McBain KA, Habib R, Laggis G, Quaiattini A, Ventura NM, Noel GPJC. Scoping review: The use of augmented reality in clinical anatomical education and its assessment tools. Anat Sci Educ 2022;15:765-796. PMID: 34800073; DOI: 10.1002/ase.2155.
Abstract
The purpose of this review was to identify the different augmented reality (AR) modalities used to teach anatomy to students, health professional trainees, and surgeons, and to examine the assessment tools used to evaluate the performance of various AR modalities. A scoping review of four databases was performed using variations of: (1) AR, (2) medical or anatomical teaching/education/training, and (3) anatomy or radiology or cadaver. Scientific articles were identified and screened for the inclusion and exclusion criteria as per Preferred Reporting Items for Systematic Reviews and Meta-Analyses with extension for scoping reviews guidelines. Virtual reality was an exclusion criterion. From this scoping review, data were extracted from a total of 54 articles and the following four AR modalities were identified: head-mounted display, projection, instrument and screen, and mobile device. The usability, feasibility, and acceptability of these AR modalities were evaluated using a variety of quantitative and qualitative assessment tools. Within more recent years of AR integration into anatomy education, the assessment of visuospatial ability, cognitive load, time on task, and increasing academic achievement outcomes are variables of interest, which continue to warrant more exploration. Sufficiently powered studies using validated assessment tools must be conducted to better understand the role of AR in anatomical education.
Affiliation(s)
- Kimberly A McBain
- School of Physical and Occupational Therapy, McGill University, Montreal, Quebec, Canada
- Rami Habib
- School of Medicine and Health Sciences, McGill University, Montreal, Quebec, Canada
- George Laggis
- School of Physical and Occupational Therapy, McGill University, Montreal, Quebec, Canada
- Andrea Quaiattini
- Schulich Library of Physical Sciences, Life Sciences, and Engineering, McGill University, Montreal, Quebec, Canada
- Institute of Health Sciences Education, Faculty of Medicine, McGill University, Montreal, Quebec, Canada
- Nicole M Ventura
- Institute of Health Sciences Education, Faculty of Medicine, McGill University, Montreal, Quebec, Canada
- Division of Anatomical Sciences, Department of Anatomy and Cell Biology, McGill University, Montreal, Quebec, Canada
- Geoffroy P J C Noel
- Institute of Health Sciences Education, Faculty of Medicine, McGill University, Montreal, Quebec, Canada
- Division of Anatomical Sciences, Department of Anatomy and Cell Biology, McGill University, Montreal, Quebec, Canada
- Division of Anatomy, Department of Surgery, University of California San Diego, San Diego, California, USA
6. Raith A, Kamp C, Stoiber C, Jakl A, Wagner M. Augmented Reality in Radiology for Education and Training—A Design Study. Healthcare (Basel) 2022;10:672. PMID: 35455849; PMCID: PMC9031241; DOI: 10.3390/healthcare10040672.
Abstract
Education is an important component of every healthcare system. Patients need to be educated about their planned procedures; healthcare professionals need to be trained in their respective profession. Both patient education and the training of healthcare professionals are often completed in person, which requires resources and is bound to certain times and places. Virtual educational environments can potentially save human and monetary resources, increase learner engagement, and enable users to learn according to their own schedules. This design study describes proofs of concept for two augmented reality-enabled (AR) educational tools, utilizing a Microsoft HoloLens head-mounted display. In the first use case, we demonstrate an AR application which could be used to educate cancer patients about their radiotherapy treatment and potentially reduce patient anxiety. The second use case demonstrates an AR training environment, which could complement the practical training of undergraduate radiography students. Two prototypes—VIPER, for patient education, and ARTUR for the training of radiography students—were developed and tested for viability and usability, both based on individual user tests. Both patient and student education were evaluated as viable and usable additions to conventional educational methods, despite being limited in terms of accessibility, usability, and fidelity. Suitable hardware is becoming more accessible and capable, and higher-fidelity holograms, better utilization of real-world objects, and more intuitive input methods could increase user immersion and acceptance of the technology.
Affiliation(s)
- Alexander Raith
- Department of Health Sciences—Radiologic Technology, FH Campus Wien, University of Applied Sciences, 1100 Wien, Austria
- Christoph Kamp
- Department of Health Sciences—Radiologic Technology, FH Campus Wien, University of Applied Sciences, 1100 Wien, Austria
- Christina Stoiber (corresponding author)
- Institute of Creative Media Technologies, St. Pölten University of Applied Sciences, 3100 St. Pölten, Austria
- Andreas Jakl
- Institute of Creative Media Technologies, St. Pölten University of Applied Sciences, 3100 St. Pölten, Austria
- Markus Wagner
- Institute of Creative Media Technologies, St. Pölten University of Applied Sciences, 3100 St. Pölten, Austria
7. Virtual Reality-Based Framework to Simulate Control Algorithms for Robotic Assistance and Rehabilitation Tasks through a Standing Wheelchair. Sensors (Basel) 2021;21:5083. PMID: 34372320; PMCID: PMC8348610; DOI: 10.3390/s21155083.
Abstract
The implementation of control algorithms oriented to robotic assistance and rehabilitation tasks for people with motor disabilities has been of increasing interest in recent years. However, practical implementation cannot be carried out unless one has the real robotic system available. To overcome this drawback, this article presents the development of an interactive virtual reality (VR)-based framework that allows one to simulate the execution of rehabilitation tasks and robotic assistance through a robotic standing wheelchair. The virtual environment developed considers the kinematic and dynamic model of the standing human–wheelchair system with a displaced center of mass, since the center of mass can be displaced for different reasons, e.g., bad posture, limb amputations, or obesity. The standing wheelchair autonomous control scheme has been implemented through the Full Simulation (FS) and Hardware in the Loop (HIL) techniques. Finally, the performance of the virtual control schemes has been shown by means of several experiments based on robotic assistance and rehabilitation for people with motor disabilities.
8. Evaluation of a Wearable AR Platform for Guiding Complex Craniotomies in Neurosurgery. Ann Biomed Eng 2021;49:2590-2605. PMID: 34297263; DOI: 10.1007/s10439-021-02834-8.
Abstract
Today, neuronavigation is widely used in daily clinical routine to perform safe and efficient surgery. Augmented reality (AR) interfaces can provide anatomical models and preoperative planning contextually blended with the real surgical scenario, overcoming the limitations of traditional neuronavigators. This study aims to demonstrate the reliability of a new-concept AR headset in navigating complex craniotomies. Moreover, we aim to prove the efficacy of a patient-specific template-based methodology for fast, non-invasive, and fully automatic planning-to-patient registration. The AR platform navigation performance was assessed with an in-vitro study whose goal was twofold: to measure the real-to-virtual 3D target visualization error (TVE), and to assess the navigation accuracy through a user study involving 10 subjects in tracing a complex craniotomy. The feasibility of the template-based registration was preliminarily tested on a volunteer. The TVE mean and standard deviation were 1.3 and 0.6 mm. The results of the user study, over 30 traced craniotomies, showed that 97% of the trajectory length was traced within an error margin of 1.5 mm, and 92% within a margin of 1 mm. The in-vivo test confirmed the feasibility and reliability of the patient-specific template for registration. The proposed AR headset allows ergonomic and intuitive use of the preoperative planning, and it can represent a valid option to support neurosurgical tasks.
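The headline accuracy figures reported here (97% of trajectory length within 1.5 mm, 92% within 1 mm) can be reproduced from per-segment deviations between the traced and planned craniotomy curves. A sketch of that computation (function and argument names are hypothetical, not from the study):

```python
import numpy as np

def fraction_within_margin(deviations, seg_lengths, margin):
    """Fraction of traced path length whose deviation from the planned
    craniotomy curve stays within `margin` (same units as deviations).

    deviations: per-segment distance from traced to planned curve.
    seg_lengths: length of each traced segment.
    """
    deviations = np.asarray(deviations, float)
    seg_lengths = np.asarray(seg_lengths, float)
    inside = deviations <= margin            # segments within the error margin
    return seg_lengths[inside].sum() / seg_lengths.sum()
```

Evaluating this at margins of 1.5 mm and 1 mm over all 30 traced craniotomies would yield the two percentages quoted in the abstract.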
9. Obeid MF, Kelly R, McKenzie FD. Development and Validation of a Hybrid Nuss Procedure Surgical Simulator and Trainer. IEEE Trans Biomed Eng 2021;68:2520-2528. PMID: 33382643; DOI: 10.1109/tbme.2020.3048516.
Abstract
OBJECTIVE This work presents the development and validation of an interactive simulation training platform for the minimally invasive repair of pectus excavatum, otherwise known as the Nuss procedure. METHODS The challenges and implications of developing both an all-virtual and an all-physical version of the simulator are investigated in a training context. A hybrid system is then developed that integrates virtual and physical constituents and a haptic interface to reproduce the primary steps of the procedure and to satisfy clinically relevant prerequisites for its training system. Furthermore, this work carries out a study to investigate the system's face, content, and construct validity. RESULTS Objective and subjective evaluations of the system demonstrate its utility for surgical training and establish various levels of its validity. CONCLUSION A hybrid virtual/physical configuration of the trainer can efficiently and realistically reproduce the primary steps of the procedure. SIGNIFICANCE Outside of this work, a simulation and training platform for the Nuss procedure is not available. This system was developed in close collaboration with the pioneers of this surgical technique.
10. Condino S, Cutolo F, Cattari N, Colangeli S, Parchi PD, Piazza R, Ruinato AD, Capanna R, Ferrari V. Hybrid Simulation and Planning Platform for Cryosurgery with Microsoft HoloLens. Sensors (Basel) 2021;21:4450. PMID: 34209748; PMCID: PMC8272062; DOI: 10.3390/s21134450.
Abstract
Cryosurgery is a technique of growing popularity involving tissue ablation under controlled freezing. Technological advancement of devices along with surgical technique improvements have turned cryosurgery from an experimental to an established option for treating several diseases. However, cryosurgery is still limited by inaccurate planning based primarily on 2D visualization of the patient’s preoperative images. Several works have been aimed at modelling cryoablation through heat transfer simulations; however, most software applications do not meet some key requirements for clinical routine use, such as high computational speed and user-friendliness. This work aims to develop an intuitive platform for anatomical understanding and pre-operative planning by integrating the information content of radiological images and cryoprobe specifications either in a 3D virtual environment (desktop application) or in a hybrid simulator, which exploits the potential of the 3D printing and augmented reality functionalities of Microsoft HoloLens. The proposed platform was preliminarily validated for the retrospective planning/simulation of two surgical cases. Results suggest that the platform is easy and quick to learn and could be used in clinical practice to improve anatomical understanding, to make surgical planning easier than the traditional method, and to strengthen the memorization of surgical planning.
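The heat-transfer simulations mentioned for cryoablation modelling reduce, in their simplest form, to solving the diffusion equation around the cryoprobe. A one-dimensional explicit finite-difference sketch (grid size, tissue diffusivity, and probe temperature are illustrative assumptions, not the platform's actual solver):

```python
import numpy as np

def freeze_profile(n=50, dx=1e-3, alpha=1.4e-7, t_probe=-40.0, t_body=37.0,
                   dt=0.5, steps=2000):
    """Explicit FTCS solution of 1D heat diffusion T_t = alpha * T_xx.

    The cryoprobe clamps the left boundary at t_probe (deg C); body
    temperature is held at the right boundary. Returns the temperature
    profile after `steps` time steps.
    """
    assert alpha * dt / dx**2 < 0.5          # FTCS stability condition
    T = np.full(n, t_body)
    T[0] = t_probe
    for _ in range(steps):
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
        T[0], T[-1] = t_probe, t_body        # Dirichlet boundaries
    return T
```

The assertion reflects the FTCS stability requirement α·Δt/Δx² < 1/2; clinical planning tools trade this simplicity for 3D bioheat models with phase change, which is why computational speed is highlighted as a key requirement above.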
Affiliation(s)
- Sara Condino (corresponding author)
- Information Engineering Department, University of Pisa, 56126 Pisa, Italy
- Fabrizio Cutolo
- Information Engineering Department, University of Pisa, 56126 Pisa, Italy
- Nadia Cattari
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy
- Simone Colangeli
- Orthopaedic and Traumatology Division, Department of Translational Research and of New Surgical and Medical Technologies, University of Pisa, 56124 Pisa, Italy
- Paolo Domenico Parchi
- Orthopaedic and Traumatology Division, Department of Translational Research and of New Surgical and Medical Technologies, University of Pisa, 56124 Pisa, Italy
- Roberta Piazza
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy
- Alfio Damiano Ruinato
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy
- Rodolfo Capanna
- Orthopaedic and Traumatology Division, Department of Translational Research and of New Surgical and Medical Technologies, University of Pisa, 56124 Pisa, Italy
- Vincenzo Ferrari
- Information Engineering Department, University of Pisa, 56126 Pisa, Italy
11. Augmented Reality, Mixed Reality, and Hybrid Approach in Healthcare Simulation: A Systematic Review. Applied Sciences (Basel) 2021. DOI: 10.3390/app11052338.
Abstract
Simulation-based medical training is considered an effective tool to acquire/refine technical skills, mitigating the ethical issues of Halsted’s model. This review aims at evaluating the literature on medical simulation techniques based on augmented reality (AR), mixed reality (MR), and hybrid approaches. The research identified 23 articles that meet the inclusion criteria: 43% combine two approaches (MR and hybrid), 22% combine all three, 26% employ only the hybrid approach, and 9% apply only the MR approach. Among the studies reviewed, 22% use commercial simulators, whereas 78% describe custom-made simulators. Each simulator is classified according to its target clinical application: training of surgical tasks (e.g., specific tasks for training in neurosurgery, abdominal surgery, orthopedic surgery, dental surgery, otorhinolaryngological surgery, or also generic tasks such as palpation) and education in medicine (e.g., anatomy learning). Additionally, the review assesses the complexity, reusability, and realism of the physical replicas, as well as the portability of the simulators. Finally, we describe whether and how the simulators have been validated. The review highlights that most of the studies do not have a significant sample size and that they include only a feasibility assessment and preliminary validation; thus, further research is needed to validate existing simulators and to verify whether improvements in performance on a simulated scenario translate into improved performance on real patients.
12. Hybrid Spine Simulator Prototype for X-ray Free Pedicle Screws Fixation Training. Applied Sciences (Basel) 2021. DOI: 10.3390/app11031038.
Abstract
Simulation for surgical training is increasingly being considered a valuable addition to traditional teaching methods. 3D-printed physical simulators can be used for preoperative planning and rehearsal in spine surgery to improve surgical workflows and postoperative patient outcomes. This paper proposes an innovative strategy to build a hybrid simulation platform for training in pedicle screw fixation: the proposed method combines 3D-printed patient-specific spine models with augmented reality functionalities and virtual X-ray visualization, thus avoiding any exposure to harmful radiation during the simulation. Software functionalities are implemented by using a low-cost tracking strategy based on fiducial marker detection. Quantitative tests demonstrate the accuracy of the method in tracking the vertebral model and surgical tools, and in coherently visualizing them in either the augmented reality or virtual fluoroscopic modality. The obtained results encourage further research and clinical validation towards the use of the simulator as an effective tool for training in pedicle screw insertion in lumbar vertebrae.
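The virtual fluoroscopic modality described here amounts to projecting the tracked 3D models through a pinhole model of the X-ray source. A minimal sketch of that projection (the intrinsics and pose values are illustrative, not the paper's calibration):

```python
import numpy as np

def project_points(K, R, t, pts3d):
    """Project 3D points (world frame, e.g. from fiducial-marker tracking)
    into a virtual X-ray image with intrinsics K and extrinsic pose (R, t).

    K: (3,3) intrinsic matrix; R: (3,3) rotation; t: (3,) translation.
    Returns (n,2) pixel coordinates.
    """
    cam = R @ np.asarray(pts3d, float).T + t.reshape(3, 1)  # world -> source frame
    uvw = K @ cam                                           # pinhole projection
    return (uvw[:2] / uvw[2]).T                             # normalize to pixels
```

With the pose (R, t) estimated from the detected fiducial markers, the same projection renders vertebra and tool models coherently in the virtual X-ray view.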
13. Lungu AJ, Swinkels W, Claesen L, Tu P, Egger J, Chen X. A review on the applications of virtual reality, augmented reality and mixed reality in surgical simulation: an extension to different kinds of surgery. Expert Rev Med Devices 2020;18:47-62. PMID: 33283563; DOI: 10.1080/17434440.2021.1860750.
Abstract
Background: Research proves that the apprenticeship model, which is the gold standard for training surgical residents, is obsolete. For that reason, there is a continuing effort toward the development of high-fidelity surgical simulators to replace the apprenticeship model. Applying virtual reality (VR), augmented reality (AR), and mixed reality (MR) in surgical simulators increases the fidelity, level of immersion, and overall experience of these simulators. Areas covered: The objective of this review is to provide a comprehensive overview of the application of VR, AR and MR for distinct surgical disciplines, including maxillofacial surgery and neurosurgery. The current developments in these areas, as well as potential future directions, are discussed. Expert opinion: The key components for incorporating VR into surgical simulators are visual and haptic rendering. These components ensure that the user is completely immersed in the virtual environment and can interact in the same way as in the physical world. The key components for the application of AR and MR into surgical simulators include the tracking system as well as the visual rendering. The advantages of these surgical simulators are the ability to perform user evaluations and to increase the training frequency of surgical residents.
Affiliation(s)
- Abel J Lungu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Wout Swinkels
- Computational Sensing Systems, Department of Engineering Technology, Hasselt University, Diepenbeek, Belgium
- Luc Claesen
- Computational Sensing Systems, Department of Engineering Technology, Hasselt University, Diepenbeek, Belgium
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, Graz, Austria; Department of Oral & Maxillofacial Surgery, Medical University of Graz, Graz, Austria; The Laboratory of Computer Algorithms for Medicine, Medical University of Graz, Graz, Austria
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
14. Condino S, Piazza R, Viglialoro RM, Mocellin DM, Turini G, Berchiolli RN, Micheletti F, Rossi F, Pini R, Ferrari V, Ferrari M. Novel EM Guided Endovascular Instrumentation for In Situ Endograft Fenestration. IEEE J Transl Eng Health Med 2020;8:1900208. PMID: 32219042; PMCID: PMC7082146; DOI: 10.1109/jtehm.2020.2973973.
Abstract
Objective: This work aims to provide novel endovascular instrumentation that overcomes current technical limitations of in situ endograft fenestration, including the difficulty of targeting the fenestration site under fluoroscopic control and of supplying mechanical support during endograft perforation. Technology: Novel electromagnetically trackable instruments were developed to facilitate the navigation of the fenestration device and its stabilization at the target site. In vitro trials were performed to preliminarily evaluate the proposed instrumentation for antegrade in situ fenestration of an aortic endograft, using a laser guidewire designed ad hoc and the sharp end of a commercial endovascular guidewire. Results: In situ fenestration was successfully performed in 22 trials. Two laser tools were employed, since overbending of the laser guidewire tip, due to its manufacturing, damaged the sensor in the first device used. Conclusions: Preliminary in vitro trials demonstrate the feasibility of the proposed instrumentation, which could broaden the adoption of in situ fenestration. The results should be validated in animal studies. Clinical Impact: The proposed instrumentation has the potential to expand the indications for standard endovascular aneurysm repair to cases of acute syndromes.
Affiliation(s)
- S Condino
- Information Engineering Department, University of Pisa, 56122 Pisa, Italy; EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy
- R Piazza
- Information Engineering Department, University of Pisa, 56122 Pisa, Italy; EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy
- R M Viglialoro
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy
- D M Mocellin
- Vascular Surgery Unit, Cisanello University Hospital AOUP, 56126 Pisa, Italy
- G Turini
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy; Computer Science Department, Kettering University, Flint, MI 48504, USA
- R N Berchiolli
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy; Vascular Surgery Unit, Cisanello University Hospital AOUP, 56126 Pisa, Italy
- F Micheletti
- Institute of Applied Physics "Nello Carrara," National Research Council, 50019 Sesto Fiorentino, Italy
- F Rossi
- Institute of Applied Physics "Nello Carrara," National Research Council, 50019 Sesto Fiorentino, Italy
- R Pini
- Institute of Applied Physics "Nello Carrara," National Research Council, 50019 Sesto Fiorentino, Italy
- V Ferrari
- Information Engineering Department, University of Pisa, 56122 Pisa, Italy; EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy
- M Ferrari
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy; Vascular Surgery Unit, Cisanello University Hospital AOUP, 56126 Pisa, Italy
15
Lee S, Shim S, Ha HG, Lee H, Hong J. Simultaneous Optimization of Patient-Image Registration and Hand-Eye Calibration for Accurate Augmented Reality in Surgery. IEEE Trans Biomed Eng 2020; 67:2669-2682. [PMID: 31976878 DOI: 10.1109/tbme.2020.2967802] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
OBJECTIVE Augmented reality (AR) navigation using a position sensor in endoscopic surgeries relies on the quality of patient-image registration and hand-eye calibration. Conventional methods collect the necessary data and compute the two output transformation matrices separately. However, the AR display setting during surgery generally differs from that during preoperative processes. Although conventional methods can identify optimal solutions under initial conditions, AR display errors are unavoidable during surgery owing to the inherent computational complexity of AR processes, such as error accumulation over successive matrix multiplications, and tracking errors of the position sensor. METHODS We propose the simultaneous optimization of patient-image registration and hand-eye calibration in an AR environment before surgery. The relationship between the endoscope and a virtual object to overlay is first calculated using an endoscopic image, which also functions as a reference during optimization. After including the tracking information from the position sensor, patient-image registration and hand-eye calibration are optimized jointly in a least-squares sense. RESULTS Experiments with synthetic data verify that the proposed method is less sensitive to computation and tracking errors. A phantom experiment with a position sensor is also conducted. The accuracy of the proposed method is significantly higher than that of the conventional method. CONCLUSION The AR accuracy of the proposed method is compared with those of conventional methods, and the superiority of the proposed method is verified. SIGNIFICANCE This study demonstrates that the proposed method exhibits substantial potential for improving AR navigation accuracy.
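The joint refinement this abstract describes can be illustrated as a least-squares fit over both unknown transforms at once. The sketch below is an assumption-laden toy, not the authors' exact formulation: it models the chain q ≈ X · A_i · Y · p, where X is the hand-eye transform, Y the patient-image registration, A_i a tracker-reported pose, p a fiducial in image space and q its observed camera-frame position; all names and the synthetic-data setup are illustrative.

```python
# Illustrative sketch (not the paper's exact method): jointly refine
# hand-eye calibration X and patient-image registration Y by minimizing
# 3D residuals  X @ A_i @ Y @ p_i - q_i, starting from the rough
# estimates a conventional separate-calibration pipeline would supply.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as Rot

def to_mat(rvec, t):
    """Build a 4x4 rigid transform from a rotation vector and translation."""
    T = np.eye(4)
    T[:3, :3] = Rot.from_rotvec(rvec).as_matrix()
    T[:3, 3] = t
    return T

def residuals(params, A_list, p_img, q_cam):
    # params packs X (hand-eye) and Y (patient-image), each as rvec + t.
    X = to_mat(params[0:3], params[3:6])
    Y = to_mat(params[6:9], params[9:12])
    res = [(X @ A @ Y @ np.append(p, 1.0))[:3] - q
           for A, p, q in zip(A_list, p_img, q_cam)]
    return np.concatenate(res)

# Synthetic experiment: known ground truth, noisy camera-frame observations.
rng = np.random.default_rng(0)
X_true = to_mat(rng.normal(size=3) * 0.2, rng.normal(size=3) * 10.0)
Y_true = to_mat(rng.normal(size=3) * 0.2, rng.normal(size=3) * 10.0)
A_list = [to_mat(rng.normal(size=3) * 0.5, rng.normal(size=3) * 50.0)
          for _ in range(20)]                      # tracker poses
p_img = rng.normal(size=(20, 3)) * 30.0            # fiducials, image space
q_cam = [(X_true @ A @ Y_true @ np.append(p, 1.0))[:3]
         + rng.normal(size=3) * 0.1                # ~0.1 mm sensor noise
         for A, p in zip(A_list, p_img)]

def pack(X, Y):
    return np.concatenate([Rot.from_matrix(X[:3, :3]).as_rotvec(), X[:3, 3],
                           Rot.from_matrix(Y[:3, :3]).as_rotvec(), Y[:3, 3]])

# Start from perturbed estimates, mimicking separately calibrated inputs.
x0 = pack(X_true, Y_true) + rng.normal(size=12) * np.r_[[0.05] * 3, [1.0] * 3,
                                                        [0.05] * 3, [1.0] * 3]
sol = least_squares(residuals, x0, args=(A_list, p_img, q_cam))
```

On this synthetic setup the joint fit should pull the residual RMS from the perturbed starting point down toward the injected noise level; with real data the reference observations would come from an endoscopic image rather than from simulation.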
16
Abstract
Augmented reality (AR) technology is gaining popularity and scholarly interest in the rehabilitation sector because of the possibility of generating controlled, user-specific environmental and perceptual stimuli that motivate the patient, while preserving the possibility to interact with the real environment and other subjects, including the rehabilitation specialist. The paper presents the first wearable AR application for shoulder rehabilitation, based on Microsoft HoloLens, with real-time markerless tracking of the user's hand. The potential and current limits of commercial head-mounted displays (HMDs) are described for the target medical field, and details of the proposed application are reported. A serious game was designed starting from the analysis of a traditional rehabilitation exercise, taking into account HoloLens specifications to maximize user comfort during the AR rehabilitation session. The implemented AR application consistently meets the recommended target frame rate for immersive applications on the HoloLens device (60 fps). Moreover, the ergonomics and motivational value of the proposed application were positively evaluated by a group of five rehabilitation specialists and 20 healthy subjects. Although a larger study including real patients is necessary for clinical validation of the proposed application, the results obtained encourage further investigation and the integration of additional technical features.
17
Uppot RN, Laguna B, McCarthy CJ, De Novi G, Phelps A, Siegel E, Courtier J. Implementing Virtual and Augmented Reality Tools for Radiology Education and Training, Communication, and Clinical Care. Radiology 2019; 291:570-580. [PMID: 30990383 DOI: 10.1148/radiol.2019182210] [Citation(s) in RCA: 68] [Impact Index Per Article: 13.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
Abstract
Advances in virtual immersive and augmented reality technology, commercially available for the entertainment and gaming industry, hold potential for education and clinical use in medicine and the field of medical imaging. Radiology departments have begun exploring the use of these technologies to help with radiology education and clinical care. The purpose of this review article is to summarize how three institutions have explored using virtual and augmented reality for radiology.
Affiliation(s)
- Raul N Uppot, Benjamin Laguna, Colin J McCarthy, Gianluca De Novi, Andrew Phelps, Eliot Siegel, Jesse Courtier
- From the Department of Radiology, Division of Interventional Radiology, Massachusetts General Hospital, 55 Fruit St, Gray 290, Boston, MA 02114 (R.N.U., C.J.M., G.D.N.); Department of Radiology and Biomedical Imaging, University of California San Francisco Medical Center, San Francisco, Calif (B.L., A.P., J.C.); and Department of Radiology, University of Maryland Medical Center, Baltimore, Md (E.S.)