1. Neri I, Cercenelli L, Marcuccio M, Lodi S, Koufi FD, Fazio A, Marvi MV, Marcelli E, Billi AM, Ruggeri A, Tarsitano A, Manzoli L, Badiali G, Ratti S. Dissecting human anatomy learning process through anatomical education with augmented reality: AEducAR 2.0, an updated interdisciplinary study. Anatomical Sciences Education 2024; 17:693-711. [PMID: 38520153] [DOI: 10.1002/ase.2389]
Abstract
Anatomical education is pivotal for medical students, and innovative technologies like augmented reality (AR) are transforming the field. This study aimed to enhance the interactive features of the AEducAR prototype, an AR tool developed by the University of Bologna, and to explore its impact on the human anatomy learning process in 130 second-year medical students at the International School of Medicine and Surgery of the University of Bologna. An interdisciplinary team of anatomists, maxillofacial surgeons, biomedical engineers, and educational scientists collaborated to ensure a comprehensive understanding of the study's objectives. Students used the updated version of AEducAR, named AEducAR 2.0, to study three anatomical topics: the orbit zone, facial bones, and mimic muscles. AEducAR 2.0 offered two learning activities, one explorative and one interactive. Following each activity, students took a test to assess learning outcomes. Students also completed an anonymous questionnaire to provide background information and offer their perceptions of the activity. Additionally, 10 students participated in interviews for further insights. The results demonstrated that AEducAR 2.0 effectively facilitated learning and student engagement. Students achieved high scores on both quizzes and reported appreciating the newly implemented interactive features. Moreover, the interviews shed light on the interesting topic of blended learning. In particular, the present study suggests that incorporating AR into medical education alongside traditional methods might prove advantageous for students' academic and future professional endeavors. In this light, this study contributes to the growing research emphasizing the potential role of AR in shaping the future of medical education.
Affiliation(s)
- Irene Neri
- Cellular Signalling Laboratory, Anatomy Center, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Laura Cercenelli
- eDIMES Lab-Laboratory of Bioengineering, Department of Medical and Surgical Sciences (DIMEC), University of Bologna, Bologna, Italy
- Massimo Marcuccio
- Department of Educational Science "Giovanni Maria Bertin", University of Bologna, Bologna, Italy
- Simone Lodi
- Cellular Signalling Laboratory, Anatomy Center, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Foteini-Dionysia Koufi
- Cellular Signalling Laboratory, Anatomy Center, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Antonietta Fazio
- Cellular Signalling Laboratory, Anatomy Center, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Maria Vittoria Marvi
- Cellular Signalling Laboratory, Anatomy Center, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Emanuela Marcelli
- eDIMES Lab-Laboratory of Bioengineering, Department of Medical and Surgical Sciences (DIMEC), University of Bologna, Bologna, Italy
- Anna Maria Billi
- Cellular Signalling Laboratory, Anatomy Center, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Alessandra Ruggeri
- Cellular Signalling Laboratory, Anatomy Center, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Achille Tarsitano
- Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Department of Maxillo-Facial Surgery, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Bologna, Italy
- Lucia Manzoli
- Cellular Signalling Laboratory, Anatomy Center, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Giovanni Badiali
- Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Department of Maxillo-Facial Surgery, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Bologna, Italy
- Stefano Ratti
- Cellular Signalling Laboratory, Anatomy Center, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
2. Connolly M, Iohom G, O'Brien N, Volz J, O'Muircheartaigh A, Serchan P, Biculescu A, Gadre KG, Soare C, Griseto L, Shorten G. Delivering clinical tutorials to medical students using the Microsoft HoloLens 2: A mixed-methods evaluation. BMC Medical Education 2024; 24:498. [PMID: 38704522] [PMCID: PMC11070104] [DOI: 10.1186/s12909-024-05475-2]
Abstract
BACKGROUND Mixed reality offers potential educational advantages in the delivery of clinical teaching. Holographic artefacts can be rendered within a shared learning environment using devices such as the Microsoft HoloLens 2. In addition to facilitating remote access to clinical events, mixed reality may provide a means of sharing mental models, including the vertical and horizontal integration of curricular elements at the bedside. This study aimed to evaluate the feasibility of delivering clinical tutorials using the Microsoft HoloLens 2 and the learning efficacy achieved. METHODS Following receipt of institutional ethical approval, tutorials on preoperative anaesthetic history taking and upper airway examination were facilitated by a tutor who wore the HoloLens device. The tutor interacted face to face with a patient, and two-way audio-visual interaction was facilitated using the HoloLens 2 and Microsoft Teams with groups of students located in a separate tutorial room. Holographic functions were employed by the tutor. The tutor completed the System Usability Scale; the tutor, technical facilitator, patients, and students provided quantitative and qualitative feedback; and three students participated in semi-structured feedback interviews. Students completed pre- and post-tutorial, and end-of-year examinations on the tutorial topics. RESULTS Twelve patients and 78 students participated across 12 separate tutorials. Five students did not complete the examinations and were excluded from efficacy calculations. Student feedback contained 90 positive comments, including on the technology's ability to broadcast the tutor's point of view, and 62 negative comments, where students noted issues with the audio-visual quality and concerns that the tutorial was not as beneficial as traditional in-person clinical tutorials. The technology and tutorial structure were viewed favourably by the tutor, facilitator and patients. A significant improvement was observed between students' pre- and post-tutorial MCQ scores (mean 59.2% vs. 84.7%, p < 0.001). CONCLUSIONS This study demonstrates the feasibility of using the HoloLens 2 to facilitate remote bedside tutorials which incorporate holographic learning artefacts. Students' examination performance supports substantial learning of the tutorial topics. The tutorial structure was agreeable to students, patients and tutor. Our results support the feasibility of offering effective clinical teaching and learning opportunities using the HoloLens 2. However, the technical limitations and costs of the device are significant, and further research is required to assess the effectiveness of this tutorial format against in-person tutorials before wider rollout of this technology can be recommended.
Affiliation(s)
- Murray Connolly
- Cork University Hospital and University College Cork, Cork, Ireland
- Gabriella Iohom
- Cork University Hospital and University College Cork, Cork, Ireland
- Corina Soare
- Cork University Hospital and University College Cork, Cork, Ireland
- George Shorten
- Cork University Hospital and University College Cork, Cork, Ireland
3. Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023; 85:102757. [PMID: 36706637] [DOI: 10.1016/j.media.2023.102757]
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main player in the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016 until the end of 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy covering use case, technical methodology for registration and tracking, data sources, visualization, and validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, where AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interaction, usability and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, yet only a few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, to pave the way for novel, innovative directions and translation into routine medical practice.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
4. Aydin SO, Barut O, Yilmaz MO, Sahin B, Akyoldas G, Akgun MY, Baran O, Tanriover N. Use of 3-Dimensional Modeling and Augmented/Virtual Reality Applications in Microsurgical Neuroanatomy Training. Oper Neurosurg (Hagerstown) 2023; 24:318-323. [PMID: 36701556] [DOI: 10.1227/ons.0000000000000524]
Abstract
BACKGROUND Understanding the microsurgical neuroanatomy of the brain is challenging yet crucial for safe and effective surgery. Training on human cadavers provides an opportunity to practice approaches and learn about the brain's complex organization from a surgical view. Innovations in visual technology, such as virtual reality (VR) and augmented reality (AR), have added a new dimension to neuroanatomy education. In this regard, a 3-dimensional (3D) model and AR/VR application may facilitate the understanding of the microsurgical neuroanatomy of the brain and improve spatial recognition during neurosurgical procedures by generating a better comprehension of interrelated neuroanatomic structures. OBJECTIVE To investigate the results of 3D volumetric modeling and AR/VR applications in showing the brain's complex organization during fiber dissection. METHODS Fiber dissection was applied to the specimen, and the 3D model was created with a new photogrammetry method. After photogrammetry, the 3D model was edited using 3D editing programs and viewed in AR. The 3D model was also viewed in VR using a head-mounted display device. RESULTS The 3D model was viewed on internet-based sites and AR/VR platforms at high resolution. The fibers could be panned, rotated, and moved freely on different planes and viewed from different angles on AR and VR platforms. CONCLUSION This study demonstrated that fiber dissections can be transformed and viewed digitally on AR/VR platforms. These models can be considered a powerful teaching tool for improving the surgical spatial recognition of interrelated neuroanatomic structures. Neurosurgeons worldwide can easily access these models on digital platforms.
Affiliation(s)
- Serdar Onur Aydin
- Microsurgical Neuroanatomy Laboratory, Department of Neurosurgery, Koc University Hospital, Istanbul, Turkey
- Ozan Barut
- Microsurgical Neuroanatomy Laboratory, Department of Neurosurgery, Cerrahpasa Medical Faculty, Istanbul University-Cerrahpasa, Istanbul, Turkey
- Mehmet Ozgur Yilmaz
- Microsurgical Neuroanatomy Laboratory, Department of Neurosurgery, Koc University Hospital, Istanbul, Turkey
- Balkan Sahin
- Microsurgical Neuroanatomy Laboratory, Department of Neurosurgery, Koc University Hospital, Istanbul, Turkey
- Goktug Akyoldas
- Department of Neurosurgery, Koc University Hospital, Istanbul, Turkey
- Oguz Baran
- Microsurgical Neuroanatomy Laboratory, Department of Neurosurgery, Koc University Hospital, Istanbul, Turkey
- Department of Neurosurgery, Koc University Hospital, Istanbul, Turkey
- Necmettin Tanriover
- Microsurgical Neuroanatomy Laboratory, Department of Neurosurgery, Cerrahpasa Medical Faculty, Istanbul University-Cerrahpasa, Istanbul, Turkey
- Department of Neurosurgery, Cerrahpasa Medical Faculty, Istanbul University-Cerrahpasa, Istanbul, Turkey
5. Curran VR, Xu X, Aydin MY, Meruvia-Pastor O. Use of Extended Reality in Medical Education: An Integrative Review. Medical Science Educator 2023; 33:275-286. [PMID: 36569366] [PMCID: PMC9761044] [DOI: 10.1007/s40670-022-01698-4]
Abstract
UNLABELLED Extended reality (XR) has emerged as an innovative simulation-based learning modality. An integrative review was undertaken to explore the nature of the evidence, usage, and effectiveness of XR modalities in medical education. One hundred and thirty-three (N = 133) studies and articles were reviewed. XR technologies are commonly reported in surgical and anatomical education, and the evidence suggests XR may be as effective as traditional medical education teaching methods and, potentially, a more cost-effective means of curriculum delivery. Further research comparing different variations of XR technologies and their best applications in medical education and training is required to advance the field. SUPPLEMENTARY INFORMATION The online version contains supplementary material available at 10.1007/s40670-022-01698-4.
Affiliation(s)
- Vernon R. Curran
- Office of Professional and Educational Development, Faculty of Medicine, Health Sciences Centre, Memorial University of Newfoundland, Room H2982, St. John’s, NL A1B 3V6 Canada
- Xiaolin Xu
- Faculty of Health Sciences, Queen’s University, Kingston, ON Canada
- Mustafa Yalin Aydin
- Department of Computer Sciences, Memorial University of Newfoundland, St. John’s, NL Canada
- Oscar Meruvia-Pastor
- Department of Computer Sciences, Memorial University of Newfoundland, St. John’s, NL Canada
6. Richards S. Student Engagement Using HoloLens Mixed-Reality Technology in Human Anatomy Laboratories for Osteopathic Medical Students: an Instructional Model. Medical Science Educator 2023; 33:223-231. [PMID: 36691419] [PMCID: PMC9850333] [DOI: 10.1007/s40670-023-01728-9]
Abstract
Mixed-reality technology is a powerful tool used in healthcare and medical education to engage students in life-like scenarios. This blend of virtual and augmented reality incorporates virtual projections into the real environment to allow real-time observation and interaction [1]. While this immersive technology offers advantages over cadaver dissections, it creates new challenges to keeping students engaged [2, 3]. Student engagement improves students' commitment to learning, critical thinking, and motivation, and results in successful course outcomes [4, 5]. This paper provides an activity model using the HoloLens mixed-reality technology to deliver human gross anatomy laboratory sessions to first-year osteopathic medical students. The activity was designed using Gagne's model for instructional design and team-based learning to create an active learning model, which targets the behavioral, emotional, and cognitive dimensions of student engagement [6, 7]: behavioral engagement through autonomy and time on task, emotional engagement through guided exploration and a narrative flow accompanying students' visual experience, and cognitive engagement by incorporating team-based learning (TBL) and case-based learning (CBL). The instructional model also answers the call for a new type of virtual reality instructor and pedagogical strategy that addresses the unique challenges of this new technology and increases student engagement with it. The effectiveness of this classroom activity was assessed by observing students for indicators or behaviors of student engagement, which are discussed. Further studies are required to measure the extent to which these indicators were exhibited and to compare student engagement in this mixed-reality model with that in didactic cadaver-based laboratory sessions.
Affiliation(s)
- Sherese Richards
- California Health Sciences University, Department of Biomedical Education - Anatomy, Clovis, CA 93611 USA
7. Grad P, Przeklasa-Bierowiec AM, Malinowski KP, Witowski J, Proniewska K, Tatoń G. Application of HoloLens-based augmented reality and three-dimensional printed anatomical tooth reference models in dental education. Anatomical Sciences Education 2022. [PMID: 36524288] [DOI: 10.1002/ase.2241]
Abstract
Tooth anatomy is fundamental knowledge used in everyday dental practice to reconstruct the occlusal surface during cavity fillings. The main objective of this project was to evaluate the suitability of two types of anatomical tooth reference models used to support reconstruction of the occlusal anatomy of the teeth: (1) a three-dimensional (3D)-printed model and (2) a model displayed in augmented reality (AR) using the Microsoft HoloLens. The secondary objective was to evaluate three aspects impacting the outcome: clinical experience, comfort of work, and other variables. The tertiary objective was to evaluate the usefulness of AR in dental education. Anatomical models of the crowns of three different molars were made using cone beam computed tomography image segmentation, printed with a stereolithographic 3D printer, and then displayed in the HoloLens. Each participant reconstructed the occlusal anatomy of three teeth: one without any reference materials and two with an anatomical reference model, either 3D-printed or holographic. The reconstruction work was followed by the completion of an evaluation questionnaire. The maximum Hausdorff distances (Hmax) between the superimposed images of the specimens after the procedures and the anatomical models were then calculated. The results showed that the most accurate but slowest reconstruction was achieved with the use of 3D-printed reference models and that the results were not affected by the other aspects considered. For this method, the Hmax was observed to be 630 μm (p = 0.004). It was concluded that while AR models can be helpful in dental anatomy education, they are not suitable replacements for physical models.
Affiliation(s)
- Piotr Grad
- Department of Integrated Dentistry, Institute of Dentistry, Faculty of Medicine, Jagiellonian University Medical College, Kraków, Poland
- Anna M Przeklasa-Bierowiec
- Department of Integrated Dentistry, Institute of Dentistry, Faculty of Medicine, Jagiellonian University Medical College, Kraków, Poland
- Krzysztof P Malinowski
- Department of Bioinformatics and Telemedicine, Faculty of Medicine, Jagiellonian University Medical College, Kraków, Poland
- Jan Witowski
- Department of Radiology, New York University Grossman School of Medicine, New York, New York, USA
- Klaudia Proniewska
- Department of Bioinformatics and Telemedicine, Faculty of Medicine, Jagiellonian University Medical College, Kraków, Poland
- Grzegorz Tatoń
- Department of Biophysics, Chair of Physiology, Faculty of Medicine, Jagiellonian University Medical College, Kraków, Poland
8. Minty I, Lawson J, Guha P, Luo X, Malik R, Cerneviciute R, Kinross J, Martin G. The use of mixed reality technology for the objective assessment of clinical skills: a validation study. BMC Medical Education 2022; 22:639. [PMID: 35999532] [PMCID: PMC9395785] [DOI: 10.1186/s12909-022-03701-3]
Abstract
BACKGROUND Mixed reality technology may provide many advantages over traditional teaching methods. Despite its potential, the technology has yet to be used for the formal assessment of clinical competency. This study sought to collect validity evidence and assess the feasibility of using the HoloLens 2 mixed reality headset for the conduct and augmentation of Objective Structured Clinical Examinations (OSCEs). METHODS A prospective cohort study was conducted to compare the assessment of undergraduate medical students undertaking OSCEs via HoloLens 2 live (HLL) and recorded (HLR) methods and the gold-standard in-person (IP) method. An augmented mixed reality scenario was also assessed. RESULTS Thirteen undergraduate participants completed a total of 65 OSCE stations. Overall inter-modality correlation was 0.81 (p = 0.01), 0.98 (p = 0.01) and 0.82 (p = 0.01) for IP vs. HLL, HLL vs. HLR and IP vs. HLR, respectively. Skill-based correlations for IP vs. HLR were assessed for history taking (0.82, p = 0.01), clinical examination (0.81, p = 0.01), procedural skills (0.88, p = 0.01) and clinical skills (0.92, p = 0.01), and for assessment of a virtual mixed reality patient (0.74, p = 0.01). The HoloLens device was deemed usable and practical (System Usability Scale (SUS) score = 51.5), and the technology was thought to deliver greater flexibility and convenience and to have the potential to expand and enhance assessment opportunities. CONCLUSIONS The HoloLens 2 is comparable to traditional in-person examination of undergraduate medical students for both live and recorded assessments, and is therefore a valid and robust method for objectively assessing performance. The technology is in its infancy, and users need to develop confidence in its usability and reliability as an assessment tool. However, the potential to integrate additional functionality, including holographic content, automated tracking and data analysis, and to facilitate remote assessment may allow the technology to enhance, expand and standardise examinations across a range of educational contexts.
Affiliation(s)
- Iona Minty
- Department of Surgery and Cancer, Imperial College London, St Mary's Hospital, 10th Floor QEQM Building, London, W2 1NY, UK
- Jason Lawson
- Department of Surgery and Cancer, Imperial College London, St Mary's Hospital, 10th Floor QEQM Building, London, W2 1NY, UK
- Payal Guha
- Department of Surgery and Cancer, Imperial College London, St Mary's Hospital, 10th Floor QEQM Building, London, W2 1NY, UK
- Xun Luo
- Department of Surgery and Cancer, Imperial College London, St Mary's Hospital, 10th Floor QEQM Building, London, W2 1NY, UK
- Rukhnoor Malik
- Department of Surgery and Cancer, Imperial College London, St Mary's Hospital, 10th Floor QEQM Building, London, W2 1NY, UK
- Raminta Cerneviciute
- Department of Surgery and Cancer, Imperial College London, St Mary's Hospital, 10th Floor QEQM Building, London, W2 1NY, UK
- James Kinross
- Department of Surgery and Cancer, Imperial College London, St Mary's Hospital, 10th Floor QEQM Building, London, W2 1NY, UK
- Guy Martin
- Department of Surgery and Cancer, Imperial College London, St Mary's Hospital, 10th Floor QEQM Building, London, W2 1NY, UK
9. Santos VA, Barreira MP, Saad KR. Technological resources for teaching and learning about human anatomy in the medical course: Systematic review of literature. Anatomical Sciences Education 2022; 15:403-419. [PMID: 34664384] [DOI: 10.1002/ase.2142]
Abstract
The consolidation of technology as an alternative strategy to cadaveric dissection for teaching anatomy in medical courses was accelerated by the recent Covid-19 pandemic, which created the need for social distancing policies and the closure of laboratories and classrooms. Consequently, new technologies were created, and those already developed began to be explored more fully. However, information about many of these instruments and resources is not available to anatomy teachers. This systematic review presents the technological means for teaching and learning about human anatomy developed and applied in medical courses over the last ten years, as well as the infrastructure necessary to use them. Studies in English, Portuguese, and Spanish were searched in the MEDLINE, Scopus, ERIC, LILACS, and SciELO databases, initially yielding 875 identified articles, of which 102 were included in the analysis. They were classified according to the type of technology used: three-dimensional (3D) printing (n = 22), extended reality (n = 49), digital tools (n = 23), and other technological resources (n = 8). A detailed description of the technologies was provided, including the stage of the medical curriculum in which each was applied, the infrastructure utilized, and the contents covered. The analysis shows that, among all technologies, those related to the internet and 3D printing are the most applicable, both for student learning and for the financial cost necessary for structural implementation.
Affiliation(s)
- Vinícius A Santos
- School of Medicine, Universidade Federal do Vale do São Francisco, Petrolina, Brazil
- Matheus P Barreira
- School of Medicine, Universidade Federal do Vale do São Francisco, Petrolina, Brazil
- Karen R Saad
- Department of Morphology, School of Medicine, Universidade Federal do Vale do São Francisco, Petrolina, Brazil
10. Iqbal H, Tatti F, Rodriguez Y Baena F. Augmented reality in robotic assisted orthopaedic surgery: A pilot study. J Biomed Inform 2021; 120:103841. [PMID: 34146717] [DOI: 10.1016/j.jbi.2021.103841]
Abstract
BACKGROUND The research and development of augmented reality (AR) technologies in surgical applications has seen an evolution of the traditional user interfaces (UI) utilised by clinicians when conducting robot-assisted orthopaedic surgeries. The typical UI for such systems relies on surgeons managing 3D medical imaging data in the 2D space of a touchscreen monitor located away from the operating site. Conversely, AR can provide a composite view overlaying the real surgical scene with co-located virtual holographic representations of medical data, leading to a more immersive and intuitive operator experience. MATERIALS AND METHODS This work explores the integration of AR within an orthopaedic setting by capturing and replicating the UI of an existing surgical robot within an AR head-mounted display worn by the clinician. The resulting mixed-reality workflow enabled users to simultaneously view the operating site and real-time holographic operating informatics when carrying out a robot-assisted patellofemoral arthroplasty (PFA). Ten surgeons were recruited to test the impact of the AR system on procedure completion time and operating surface roughness. RESULTS AND DISCUSSION The integration of AR did not appear to require subjects to significantly alter their surgical techniques, as demonstrated by non-significant changes to the study's clinical metrics: a non-significant mean increase in operating time (+0.778 s, p = 0.488) and a non-significant change in mean surface roughness (p = 0.274). Additionally, a post-operative survey indicated a positive consensus on the usability of the AR system without incurring noticeable physical distress such as eyestrain or fatigue. CONCLUSIONS Overall, these results demonstrate a successful integration of AR technologies within the framework of an existing robot-assisted surgical platform, with no significant negative effects on two quantitative metrics of surgical performance and a positive outcome on user-centric and ergonomic evaluation criteria.
Affiliation(s)
- Hisham Iqbal
- Mechatronics in Medicine Laboratory, Imperial College London, London, UK
- Fabio Tatti
- Mechatronics in Medicine Laboratory, Imperial College London, London, UK