1. A critical outlook at augmented reality and its adoption in education. Computers and Education Open 2022. DOI: 10.1016/j.caeo.2022.100103
2. Bindschadler M, Buddhe S, Ferguson MR, Jones T, Friedman SD, Otto RK. HEARTBEAT4D: An Open-source Toolbox for Turning 4D Cardiac CT into VR/AR. J Digit Imaging 2022; 35:1759-1767. PMID: 35614275; PMCID: PMC9712868; DOI: 10.1007/s10278-022-00659-y
Abstract
Four-dimensional data sets are increasingly common in MRI and CT. While clinical visualization often focuses on individual temporal phases capturing the tissue(s) of interest, additional insight may be gained by exploring animated 3D reconstructions of physiological motion made possible by augmented or virtual reality representations of 4D patient imaging. Cardiac CT acquisitions can provide sufficient spatial resolution and temporal data to support advanced visualization; however, no open-source tools are readily available to facilitate the transformation from raw medical images to dynamic and interactive augmented or virtual reality representations. To address this gap, we developed a workflow using free and open-source tools to process 4D cardiac CT imaging, starting from raw DICOM data and ending with dynamic AR representations viewable on a phone, tablet, or computer. In addition to assembling the workflow from existing platforms (3D Slicer and Unity), we contribute two new features: (1) custom software that propagates a segmentation created for one cardiac phase to all others and exports surface files in a fully automated fashion, and (2) a user interface and linked code for the animation and interactive review of the surfaces in augmented reality. Validation of the surface-based areas demonstrated excellent correlation with radiologists' image-based areas (R > 0.99). While our tools were developed specifically for 4D cardiac CT, the open framework allows them to serve as a blueprint for similar applications applied to 4D imaging of other tissues and modalities. We anticipate this and related workflows will be useful both clinically and for educational purposes.
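The surface-based measurements validated above are computed from the exported triangle meshes. As a minimal sketch (not the authors' code, and with illustrative names only), the surface area of such a mesh can be obtained by summing triangle areas via the cross product:

```python
# Hypothetical illustration: total surface area of a triangulated mesh,
# the kind of quantity compared against radiologists' image-based
# measurements. Vertices are (x, y, z) tuples; faces index into them.

def triangle_area(p0, p1, p2):
    # Area = 0.5 * |(p1 - p0) x (p2 - p0)|
    u = [p1[i] - p0[i] for i in range(3)]
    v = [p2[i] - p0[i] for i in range(3)]
    cx = u[1] * v[2] - u[2] * v[1]
    cy = u[2] * v[0] - u[0] * v[2]
    cz = u[0] * v[1] - u[1] * v[0]
    return 0.5 * (cx * cx + cy * cy + cz * cz) ** 0.5

def mesh_surface_area(vertices, faces):
    # Sum the areas of all triangular faces.
    return sum(triangle_area(vertices[a], vertices[b], vertices[c])
               for a, b, c in faces)

# Single right triangle in the xy-plane with legs of length 1: area 0.5.
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
faces = [(0, 1, 2)]
print(mesh_surface_area(verts, faces))  # 0.5
```

Running the same computation on the surface exported for each cardiac phase would yield a per-phase area curve suitable for comparison with image-based measurements.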
Affiliation(s)
- M Bindschadler
- Department of Neurology, Seattle, WA, USA
- Department of Radiology, Seattle Children's, Seattle, WA, USA
- S Buddhe
- Department of Pediatrics, Seattle Children's Heart Center and the University of Washington, Seattle, WA, USA
- M R Ferguson
- Department of Radiology, University of Washington, Seattle, WA, USA
- Department of Radiology, Seattle Children's, Seattle, WA, USA
- T Jones
- Department of Pediatrics, Seattle Children's Heart Center and the University of Washington, Seattle, WA, USA
- S D Friedman
- Department of Neurology, Seattle, WA, USA
- Department of Improvement and Innovation, Seattle, WA, USA
- R K Otto
- Department of Radiology, University of Washington, Seattle, WA, USA
- Department of Radiology, Seattle Children's, Seattle, WA, USA
3. de la Hoz-Torres ML, Aguilar AJ, Martínez-Aires MD, Ruiz DP. Modelling and visualization for the analysis and comprehension of the acoustic performance of buildings through the implementation of a building information modelling-based methodology. J Acoust Soc Am 2022; 152:1515. PMID: 36182289; DOI: 10.1121/10.0013886
Abstract
Technical and technological advances have revolutionised the architecture, engineering, and construction industries in recent decades. Building information modelling (BIM) has become essential to information management and the development of building projects. This study analyses the potential advantages of implementing BIM-based models for the acquisition of theoretical and procedural knowledge about building acoustics. The approach was implemented as part of a problem-solving exercise in Science, Technology, Engineering, and Mathematics (STEM) university degrees. Three-dimensional (3D) BIM models were generated to assess their contribution to the visualization, comprehension, and analysis of the acoustic behaviour of buildings. The participants' experiences and satisfaction with the BIM models were measured through a questionnaire. The results showed a high level of satisfaction among participants and good potential for applying 3D BIM-based models to the acquisition of knowledge and practical skills in building acoustics. These results highlight the potential of BIM models to provide information for understanding the procedure followed during data collection in the experimental analysis and to facilitate the understanding of system behaviour.
Affiliation(s)
- Antonio J Aguilar
- Department of Applied Physics, University of Granada, Granada, 18002, Spain
- Diego P Ruiz
- Department of Applied Physics, University of Granada, Granada, 18002, Spain
4. McBain KA, Habib R, Laggis G, Quaiattini A, Ventura NM, Noel GPJC. Scoping review: The use of augmented reality in clinical anatomical education and its assessment tools. Anat Sci Educ 2022; 15:765-796. PMID: 34800073; DOI: 10.1002/ase.2155
Abstract
The purpose of this review was to identify the different augmented reality (AR) modalities used to teach anatomy to students, health professional trainees, and surgeons, and to examine the assessment tools used to evaluate the performance of these AR modalities. A scoping review of four databases was performed using variations of: (1) AR, (2) medical or anatomical teaching/education/training, and (3) anatomy or radiology or cadaver. Scientific articles were identified and screened against the inclusion and exclusion criteria per the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews guidelines; virtual reality was an exclusion criterion. Data were extracted from a total of 54 articles, and four AR modalities were identified: head-mounted display, projection, instrument and screen, and mobile device. The usability, feasibility, and acceptability of these modalities were evaluated using a variety of quantitative and qualitative assessment tools. In more recent years of AR integration into anatomy education, visuospatial ability, cognitive load, time on task, and academic achievement outcomes have become variables of interest that warrant further exploration. Sufficiently powered studies using validated assessment tools must be conducted to better understand the role of AR in anatomical education.
Affiliation(s)
- Kimberly A McBain
- School of Physical and Occupational Therapy, McGill University, Montreal, Quebec, Canada
- Rami Habib
- School of Medicine and Health Sciences, McGill University, Montreal, Quebec, Canada
- George Laggis
- School of Physical and Occupational Therapy, McGill University, Montreal, Quebec, Canada
- Andrea Quaiattini
- Schulich Library of Physical Sciences, Life Sciences, and Engineering, McGill University, Montreal, Quebec, Canada
- Institute of Health Sciences Education, Faculty of Medicine, McGill University, Montreal, Quebec, Canada
- Nicole M Ventura
- Institute of Health Sciences Education, Faculty of Medicine, McGill University, Montreal, Quebec, Canada
- Division of Anatomical Sciences, Department of Anatomy and Cell Biology, McGill University, Montreal, Quebec, Canada
- Geoffroy P J C Noel
- Institute of Health Sciences Education, Faculty of Medicine, McGill University, Montreal, Quebec, Canada
- Division of Anatomical Sciences, Department of Anatomy and Cell Biology, McGill University, Montreal, Quebec, Canada
- Division of Anatomy, Department of Surgery, University of California San Diego, San Diego, California, USA
5. Deng S, Wheeler G, Toussaint N, Munroe L, Bhattacharya S, Sajith G, Lin E, Singh E, Chu KYK, Kabir S, Pushparajah K, Simpson JM, Schnabel JA, Gomez A. A Virtual Reality System for Improved Image-Based Planning of Complex Cardiac Procedures. J Imaging 2021; 7:151. PMID: 34460787; PMCID: PMC8404926; DOI: 10.3390/jimaging7080151
Abstract
The intricate nature of congenital heart disease requires an understanding of the complex, patient-specific, three-dimensional dynamic anatomy of the heart, derived from imaging data such as three-dimensional echocardiography, for successful surgical and interventional outcomes. Conventional clinical systems use flat screens, so the display remains two-dimensional, which undermines full understanding of the three-dimensional dynamic data. Additionally, controlling three-dimensional visualisation with two-dimensional tools is often difficult, so it is typically done only by imaging specialists. In this paper, we describe a virtual reality system for immersive surgery planning using dynamic three-dimensional echocardiography. The system enables fast prototyping of visualisation features such as volume rendering, multiplanar reformatting, and flow visualisation, and of advanced interactions such as three-dimensional cropping, windowing, measurement, haptic feedback, automatic image orientation, and multiuser interaction. The available features were evaluated by imaging and nonimaging clinicians, showing that the virtual reality system can help improve the understanding and communication of three-dimensional echocardiography and potentially benefit the treatment of congenital heart disease.
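Among the visualisation features listed, multiplanar reformatting (MPR) is the most self-contained to illustrate: it resamples a 3D image volume along an arbitrarily oriented plane. The sketch below is not the authors' implementation; it is a generic MPR on a tiny synthetic volume using trilinear interpolation, with all names and sizes purely illustrative.

```python
# Hypothetical sketch of multiplanar reformatting: sample a 3D volume
# (indexed vol[z][y][x]) along a plane defined by an origin point and
# two in-plane direction vectors, using trilinear interpolation.

def trilinear(vol, x, y, z):
    # Interpolate the volume at a fractional (x, y, z) coordinate by
    # blending the 8 surrounding voxel corners.
    nz, ny, nx = len(vol), len(vol[0]), len(vol[0][0])
    x0, y0, z0 = int(x), int(y), int(z)
    x1, y1, z1 = min(x0 + 1, nx - 1), min(y0 + 1, ny - 1), min(z0 + 1, nz - 1)
    fx, fy, fz = x - x0, y - y0, z - z0
    value = 0.0
    for zi, wz in ((z0, 1 - fz), (z1, fz)):
        for yi, wy in ((y0, 1 - fy), (y1, fy)):
            for xi, wx in ((x0, 1 - fx), (x1, fx)):
                value += vol[zi][yi][xi] * wz * wy * wx
    return value

def sample_plane(vol, origin, u, v, rows, cols):
    # Build a rows x cols reformatted image: pixel (j, i) samples the
    # volume at origin + i*u + j*v.
    out = []
    for j in range(rows):
        row = []
        for i in range(cols):
            p = [origin[k] + i * u[k] + j * v[k] for k in range(3)]
            row.append(trilinear(vol, p[0], p[1], p[2]))
        out.append(row)
    return out

# Synthetic 4x4x4 volume with intensity x + y + z; trilinear
# interpolation is exact for this linear field.
vol = [[[x + y + z for x in range(4)] for y in range(4)] for z in range(4)]
# Oblique plane: steps diagonally in x and y across columns, in z down rows.
plane = sample_plane(vol, (0.25, 0.25, 0.25), (0.5, 0.5, 0.0),
                     (0.0, 0.0, 0.5), 2, 3)
print(plane[0][0])  # 0.25 + 0.25 + 0.25 = 0.75
```

In a clinical viewer the plane's origin and direction vectors would come from the user's interaction (e.g. a controller pose in VR) rather than being hard-coded, but the resampling step is the same.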
Affiliation(s)
- Shujie Deng
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK
- Gavin Wheeler
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK
- Nicolas Toussaint
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK
- Lindsay Munroe
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK
- Suryava Bhattacharya
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK
- Gina Sajith
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK
- Ei Lin
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK
- Eeshar Singh
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK
- Ka Yee Kelly Chu
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK
- Saleha Kabir
- Department of Congenital Heart Disease, Evelina London Children’s Hospital, Guy’s and St Thomas’ National Health Service Foundation Trust, London SE1 7EH, UK
- Kuberan Pushparajah
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK
- Department of Congenital Heart Disease, Evelina London Children’s Hospital, Guy’s and St Thomas’ National Health Service Foundation Trust, London SE1 7EH, UK
- John M. Simpson
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK
- Department of Congenital Heart Disease, Evelina London Children’s Hospital, Guy’s and St Thomas’ National Health Service Foundation Trust, London SE1 7EH, UK
- Julia A. Schnabel
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK
- Department of Informatics, Technische Universität München, 85748 Garching, Germany
- Helmholtz Zentrum München—German Research Center for Environmental Health, 85764 Neuherberg, Germany
- Alberto Gomez
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK
6. López-Ojeda W, Hurley RA. Extended-Reality Technologies: An Overview of Emerging Applications in Medical Education and Clinical Care. J Neuropsychiatry Clin Neurosci 2021; 33:A4-177. PMID: 34289698; DOI: 10.1176/appi.neuropsych.21030067
Affiliation(s)
- Wilfredo López-Ojeda
- Robin A Hurley
- Veterans Affairs Mid-Atlantic Mental Illness Research, Education, and Clinical Center, and Research and Academic Affairs Service Line, W.G. Hefner Veterans Affairs Medical Center, Salisbury, N.C. (López-Ojeda, Hurley); Department of Psychiatry and Behavioral Medicine, Wake Forest School of Medicine, Winston-Salem, N.C. (López-Ojeda); Departments of Psychiatry and Radiology, Wake Forest School of Medicine, Winston-Salem, N.C. (Hurley); and Menninger Department of Psychiatry and Behavioral Sciences, Baylor College of Medicine, Houston (Hurley)