1
Du W, Zhong X, Jia Y, Jiang R, Yang H, Ye Z, Zong Z. A Novel Scenario-Based, Mixed-Reality Platform for Training Nontechnical Skills of Battlefield First Aid: Prospective Interventional Study. JMIR Serious Games 2022; 10:e40727. PMID: 36472903; PMCID: PMC9768658; DOI: 10.2196/40727.
Abstract
BACKGROUND: Although battlefield first aid (BFA) training shares many features with civilian training, such as the need to address both technical skills and nontechnical skills (NTSs), it is far more scenario-dependent. Studies of extended reality show clear benefits in medical training; however, the effects of extended reality on training NTSs in BFA, including teamwork and decision-making, have not been fully demonstrated.

OBJECTIVE: This study aimed to create and test a scenario-based, mixed-reality platform suitable for training NTSs in BFA.

METHODS: First, using next-generation modeling technology and an animation synchronization system, a 10-person offensive battle drill was established. Decision-making training software addressing the basic principles of tactical combat casualty care was constructed and integrated into the scenarios with Unreal Engine 4 (Epic Games). Large-space teamwork and virtual interaction systems suited to the proposed platform were developed, and Unreal Engine 4 and software engineering techniques were used to combine these modules into a mixed-reality BFA training platform. A total of 20 Grade 4 medical students were recruited to receive BFA training with the platform. Pretraining and posttraining tests were carried out in 2 forms to evaluate training effectiveness: a knowledge-acquisition test on NTSs and a real-world, scenario-based test. In addition, the students were asked to rate their agreement with a series of survey items on a 5-point Likert scale.

RESULTS: Battlefield geographic environment, tactical scenario, scenario-based decision software, large-space teamwork, and virtual interaction system modules were successfully developed and combined to establish the mixed-reality training platform for BFA. The students' posttraining knowledge-acquisition scores were significantly higher than their pretraining scores (t=-12.114; P≤.001). Furthermore, the NTS scores and total scores that the students obtained in the real-world test were significantly higher than before training (t=-17.756 and t=-21.354, respectively; P≤.001). However, there was no significant difference between the technical skill scores obtained before and after training. A posttraining survey revealed that the students found the platform helpful in improving NTSs for BFA and that they were confident in applying BFA skills after training. However, most trainees thought that the platform was not helpful for improving the technical skills of BFA, and 45% (9/20) were not satisfied with the simulation effect.

CONCLUSIONS: A scenario-based, mixed-reality platform was constructed in this study. The platform supports the interaction of multiple players moving in a large space and carries trainees' decision-making between the real and virtual worlds, and it improved the NTSs of BFA. Future work will include improving the simulation effect and developing a training platform that effectively improves both the technical skills and the NTSs of BFA.
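The pre/post comparisons reported above are paired-samples t tests (the negative t values reflect the convention of subtracting posttraining from pretraining scores). As an illustration only, with hypothetical scores for five trainees (the study itself used 20 students and different data), the statistic can be computed as:

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-samples t statistic on post - pre differences.

    Positive t means posttraining scores are higher; software that
    subtracts the other way (post from pre) reports the same magnitude
    with a negative sign, as in the abstract above.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical pre/post knowledge scores for 5 trainees.
pre = [10, 12, 9, 11, 10]
post = [16, 17, 15, 18, 16]
t = paired_t(pre, post)
```

The degrees of freedom are n - 1, so the resulting t would be compared against a t distribution with 4 degrees of freedom here (19 in the study).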
Affiliation(s)
- Wenqiong Du
- State Key Laboratory of Trauma, Burn and Combined Injury, Department for Combat Casualty Care Training, Training Base for Army Health Care, Army Medical University, Chongqing, China
- Xin Zhong
- State Key Laboratory of Trauma, Burn and Combined Injury, Department for Combat Casualty Care Training, Training Base for Army Health Care, Army Medical University, Chongqing, China
- Yijun Jia
- State Key Laboratory of Trauma, Burn and Combined Injury, Department for Combat Casualty Care Training, Training Base for Army Health Care, Army Medical University, Chongqing, China
- Renqing Jiang
- State Key Laboratory of Trauma, Burn and Combined Injury, Department for Combat Casualty Care Training, Training Base for Army Health Care, Army Medical University, Chongqing, China
- Haoyang Yang
- State Key Laboratory of Trauma, Burn and Combined Injury, Department for Combat Casualty Care Training, Training Base for Army Health Care, Army Medical University, Chongqing, China
- Zhao Ye
- State Key Laboratory of Trauma, Burn and Combined Injury, Department for Combat Casualty Care Training, Training Base for Army Health Care, Army Medical University, Chongqing, China
- Zhaowen Zong
- State Key Laboratory of Trauma, Burn and Combined Injury, Department for Combat Casualty Care Training, Training Base for Army Health Care, Army Medical University, Chongqing, China
2
Wiebe A, Kannen K, Selaskowski B, Mehren A, Thöne AK, Pramme L, Blumenthal N, Li M, Asché L, Jonas S, Bey K, Schulze M, Steffens M, Pensel MC, Guth M, Rohlfsen F, Ekhlas M, Lügering H, Fileccia H, Pakos J, Lux S, Philipsen A, Braun N. Virtual reality in the diagnostic and therapy for mental disorders: A systematic review. Clin Psychol Rev 2022; 98:102213. PMID: 36356351; DOI: 10.1016/j.cpr.2022.102213.
Abstract
BACKGROUND: Virtual reality (VR) technologies are playing an increasingly important role in the diagnostics and treatment of mental disorders.

OBJECTIVE: To systematically review the current evidence on the use of VR in the diagnostics and treatment of mental disorders.

DATA SOURCES: Systematic literature searches via PubMed (last literature update: May 9, 2022) were conducted for the following areas of psychopathology: specific phobias, panic disorder and agoraphobia, social anxiety disorder, generalized anxiety disorder, posttraumatic stress disorder (PTSD), obsessive-compulsive disorder, eating disorders, dementia disorders, attention-deficit/hyperactivity disorder, depression, autism spectrum disorder, schizophrenia spectrum disorders, and addiction disorders.

ELIGIBILITY CRITERIA: To be eligible, studies had to be published in English, be peer-reviewed, report original research data, be VR-related, and deal with one of the above-mentioned areas of psychopathology.

STUDY EVALUATION: For each included study, various study characteristics (including interventions and conditions, comparators, major outcomes, and study designs) were retrieved, and a risk-of-bias score was calculated based on predefined study quality criteria.

RESULTS: Across all areas of psychopathology, k=9315 studies were inspected, of which k=721 met the eligibility criteria. Of these, 43.97% were assessment-related, 55.48% therapy-related, and 0.55% mixed. The highest research activity, and the most convincing evidence, was found for VR exposure therapy in anxiety disorders, PTSD, and addiction disorders, as well as for cognitive training in dementia and social skills training in autism spectrum disorder.

CONCLUSION: While VR exposure therapy will likely find its way successively into regular patient care, there are many other promising approaches, but most are not yet mature enough for clinical application.

REVIEW REGISTRATION: PROSPERO register CRD42020188436.

FUNDING: The review was funded by budgets from the University of Bonn. No third-party funding was involved.
Affiliation(s)
- Annika Wiebe
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Kyra Kannen
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Benjamin Selaskowski
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Aylin Mehren
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Ann-Kathrin Thöne
- School of Child and Adolescent Cognitive Behavior Therapy (AKiP), Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany
- Lisa Pramme
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Nike Blumenthal
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Mengtong Li
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Laura Asché
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Stephan Jonas
- Institute for Digital Medicine, University Hospital Bonn, Bonn, Germany
- Katharina Bey
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Marcel Schulze
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Maria Steffens
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Max Christian Pensel
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Matthias Guth
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Felicia Rohlfsen
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Mogda Ekhlas
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Helena Lügering
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Helena Fileccia
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Julian Pakos
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Silke Lux
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Alexandra Philipsen
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
- Niclas Braun
- Department of Psychiatry and Psychotherapy, University Hospital Bonn, Bonn, Germany
3
Wang Z, Liu J, Zhang W, Nie W, Liu H. Diagnosis and Intervention for Children With Autism Spectrum Disorder: A Survey. IEEE Trans Cogn Dev Syst 2022. DOI: 10.1109/tcds.2021.3093040.
Affiliation(s)
- Zhiyong Wang
- State Key Laboratory of Mechanical System and Vibration, Shanghai Jiao Tong University, Shanghai, China
- Jingjing Liu
- State Key Laboratory of Mechanical System and Vibration, Shanghai Jiao Tong University, Shanghai, China
- Wanqi Zhang
- State Key Laboratory of Mechanical System and Vibration, Shanghai Jiao Tong University, Shanghai, China
- Wei Nie
- State Key Laboratory of Robotics and Systems, Harbin Institute of Technology Shenzhen, Shenzhen, China
- Honghai Liu
- State Key Laboratory of Robotics and Systems, Harbin Institute of Technology Shenzhen, Shenzhen, China
4
Artiran S, Ravisankar R, Luo S, Chukoskie L, Cosman P. Measuring Social Modulation of Gaze in Autism Spectrum Condition With Virtual Reality Interviews. IEEE Trans Neural Syst Rehabil Eng 2022; 30:2373-2384. PMID: 35969548; DOI: 10.1109/tnsre.2022.3198933.
Abstract
Gaze behavior in dyadic conversations can indicate active listening and attention. However, gaze behavior that differs from what is expected in neurotypical social interaction may be interpreted as uninterested or inattentive, which can be problematic in both personal and professional situations. Neurodivergent individuals, such as those with autism spectrum conditions, often exhibit social communication differences, including in gaze behavior. This project aims to support situational social gaze practice through a virtual reality (VR) mock job interview using the HTC Vive Pro Eye VR headset. We show how gaze behavior varies in the mock job interview between neurodivergent and neurotypical participants. We also investigate the social modulation of gaze behavior by conversational role (speaking vs listening). Our three main contributions are: (i) a system for fully automatic analysis of social modulation of gaze behavior using a portable VR headset with a novel, realistic mock job interview; (ii) a signal processing pipeline, employing Kalman filtering and spatial-temporal density-based clustering techniques, that improves the accuracy of the headset's built-in eye tracker; and (iii) the first investigation of social modulation of gaze behavior among neurotypical and neurodivergent individuals in immersive VR.
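Contribution (ii) pairs Kalman filtering with density-based clustering to denoise the headset's gaze signal. The authors' pipeline is not reproduced here; as a rough sketch under assumed noise parameters, a one-dimensional constant-position Kalman filter over one gaze coordinate might look like:

```python
def kalman_smooth(samples, q=1e-4, r=0.05):
    """1-D Kalman filter (constant-position model) over noisy gaze samples.

    q: process noise -- how quickly the true gaze point may drift (assumed).
    r: measurement noise -- jitter variance of the eye tracker (assumed).
    """
    x, p = samples[0], 1.0   # initial state estimate and its variance
    out = []
    for z in samples:
        p += q               # predict: uncertainty grows between samples
        k = p / (p + r)      # Kalman gain: how much to trust measurement z
        x += k * (z - x)     # update the estimate toward the measurement
        p *= 1 - k           # uncertainty shrinks after the update
        out.append(x)
    return out

# Synthetic horizontal gaze track: a fixation at 0.5 with alternating jitter.
raw = [0.5 + 0.05 * (-1) ** i for i in range(50)]
smoothed = kalman_smooth(raw)
```

A full pipeline would run such a filter per coordinate and then cluster the smoothed points by density to separate fixations from saccades; the parameter values here are purely illustrative.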
5
Dechsling A, Orm S, Kalandadze T, Sütterlin S, Øien RA, Shic F, Nordahl-Hansen A. Virtual and Augmented Reality in Social Skills Interventions for Individuals with Autism Spectrum Disorder: A Scoping Review. J Autism Dev Disord 2021; 52:4692-4707. PMID: 34783991; PMCID: PMC9556391; DOI: 10.1007/s10803-021-05338-5.
Abstract
In the last decade, there has been an increase in publications on technology-based interventions for autism spectrum disorder (ASD). Virtual reality-based assessment and intervention tools are promising and have been shown to be acceptable among individuals with ASD. This scoping review reports on 49 studies utilizing virtual reality and augmented reality technology in social skills interventions for individuals with ASD. The included studies mostly targeted children and adolescents; few targeted very young children or adults. Our findings show that the mode number of participants with ASD was low and that female participants were underrepresented. Our review suggests a need for studies that apply virtual and augmented reality with more rigorous designs involving established, evidence-based intervention strategies.
Affiliation(s)
- Anders Dechsling
- Faculty of Teacher Education and Languages, Østfold University College, B R A veien 4, 1757, Halden, Norway
- Stian Orm
- Department of Welfare and Participation, Western Norway University of Applied Sciences, Bergen, Norway
- Tamara Kalandadze
- Faculty of Teacher Education and Languages, Østfold University College, B R A veien 4, 1757, Halden, Norway
- Stefan Sütterlin
- Faculty of Computer Science, Albstadt-Sigmaringen University, Sigmaringen, Germany; Faculty of Health, Welfare and Organisation, Østfold University College, Halden, Norway
- Roald A Øien
- Department of Education, The Arctic University of Norway - University of Tromsø, Tromsø, Norway; Child Study Center, Yale University School of Medicine, New Haven, USA
- Frederick Shic
- Center for Child Health, Behavior and Development, Seattle Children's Research Institute, Seattle, USA; Department of Pediatrics, University of Washington School of Medicine, Washington, USA
- Anders Nordahl-Hansen
- Faculty of Teacher Education and Languages, Østfold University College, B R A veien 4, 1757, Halden, Norway
6
Koochaki F, Najafizadeh L. A Data-Driven Framework for Intention Prediction via Eye Movement With Applications to Assistive Systems. IEEE Trans Neural Syst Rehabil Eng 2021; 29:974-984. PMID: 34038364; DOI: 10.1109/tnsre.2021.3083815.
Abstract
Fast and accurate human intention prediction can significantly advance the performance of assistive devices for patients with limited motor or communication abilities. Among available modalities, eye movement can be valuable for inferring the user's intention, as it can be tracked non-invasively. However, the limited existing studies in this domain do not provide the level of accuracy required for the reliable operation of assistive systems. Taking a data-driven approach, this paper presents a new framework that utilizes the spatial and temporal patterns of eye movement, along with deep learning, to predict the user's intention. In the proposed framework, the spatial patterns of gaze are identified by clustering the gaze points by their density over displayed images to find regions of interest (ROIs). The temporal patterns of gaze are identified via hidden Markov models (HMMs) to find the transition sequences between ROIs. Transfer learning is utilized to identify the objects of interest in the displayed images. Finally, models are developed to predict the user's intention both after the task is completed and at early stages of the task. The proposed framework is evaluated in an experiment involving the prediction of intended daily-life activities. Results indicate an average classification accuracy of 97.42%, considerably higher than existing gaze-based intention prediction studies.
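The spatial step described above, clustering gaze points by density to recover ROIs, is commonly implemented with a DBSCAN-style algorithm. This is not the authors' code; a minimal, self-contained sketch (coordinates assumed normalized to [0, 1], with arbitrarily chosen `eps` and `min_pts`) might look like:

```python
from math import hypot

def dbscan(points, eps=0.1, min_pts=4):
    """Minimal DBSCAN over 2-D gaze points; returns one label per point.

    Dense groups of points (candidate ROIs) get labels 0, 1, ...;
    isolated points (e.g., stray saccade samples) are labeled -1.
    """
    labels = [None] * len(points)       # None = unvisited, -1 = noise

    def neighbors(i):
        px, py = points[i]
        return [j for j, (qx, qy) in enumerate(points)
                if hypot(px - qx, py - qy) <= eps]

    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1              # too sparse: mark as noise for now
            continue
        cluster += 1                    # i is a core point: start a new ROI
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:                    # grow the ROI from its core points
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster     # noise reachable from a core point
                continue                #   becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbors(j)
            if len(jn) >= min_pts:      # j is itself a core point: expand
                seeds.extend(jn)
    return labels

# Two simulated fixation clusters plus one stray sample between them.
gaze = [(0.20, 0.20), (0.21, 0.20), (0.20, 0.21), (0.22, 0.21), (0.21, 0.22),
        (0.80, 0.80), (0.81, 0.80), (0.80, 0.81), (0.82, 0.81), (0.81, 0.82),
        (0.50, 0.50)]
rois = dbscan(gaze)
```

The two fixation blobs come out as clusters 0 and 1 and the stray sample as noise; in the paper's framework each such cluster would then feed the HMM stage as one ROI state.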