1
Salimi S, Asgari Z, Mohammadnejad A, Teimazi A, Bakhtiari M. Efficacy of virtual reality and augmented reality in anatomy education: A systematic review and meta-analysis. Anat Sci Educ 2024. [PMID: 39300601] [DOI: 10.1002/ase.2501]
Abstract
Anatomy is a cornerstone of medical education. Virtual reality (VR) and augmented reality (AR) technologies are increasingly popular in anatomy education, and various studies have evaluated them in this setting. This meta-analysis aims to evaluate the effectiveness of VR and AR in anatomy education. The protocol was registered in PROSPERO. The Scopus, PubMed, Web of Science, and Cochrane Library databases were searched. Of the 4487 articles retrieved, 24 randomized controlled trials met the inclusion criteria. According to the meta-analysis, VR had a moderate, significant effect on the improvement of knowledge scores compared with other methods (standardized mean difference [SMD] = 0.58; 95% CI = 0.22, 0.95; p < 0.01). Owing to the high degree of heterogeneity (I² = 87.44%), subgroup analyses and meta-regression were performed on eight variables. In enhancing attitude, VR was found to be more "useful" than other methods (p = 0.01); however, no significant difference was found for the "enjoyable" and "easy to use" statements. Compared with other methods, the effect of AR on knowledge scores was non-significant (SMD = -0.02; 95% CI = -0.39, 0.34; p = 0.90), and the subgroup analyses and meta-regression results were likewise non-significant. The results indicate that, unlike AR, VR could be used as an effective tool for teaching anatomy in medical education. Given the heterogeneity observed across the included studies, further research is warranted to identify the variables that may affect the efficacy of VR and AR in anatomy education.
Affiliation(s)
- Sajjad Salimi
- Student Research Committee, Faculty of Medicine, Kermanshah University of Medical Sciences, Kermanshah, Iran
- Zahra Asgari
- Student Research Committee, Faculty of Medicine, Kermanshah University of Medical Sciences, Kermanshah, Iran
- Amirreza Mohammadnejad
- Student Research Committee, Faculty of Medicine, Kermanshah University of Medical Sciences, Kermanshah, Iran
- Ashkan Teimazi
- Student Research Committee, Faculty of Medicine, Kermanshah University of Medical Sciences, Kermanshah, Iran
- Mitra Bakhtiari
- Department of Anatomical Sciences, Faculty of Medicine, Kermanshah University of Medical Sciences, Kermanshah, Iran
2
Kim SJ. Virtual fashion experiences in virtual reality fashion show spaces. Front Psychol 2023; 14:1276856. [PMID: 38046109] [PMCID: PMC10693427] [DOI: 10.3389/fpsyg.2023.1276856]
Abstract
Introduction: Virtual reality (VR) provides a new fashion space and fashion experience. This study focuses on immersive VR and fashion shows to empirically explore the VR fashion space and fashion experience; fashion-specific insights of this depth have rarely been presented in the literature. Methods: This study employed three immersive VR (IVR) fashion show stimuli and in-depth interviews following a semi-structured questionnaire. The collected data were analyzed using the concepts of VR space and VR experience derived from the literature review. Results: The VR fashion space was divided into three types, and the VR experiences of cognitive presence, sensible immersion, emotional immersion, and aesthetic interaction were derived accordingly. First, the physical representation of a fashion show induced a cognitive and emotional sense of presence, in which users felt as though they had moved to the same time and place as the fashion show. Second, participants experienced cognitive confusion owing to differences from their a priori experiences of fashion show spaces (i.e., reality and imagination coexist). Third, in the fashion show space of pataphysics (realized through human imagination), participants transcended the limitations of physical reality and moved beyond the confusion experienced when facing realistic objects to connect with creative inspiration. Discussion: Differences in the properties of VR space may be associated with distinct VR fashion experiences. The findings suggest that (1) a priori elements such as sociocultural contexts and personal experiences differ in the experiential dimension of virtual space, (2) the VR fashion show space induces a psychological experience between brand and consumer, and (3) creative inspiration and exploratory play can be strongly induced in a user when the immersive fashion space departs further from its original source.
Affiliation(s)
- Se Jin Kim
- Department of Clothing and Textiles, Changwon National University, Changwon, Republic of Korea
3
Melo M, Goncalves G, Monteiro P, Coelho H, Vasconcelos-Raposo J, Bessa M. Do multisensory stimuli benefit the virtual reality experience? A systematic review. IEEE Trans Vis Comput Graph 2022; 28:1428-1442. [PMID: 32746276] [DOI: 10.1109/tvcg.2020.3010088]
Abstract
The majority of virtual reality (VR) applications rely on audiovisual stimuli and do not exploit the addition of other sensory cues that could increase the potential of VR. This systematic review surveys the existing literature on multisensory VR and the impact of haptic, olfactory, and taste cues on audiovisual VR. The goal is to identify the extent to which multisensory stimuli affect the VR experience, which stimuli are used in multisensory VR, the types of VR setups used, and the application fields covered. An analysis of the 105 studies that met the eligibility criteria revealed that 84.8 percent show a positive impact of multisensory VR experiences. Haptics is the most commonly used stimulus in multisensory VR systems (86.6 percent). Non-immersive and immersive VR setups are preferred over semi-immersive setups. Regarding application fields, a considerable share of the work was carried out by health professionals and by science and engineering professionals. We further conclude that smell and taste are still underexplored and can bring significant value to VR applications; more research is recommended on how to synthesize and deliver these stimuli, which still require complex and costly apparatus, so that they can be integrated into the VR experience in a controlled and straightforward manner.
4
Cooper N, Millela F, Cant I, White MD, Meyer G. Transfer of training: Virtual reality training with augmented multisensory cues improves user experience during training and task performance in the real world. PLoS One 2021; 16:e0248225. [PMID: 33760859] [PMCID: PMC7990292] [DOI: 10.1371/journal.pone.0248225]
Abstract
Virtual reality (VR) can create safe, cost-effective, and engaging learning environments. It is commonly assumed that improvements in simulation fidelity lead to better learning outcomes. Some aspects of real environments, for example vestibular or haptic cues, are difficult to recreate in VR, but VR offers a wealth of opportunities to provide additional sensory cues in arbitrary modalities that convey task-relevant information. The aim of this study was to investigate whether these cues improve user experience and learning outcomes and, specifically, whether learning with augmented sensory cues translates into performance improvements in real environments. Participants were randomly allocated to three matched groups: Group 1 (control) performed a real tyre change only; the remaining two groups were trained in VR before performance was evaluated on the same real tyre-change task. Group 2 was trained using a conventional VR system, while Group 3 was trained in VR with augmented, task-relevant, multisensory cues. Objective performance (time to completion and number of errors) and subjective ratings of presence, perceived workload, and discomfort were recorded. The results show that both VR training paradigms improved performance on the real task, and that providing additional task-relevant cues during VR training resulted in higher objective performance on the real task. We propose a novel method to quantify the relative performance gains between training paradigms in terms of the equivalent gain in training time. Systematic differences in subjective ratings were also observed: workload ratings were comparable across groups, while presence ratings were higher and discomfort ratings lower, mirroring the objective performance measures. These findings further support the use of augmented multisensory cues in VR environments as an efficient method to enhance performance, user experience and, critically, the transfer of training from virtual to real-environment scenarios.
Affiliation(s)
- Natalia Cooper
- Construction Research Centre, National Research Council Canada, Ottawa, Ontario, Canada
- Iain Cant
- Virtual Engineering Centre, University of Liverpool, Liverpool, Merseyside, United Kingdom
- Mark D. White
- Department of Engineering, University of Liverpool, Liverpool, Merseyside, United Kingdom
- Georg Meyer
- Department of Psychology, University of Liverpool, Liverpool, Merseyside, United Kingdom
5
Campos J, Ramkhalawansingh R, Pichora-Fuller MK. Hearing, self-motion perception, mobility, and aging. Hear Res 2018; 369:42-55. [DOI: 10.1016/j.heares.2018.03.025]
6
Cooper N, Milella F, Pinto C, Cant I, White M, Meyer G. The effects of substitute multisensory feedback on task performance and the sense of presence in a virtual reality environment. PLoS One 2018; 13:e0191846. [PMID: 29390023] [PMCID: PMC5794113] [DOI: 10.1371/journal.pone.0191846]
Abstract
Objective and subjective measures of performance in virtual reality environments improve as more sensory cues are delivered and as simulation fidelity increases. Some cues (colour or sound) are easier to present than others (object weight, vestibular cues), so substitute cues can be used to enhance the informational content of a simulation at the expense of simulation fidelity. This study evaluates how substituting cues in one modality with alternative cues in another modality affects subjective and objective performance measures in a highly immersive virtual reality environment. Participants performed a wheel change in a virtual reality (VR) environment. Auditory, haptic and visual cues, signalling critical events in the simulation, were manipulated in a factorial design. Subjective ratings were recorded via questionnaires; the time taken to complete the task was used as an objective performance measure. The results show that participants performed best, and felt an increased sense of immersion and involvement (collectively referred to as 'presence'), when substitute multimodal sensory feedback was provided. Significant main effects of audio and tactile cues on task performance and on participants' subjective ratings were found, along with a significant negative relationship between the objective (overall completion times) and subjective (ratings of presence) performance measures. We conclude that increasing informational content, even if it disrupts fidelity, enhances performance and users' overall experience. On this basis, we advocate the use of substitute cues in VR environments as an efficient method to enhance performance and user experience.
Affiliation(s)
- Natalia Cooper
- Construction Research Centre, National Research Council, Ottawa, Canada
- Carlo Pinto
- Virtual Engineering Centre, Daresbury, United Kingdom
- Iain Cant
- Virtual Engineering Centre, Daresbury, United Kingdom
- Mark White
- Department of Psychology, University of Liverpool, Liverpool, United Kingdom
- Georg Meyer
- Department of Psychology, University of Liverpool, Liverpool, United Kingdom
7
Munirama S, Eisma R, Columb M, Corner G, McLeod G. Physical properties and functional alignment of soft-embalmed Thiel human cadaver when used as a simulator for ultrasound-guided regional anaesthesia. Br J Anaesth 2016; 116:699-707. [DOI: 10.1093/bja/aev548]
8
Keshavarz B, Hettinger LJ, Vena D, Campos JL. Combined effects of auditory and visual cues on the perception of vection. Exp Brain Res 2013; 232:827-36. [DOI: 10.1007/s00221-013-3793-9]
9
Modulation of visually evoked postural responses by contextual visual, haptic and auditory information: a 'virtual reality check'. PLoS One 2013; 8:e67651. [PMID: 23840760] [PMCID: PMC3695920] [DOI: 10.1371/journal.pone.0067651]
Abstract
Externally generated visual motion signals can cause the illusion of self-motion in space (vection) and corresponding visually evoked postural responses (VEPRs). These VEPRs are not simple responses to optokinetic stimulation but are modulated by the configuration of the environment. The aim of this paper is to explore which factors modulate VEPRs in a high-quality virtual reality (VR) environment where real and virtual foreground objects served as static visual, auditory and haptic reference points. Data from four experiments on visually evoked postural responses show that: (1) visually evoked postural sway in the lateral direction is modulated by the presence of static anchor points, which can be haptic, visual or auditory reference signals; (2) real objects and their matching virtual reality representations have different effects on postural sway when used as visual anchors; (3) visual motion in the anterior-posterior plane induces robust postural responses that are not modulated by the presence of reference signals or by the reality of objects that can serve as visual anchors in the scene. We conclude that automatic postural responses to laterally moving visual stimuli are strongly influenced by the configuration and interpretation of the environment and draw on multisensory representations. Different postural responses were observed for real and virtual visual reference objects. On the basis that automatic visually evoked postural responses in high-fidelity virtual environments should mimic those seen in real situations, we propose to use the observed effect as a robust objective test for presence and fidelity in VR.
10
Vision contingent auditory pitch aftereffects. Exp Brain Res 2013; 229:97-102. [PMID: 23727883] [DOI: 10.1007/s00221-013-3596-z]
Abstract
Visual motion aftereffects can occur contingent on arbitrary sounds. Two circles, placed side by side, were alternately presented, and the onsets were accompanied by tone bursts of high and low frequencies, respectively. After a few minutes of exposure to the visual apparent motion with the tones, a circle blinking at a fixed location was perceived as a lateral motion in the same direction as the previously exposed apparent motion (Teramoto et al. in PLoS One 5:e12255, 2010). In the present study, we attempted to reverse this contingency (pitch aftereffects contingent on visual information). Results showed that after prolonged exposure to the audio-visual stimuli, the apparent visual motion systematically affected the perceived pitch of the auditory stimuli. When the leftward apparent visual motion was paired with the high-low-frequency sequence during the adaptation phase, a test tone sequence was more frequently perceived as a high-low-pitch sequence when the leftward apparent visual motion was presented and vice versa. Furthermore, the effect was specific for the exposed visual field and did not transfer to the other side, thus ruling out an explanation in terms of simple response bias. These results suggest that new audiovisual associations can be established within a short time, and visual information processing and auditory processing can mutually influence each other.