1
Zulkarnain AHB, Cao X, Kókai Z, Gere A. Self-Assessed Experience of Emotional Involvement in Sensory Analysis Performed in Virtual Reality. Foods 2024; 13:375. PMID: 38338511; PMCID: PMC10855596; DOI: 10.3390/foods13030375.
Abstract
Virtual reality (VR) technology has gained significant attention in various fields, including education for health professionals, sensory science, psychology, and consumer research. The first aim of the paper is to explore the self-assessed experience of emotional involvement in sensory analysis performed in VR. The Positive and Negative Affect Schedule (PANAS) is a widely used self-report measure that assesses positive and negative affective states. VR sensory analysis involves the use of immersive, interactive, and multi-sensory environments to evaluate sensory perception and emotional responses. By synthesizing relevant literature, this paper provides insights into the impact of VR on affective states, the effectiveness of VR in eliciting emotions, and the potential applications of the PANAS in VR sensory analysis. The second aim of the paper is to uncover the effect of VR sensory evaluation on participants' emotional states, as these have a significant effect on their evaluations. The results suggest an increase in the sum of positive affect scores and a decrease in the negative ones. Although these results are promising, the relationship between the PANAS and VR sensory analysis is still underexplored, with limited research investigating the specific effects of VR on affective states measured using the PANAS. Further research is needed to better understand the potential of the PANAS in assessing emotional responses in VR environments and its implications for sensory analysis.
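The PANAS scoring this abstract relies on reduces to two sums over ten items each. A minimal illustrative sketch (the item lists follow the standard 20-item PANAS; the ratings in any example are invented, not data from the paper):

```python
def panas_scores(ratings):
    """Sum the 20-item PANAS into positive-affect (PA) and negative-affect
    (NA) subscale scores. Each item is rated 1-5, so each subscale ranges
    from 10 to 50.

    ratings: dict mapping item name -> rating (1-5).
    """
    positive_items = ["interested", "excited", "strong", "enthusiastic",
                      "proud", "alert", "inspired", "determined",
                      "attentive", "active"]
    negative_items = ["distressed", "upset", "guilty", "scared",
                      "hostile", "irritable", "ashamed", "nervous",
                      "jittery", "afraid"]
    pa = sum(ratings[item] for item in positive_items)  # positive affect
    na = sum(ratings[item] for item in negative_items)  # negative affect
    return pa, na
```

The study's "sum of positive affects" and "sum of negative affects" correspond to the PA and NA outputs here, compared before and after the VR session.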
Affiliation(s)
- Attila Gere
- Institute of Food Science and Technology, Hungarian University of Agriculture and Life Sciences, Villányi út. 29-31, H-1118 Budapest, Hungary; (A.H.B.Z.); (X.C.); (Z.K.)
2
Navas-Medrano S, Soler-Dominguez JL, Pons P. Mixed Reality for a collective and adaptive mental health metaverse. Front Psychiatry 2024; 14:1272783. PMID: 38250268; PMCID: PMC10796542; DOI: 10.3389/fpsyt.2023.1272783.
Abstract
This research paper explores the significant transformative potential of Mixed Reality (MR) technology as an enabler of the metaverse, specifically aimed at enhancing mental health therapies. The emerging world of the metaverse, a multiuser, adaptive, three-dimensional digital space, paired with the interactive and immersive benefits of MR technology, promises a paradigm shift in how mental health support is delivered. Unlike traditional platforms, MR allows for therapy within the comfort of the user's familiar surroundings, while incorporating the benefits of social collaboration and interaction. The metaverse environment fosters heightened personalization and deeper user engagement, thereby offering a more tailored approach to computerized therapy. Beyond its immersive capabilities, MR offers potential for real-time, smart adaptation to the user's psycho-physiological state, targeting each patient's unique needs across a diverse spectrum of therapeutic techniques, thus broadening the scope of mental health support. Furthermore, it opens avenues for continuous emotional support in everyday life situations. This research discusses the benefits and potential of integrating MR within a mental health metaverse, highlighting how this innovative approach could significantly complement traditional therapeutic methods, fostering improved treatment efficacy, focusing on social and collective experiences, and increasing patient engagement.
3
Saffari F, Zarei S, Kakaria S, Bigné E, Bruni LE, Ramsøy TZ. The Role of Stimuli-Driven and Goal-Driven Attention in Shopping Decision-Making Behaviors - An EEG and VR Study. Brain Sci 2023; 13:928. PMID: 37371406; DOI: 10.3390/brainsci13060928.
Abstract
The human attention system, similar to other networks in the brain, is of a complex nature. At any moment, our attention can shift between external and internal stimuli. In this study, we aimed to assess three EEG-based measures of attention (Power Spectral Density, Connectivity, and Spectral Entropy) in decision-making situations involving goal-directed and stimulus-driven attention using a Virtual Reality supermarket. We collected the EEG data of 29 participants in 2 shopping phases, planned and unplanned purchases. The three mentioned features were extracted and a statistical analysis was conducted. We evaluated the discriminatory power of these features using an SVM classifier. The results showed a significant (p-value < 0.001) increase in theta power over frontal, central, and temporal lobes for the planned purchase phase. There was also a significant decrease in alpha power over frontal and parietal lobes in the unplanned purchase phase. A significant increase in the frontoparietal connectivity during the planned purchase was observed. Additionally, an increase in spectral entropy was observed in the frontoparietal region for the unplanned purchase phase. The classification results showed that spectral entropy has the highest discriminatory power. This study can provide further insights into the attentional behaviors of consumers and how their type of attentional control can affect their decision-making processes.
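Of the three EEG features compared in this study, spectral entropy is the simplest to illustrate. A minimal sketch of one common definition, the normalized Shannon entropy of the power spectrum (not necessarily the exact estimator the authors used):

```python
import numpy as np

def spectral_entropy(signal):
    """Normalized Shannon entropy of the power spectrum: near 0 when power
    sits in a single frequency bin, near 1 for a flat (white) spectrum."""
    psd = np.abs(np.fft.rfft(signal)) ** 2
    p = psd / psd.sum()            # treat the spectrum as a probability distribution
    p = p[p > 0]                   # drop empty bins so log2 is defined
    h = -np.sum(p * np.log2(p))    # Shannon entropy in bits
    return h / np.log2(psd.size)   # normalize by the maximum possible entropy
```

A pure tone therefore scores near 0 while broadband noise scores near 1, which is the kind of contrast the classifier in the study exploits between task phases.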
Affiliation(s)
- Farzad Saffari
- Neurons Inc., 2630 Hoje-Taastrup, Denmark
- Augmented Cognition Lab, Aalborg University, 2450 Copenhagen, Denmark
- Sahar Zarei
- Neurons Inc., 2630 Hoje-Taastrup, Denmark
- Department of Psychology, University of Copenhagen, 1172 Copenhagen, Denmark
- Shobhit Kakaria
- Faculty of Economics, University of Valencia, 46010 Valencia, Spain
- Enrique Bigné
- Faculty of Economics, University of Valencia, 46010 Valencia, Spain
- Luis E Bruni
- Augmented Cognition Lab, Aalborg University, 2450 Copenhagen, Denmark
4
El Basbasse Y, Packheiser J, Peterburs J, Maymon C, Güntürkün O, Grimshaw G, Ocklenburg S. Walk the plank! Using mobile electroencephalography to investigate emotional lateralization of immersive fear in virtual reality. R Soc Open Sci 2023; 10:221239. PMID: 37266038; PMCID: PMC10230188; DOI: 10.1098/rsos.221239.
Abstract
Most studies on emotion processing induce emotions through images or films. However, this method lacks ecological validity, limiting generalization to real-life emotion processing. More realistic paradigms using virtual reality (VR) may be better suited to investigate authentic emotional states and their neuronal correlates. This pre-registered study examines the neuronal underpinnings of naturalistic fear, measured using mobile electroencephalography (EEG). Seventy-five healthy participants walked across a virtual plank extending from the side of a skyscraper, either 80 storeys up (the negative condition) or at street level (the neutral condition). Subjective ratings showed that the negative condition induced feelings of fear. Following the VR experience, participants passively viewed negative and neutral images from the International Affective Picture System (IAPS) outside of VR. We compared frontal alpha asymmetry between the plank and IAPS tasks and across the valence of the conditions. Asymmetry indices in the plank task revealed greater right-hemispheric lateralization during the negative VR condition, relative to the neutral VR condition and to IAPS viewing. Within the IAPS task, no significant asymmetries were detected. In summary, our findings indicate that immersive technologies such as VR can advance emotion research by providing more ecologically valid ways to induce emotion.
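Frontal alpha asymmetry, the dependent measure in this study, is conventionally the difference of log alpha power between homologous right and left frontal electrodes (e.g. F4 vs F3). A rough sketch, assuming a plain periodogram rather than the authors' actual preprocessing pipeline:

```python
import numpy as np

def alpha_power(x, fs, band=(8.0, 13.0)):
    """Alpha-band (8-13 Hz) power from a plain periodogram. A real EEG
    pipeline would use cleaned, epoched data and e.g. Welch's method."""
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return psd[in_band].sum()

def frontal_alpha_asymmetry(left, right, fs):
    """ln(right alpha power) - ln(left alpha power), e.g. F4 minus F3.
    Because alpha power is inversely related to cortical activation,
    negative values indicate relatively stronger right-hemispheric
    activation (the pattern reported for the negative VR condition)."""
    return np.log(alpha_power(right, fs)) - np.log(alpha_power(left, fs))
```

The channel pairing and band limits here are the common convention, not details taken from the paper.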
Affiliation(s)
- Yasmin El Basbasse
- Department of Biopsychology, Faculty of Psychology, Institute of Cognitive Neuroscience, Ruhr-University Bochum, Universitätsstrasse 150, 44780 Bochum, Germany
- Julian Packheiser
- Netherlands Institute for Neuroscience, Social Brain Lab, 1105 BA Amsterdam, The Netherlands
- Jutta Peterburs
- Institute for Systems Medicine & Department of Human Medicine, MSH Medical School Hamburg, Hamburg, Germany
- Christopher Maymon
- School of Psychology, Victoria University of Wellington, Wellington 6140, New Zealand
- Onur Güntürkün
- Department of Biopsychology, Faculty of Psychology, Institute of Cognitive Neuroscience, Ruhr-University Bochum, Universitätsstrasse 150, 44780 Bochum, Germany
- Research Center One Health Ruhr, Research Alliance Ruhr, Ruhr University Bochum, Bochum, Germany
- Gina Grimshaw
- School of Psychology, Victoria University of Wellington, Wellington 6140, New Zealand
- Sebastian Ocklenburg
- Department of Biopsychology, Faculty of Psychology, Institute of Cognitive Neuroscience, Ruhr-University Bochum, Universitätsstrasse 150, 44780 Bochum, Germany
- Department of Psychology, MSH Medical School Hamburg, Am Kaiserkai 1, 20457 Hamburg, Germany
- Institute for Cognitive and Affective Neuroscience, Medical School Hamburg, Am Kaiserkai 1, 20457 Hamburg, Germany
5
Bieńkiewicz MMN, Janaqi S, Jean P, Bardy BG. Impact of emotion-laden acoustic stimuli on group synchronisation performance. Sci Rep 2023; 13:7094. PMID: 37127737; PMCID: PMC10150690; DOI: 10.1038/s41598-023-34406-2.
Abstract
The ability to synchronise with other people is a core socio-motor competence acquired during human development. In this study we aimed to understand the impact of individual emotional arousal on joint action performance. We asked 15 mixed-gender groups (of 4 individuals each) to participate in a digital, four-way movement synchronisation task. Participants shared the same physical space, but could not see each other during the task. In each trial run, every participant was induced with an emotion-laden acoustic stimulus (pre-selected from the second version of the International Affective Digitized Sounds). Our data demonstrated that the human ability to synchronise is overall robust to fluctuations in individual emotional arousal, but performance varies in quality and movement speed as a result of the valence of the emotional induction (on both the individual and group level). We found that three negative inductions per group per trial led to a drop in overall group synchronisation performance (measured as the median and standard deviation of Kuramoto's order parameter, an index measuring the strength of synchrony between oscillators; in this study, the players) in the 15 s post-induction. We report that negatively-valenced inductions led to slower oscillations, whilst positive inductions afforded faster oscillations. On the individual level of synchronisation performance, we found an effect of empathetic disposition (higher competence was linked to better performance during the negative induction condition) and of participant sex (males displayed better synchronisation performance with others). We believe this work is a blueprint for exploring the frontiers of the inextricably bound worlds of emotion and joint action, be they physical or digital.
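Kuramoto's order parameter, the synchrony index used in this study, is simply the magnitude of the mean phase vector across oscillators. A minimal sketch:

```python
import numpy as np

def kuramoto_order_parameter(phases):
    """Strength of synchrony among oscillators: r = |mean(exp(i*phase))|.

    phases: instantaneous phases (radians), one per oscillator (here, one
    per player). r ranges from 0 (incoherent) to 1 (perfectly phase-locked).
    """
    return float(np.abs(np.mean(np.exp(1j * np.asarray(phases)))))
```

The study tracks the median and standard deviation of this quantity over time; evaluating it on the four players' phases at each time sample yields that time series.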
Affiliation(s)
- Marta M N Bieńkiewicz
- EuroMov Digital Health in Motion, Univ. Montpellier IMT Mines Ales, Hérault, Montpellier, 34090, France
- Stefan Janaqi
- EuroMov Digital Health in Motion, Univ. Montpellier IMT Mines Ales, Hérault, Montpellier, 34090, France
- Pierre Jean
- EuroMov Digital Health in Motion, Univ. Montpellier IMT Mines Ales, Hérault, Montpellier, 34090, France
- Benoît G Bardy
- EuroMov Digital Health in Motion, Univ. Montpellier IMT Mines Ales, Hérault, Montpellier, 34090, France
6
Wireless EEG: A survey of systems and studies. Neuroimage 2023; 269:119774. PMID: 36566924; DOI: 10.1016/j.neuroimage.2022.119774.
Abstract
The popular brain monitoring method of electroencephalography (EEG) has seen a surge in commercial attention in recent years, focusing mostly on hardware miniaturization. This has led to a varied landscape of portable EEG devices with wireless capability, allowing them to be used by relatively unconstrained users in real-life conditions outside of the laboratory. The wide availability and relative affordability of these devices provide a low entry threshold for newcomers to the field of EEG research. The large device variety and the at times opaque communication from their manufacturers, however, can make it difficult to obtain an overview of this hardware landscape. Similarly, given the breadth of existing (wireless) EEG knowledge and research, it can be challenging to get started with novel ideas. Therefore, this paper first provides a list of 48 wireless EEG devices along with a number of important (and sometimes difficult to obtain) features and characteristics to enable their side-by-side comparison, along with a brief introduction to each of these aspects and how they may influence one's decision. Secondly, we have surveyed previous literature and focused on 110 high-impact journal publications making use of wireless EEG, which we categorized by application and analyzed for device used, number of channels, sample size, and participant mobility. Together, these provide a basis for informed decision making with respect to hardware and experimental precedents when considering new, wireless EEG devices and research. At the same time, this paper provides background material and commentary about pitfalls and caveats regarding this increasingly accessible line of research.
7
Yuan J, Hassan SS, Wu J, Koger CR, Packard RRS, Shi F, Fei B, Ding Y. Extended reality for biomedicine. Nat Rev Methods Primers 2023; 3:15. PMID: 37051227; PMCID: PMC10088349; DOI: 10.1038/s43586-023-00208-z.
Abstract
Extended reality (XR) refers to an umbrella of methods that allows users to be immersed in a three-dimensional (3D) or a 4D (spatial + temporal) virtual environment to different extents, including virtual reality (VR), augmented reality (AR), and mixed reality (MR). While VR allows a user to be fully immersed in a virtual environment, AR and MR overlay virtual objects over the real physical world. The immersion and interaction of XR provide unparalleled opportunities to extend our world beyond conventional lifestyles. While XR has extensive applications in fields such as entertainment and education, its numerous applications in biomedicine create transformative opportunities in both fundamental research and healthcare. This Primer outlines XR technology from instrumentation to software computation methods, delineating the biomedical applications that have been advanced by state-of-the-art techniques. We further describe the technical advances overcoming current limitations in XR and its applications, providing an entry point for professionals and trainees to thrive in this emerging field.
Affiliation(s)
- Jie Yuan
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Sohail S. Hassan
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Jiaojiao Wu
- Department of Research and Development, Shanghai United Imaging Intelligence Co., Ltd., Shanghai, China
- Casey R. Koger
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- René R. Sevag Packard
- Division of Cardiology, Department of Medicine, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, CA, United States
- Ronald Reagan UCLA Medical Center, Los Angeles, CA, United States
- Veterans Affairs West Los Angeles Medical Center, Los Angeles, CA, United States
- Feng Shi
- Department of Research and Development, Shanghai United Imaging Intelligence Co., Ltd., Shanghai, China
- Baowei Fei
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Department of Radiology, UT Southwestern Medical Center, Dallas, TX, United States
- Center for Imaging and Surgical Innovation, The University of Texas at Dallas, Richardson, TX, United States
- Yichen Ding
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Center for Imaging and Surgical Innovation, The University of Texas at Dallas, Richardson, TX, United States
- Hamon Center for Regenerative Science and Medicine, UT Southwestern Medical Center, Dallas, TX, United States
8
Zary N, Tan Z, Liu T, Chan SN, Sheng J, Wong TH, Huang J, Zhang CJP, Ming WK. Preference of Virtual Reality Games in Psychological Pressure and Depression Treatment: Discrete Choice Experiment. JMIR Serious Games 2023; 11:e34586. PMID: 36645698; PMCID: PMC9947866; DOI: 10.2196/34586.
Abstract
BACKGROUND Virtual reality (VR) can be used to build many different scenes aimed at reducing study-related stress. However, only a few academic experiments on university students for preference testing have been performed. OBJECTIVE This study aims to assess the preference of VR games for stress and depression treatment using a discrete choice experiment (DCE). METHODS A total of 5 attributes were selected based on depression therapy parameters and VR-related attributes and added to a discrete choice experiment: (1) treatment modality; (2) therapy duration; (3) perceived remission rate; (4) probability of adverse events; and (5) monthly cost of treatment. By comparing different attributes and levels, we could draw conclusions about the depression therapy preferences of university students; 1 university student was responsible for VR scene development and 1 for participant recruitment. RESULTS The utility value for "0% probability of adverse events" was higher than the others (99.22), and the utility value of VR treatment, the most popular treatment method compared with counseling and medicine treatment, was 80.95. Three parameter aspects (different treatments for depression) were statistically significant (P<.001), including "0%" and "50%" for "probability of adverse events" and "¥500" (a currency exchange rate of ¥1 [Chinese yuan]=US $0.15 is applicable) for "the monthly cost of treatment." Most individuals preferred 12 months as the therapy duration; the odds ratio of "12 months" was 1.095 (95% CI 0.945-1.270) compared with the reference level (6 months). Meanwhile, the cheapest price (¥500) of depression therapy was the optimal choice for most students. CONCLUSIONS Participants placed great preference on VR-based psychological intervention methods, which indicates that VR may have a potential market in the treatment of psychological problems. However, adverse events and treatment costs need to be considered. This study can be used to guide policies relevant to the development of VR technology applications in the field of psychological pressure and depression treatment.
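The reported odds ratio of 1.095 (95% CI 0.945-1.270) is the standard exponential transformation of a logit-model coefficient. A generic sketch of that transformation (the coefficient and standard error in any example are illustrative, not values taken from the paper):

```python
import math

def odds_ratio(beta, se, z=1.96):
    """Convert a (conditional) logit coefficient to an odds ratio with a
    95% confidence interval, as reported for DCE attribute levels.

    beta: estimated coefficient; se: its standard error.
    Returns (OR, CI lower bound, CI upper bound).
    """
    return (math.exp(beta),
            math.exp(beta - z * se),   # lower 95% bound
            math.exp(beta + z * se))   # upper 95% bound
```

An OR above 1 with a CI spanning 1 (as for the "12 months" level here) indicates a preference trend that does not reach significance.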
Affiliation(s)
- Zijian Tan
- Department of Public Health and Preventive Medicine, School of Medicine, Jinan University, Guangzhou, China
- Taoran Liu
- Department of Infectious Diseases and Public Health, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China (Hong Kong)
- Sze Ngai Chan
- Department of Obstetrics and Gynaecology, First Affiliated Hospital of Jinan University, Guangzhou, China
- Jie Sheng
- Department of Infectious Diseases and Public Health, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China (Hong Kong)
- Tak-Hap Wong
- Department of Infectious Diseases and Public Health, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China (Hong Kong)
- Jian Huang
- Department of Epidemiology and Biostatistics, School of Public Health, Imperial College London, London, United Kingdom
- Casper J P Zhang
- School of Public Health, The University of Hong Kong, Hong Kong, China (Hong Kong)
- Wai-Kit Ming
- Department of Infectious Diseases and Public Health, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China (Hong Kong)
9
Mavridou I, Balaguer-Ballester E, Nduka C, Seiss E. A reliable and robust online validation method for creating a novel 3D Affective Virtual Environment and Event Library (AVEL). PLoS One 2023; 18:e0278065. PMID: 37053205; PMCID: PMC10101521; DOI: 10.1371/journal.pone.0278065.
Abstract
This paper describes the development and validation of a 3D Affective Virtual Environment and Event Library (AVEL) for affect induction in Virtual Reality (VR) settings using an online survey, a cost-effective method for remote stimuli validation that has not been sufficiently explored. Three virtual office-replica environments were designed to induce negative, neutral, and positive valence. Each virtual environment also contained several affect-inducing events/objects. The environments were validated using an online survey containing videos of the virtual environments and pictures of the events/objects. The survey was conducted with 67 participants. Participants were instructed to rate their perceived levels of valence and arousal for each virtual environment (VE), and separately for each event/object. They also rated their perceived level of presence for each VE and were asked how well they remembered the events/objects presented in each VE. Finally, an alexithymia questionnaire was administered at the end of the survey. Analysis of the user ratings successfully validated the expected affect and presence levels of each VE and the affect ratings for each event/object. Our results demonstrate the effectiveness of online validation of VE material in affective and cognitive neuroscience and wider research settings as a good scientific practice for future affect-induction VR studies.
Affiliation(s)
- Ifigeneia Mavridou
- Centre of Digital Entertainment, Bournemouth University, Bournemouth, United Kingdom
- Emteq Labs, Sussex Innovation Centre, Brighton, United Kingdom
- Emili Balaguer-Ballester
- Department of Computing and Informatics, Faculty of Science and Technology and Interdisciplinary Neuroscience Research Centre, Bournemouth University, Poole, United Kingdom
- Bernstein Center for Computational Neuroscience Heidelberg-Mannheim, Medical Faculty of Mannheim and Heidelberg University, Mannheim, Germany
- Charles Nduka
- Emteq Labs, Sussex Innovation Centre, Brighton, United Kingdom
- Ellen Seiss
- Department of Psychology, Faculty of Science and Technology and Interdisciplinary Neuroscience Research Centre, Bournemouth University, Bournemouth, United Kingdom
10
Jeong D, Jeong M, Yang U, Han K. Eyes on me: Investigating the role and influence of eye-tracking data on user modeling in virtual reality. PLoS One 2022; 17:e0278970. PMID: 36580442; PMCID: PMC9799296; DOI: 10.1371/journal.pone.0278970.
Abstract
Research has shown that sensor data generated by a user during a VR experience is closely related to the user's behavior or state, meaning that the VR user can be quantitatively understood and modeled. Eye-tracking as a sensor signal has been studied in prior research, but its usefulness in a VR context has been less examined, and most extant studies have dealt with eye-tracking within a single environment. Our goal is to expand the understanding of the relationship between eye-tracking data and user modeling in VR. In this paper, we examined the role and influence of eye-tracking data in predicting a level of cybersickness and types of locomotion. We developed and applied the same structure of a deep learning model to the multi-sensory data collected from two different studies (cybersickness and locomotion) with a total of 50 participants. The experiment results highlight not only a high applicability of our model to sensor data in a VR context, but also a significant relevance of eye-tracking data as a potential supplement to improving the model's performance and the importance of eye-tracking data in learning processes overall. We conclude by discussing the relevance of these results to potential future studies on this topic.
Affiliation(s)
- Dayoung Jeong
- Department of Artificial Intelligence, Hanyang University, Seoul, Republic of Korea
- Mingon Jeong
- Department of Artificial Intelligence, Hanyang University, Seoul, Republic of Korea
- Ungyeon Yang
- Electronics and Telecommunications Research Institute, Daejeon, Republic of Korea
- Kyungsik Han
- Department of Artificial Intelligence, Hanyang University, Seoul, Republic of Korea
11
Ramasubramanian B, Reddy VS, Chellappan V, Ramakrishna S. Emerging Materials, Wearables, and Diagnostic Advancements in Therapeutic Treatment of Brain Diseases. Biosensors 2022; 12:1176. PMID: 36551143; PMCID: PMC9775999; DOI: 10.3390/bios12121176.
Abstract
Among the most critical health issues, brain illnesses, such as neurodegenerative conditions and tumors, lower quality of life and have a significant economic impact. Implantable technology and nano-drug carriers have enormous promise for cerebral brain activity sensing and regulated therapeutic application in the treatment and detection of brain illnesses. Flexible materials are chosen for implantable devices because they help reduce biomechanical mismatch between the implanted device and brain tissue. Additionally, implanted biodegradable devices might lessen any autoimmune negative effects. The onerous subsequent operation for removing the implanted device is further lessened with biodegradability. This review expands on current developments in diagnostic technologies such as magnetic resonance imaging, computed tomography, mass spectroscopy, infrared spectroscopy, angiography, and electroencephalogram while providing an overview of prevalent brain diseases. As far as we are aware, there has not been a single review article that addresses all the prevalent brain illnesses. The review also looks into the prospects for the future and offers suggestions for the direction of future developments in the treatment of brain diseases.
Affiliation(s)
- Brindha Ramasubramanian
- Department of Mechanical Engineering, Center for Nanofibers & Nanotechnology, National University of Singapore, Singapore 117574, Singapore
- Institute of Materials Research and Engineering (IMRE), Agency for Science, Technology and Research (A*STAR), #08-03, 2 Fusionopolis Way, Innovis, Singapore 138634, Singapore
- Vundrala Sumedha Reddy
- Department of Mechanical Engineering, Center for Nanofibers & Nanotechnology, National University of Singapore, Singapore 117574, Singapore
- Vijila Chellappan
- Institute of Materials Research and Engineering (IMRE), Agency for Science, Technology and Research (A*STAR), #08-03, 2 Fusionopolis Way, Innovis, Singapore 138634, Singapore
- Seeram Ramakrishna
- Department of Mechanical Engineering, Center for Nanofibers & Nanotechnology, National University of Singapore, Singapore 117574, Singapore
12
Takada N, Laohakangvalvit T, Sugaya M. Human Error Prediction Using Heart Rate Variability and Electroencephalography. Sensors (Basel) 2022; 22:9194. PMID: 36501895; PMCID: PMC9738990; DOI: 10.3390/s22239194.
Abstract
As humans' simple tasks are increasingly replaced by autonomous systems and robots, the responsibility for handling more complex tasks will likely fall more often on human workers. Thus, situations in which workplace tasks change before human workers become proficient at them will arise more frequently due to rapid changes in business trends. Against this background, preventing human error will become increasingly crucial. Existing studies on human error reveal how task errors are related to heart rate variability (HRV) indexes and electroencephalography (EEG) indexes. However, in terms of preventing human error, analysis of their relationship with conditions before a human error occurs (i.e., the human pre-error state) is still insufficient. This study aims at identifying biological indexes potentially useful for the detection of high-risk psychological states. Correlation analysis between the number of errors in a Stroop task and multiple HRV and EEG indexes obtained before and during the task yielded significant correlations for several biological indexes. Specifically, we confirmed that conditions before the task are important for predicting human error risk in high-cognitive-load tasks, while conditions both before and during tasks are important in low-cognitive-load tasks.
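Among the HRV indexes this kind of study typically correlates with task errors, RMSSD is a representative time-domain measure. A minimal sketch (the RR intervals in any example are invented, not data from the paper):

```python
import numpy as np

def rmssd(rr_intervals_ms):
    """RMSSD: root mean square of successive RR-interval differences (ms),
    a common time-domain HRV index associated with parasympathetic activity.

    rr_intervals_ms: sequence of beat-to-beat (RR) intervals in milliseconds.
    """
    diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))
```

An index like this, computed over a pre-task baseline window, is the kind of "condition before the task" the abstract refers to.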
Affiliation(s)
- Midori Sugaya
- Shibaura Institute of Technology, Tokyo 135-8548, Japan
13
Radhakrishnan U, Chinello F, Koumaditis K. Investigating the effectiveness of immersive VR skill training and its link to physiological arousal. VIRTUAL REALITY 2022; 27:1091-1115. [PMID: 36405878 PMCID: PMC9663202 DOI: 10.1007/s10055-022-00699-3] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 06/08/2022] [Accepted: 09/13/2022] [Indexed: 06/05/2023]
Abstract
This paper details the motivations, design, and analysis of a study using a fine motor skill training task in both VR and physical conditions. The objectives of this between-subjects study were to (a) investigate the effectiveness of immersive virtual reality for training participants in the 'buzz-wire' fine motor skill task compared to physical training and (b) investigate the link between participants' arousal and their improvements in task performance. Physiological arousal levels in the form of electrodermal activity (EDA) and electrocardiogram (ECG) data were collected from 87 participants, randomly distributed across the two conditions. Results indicated that VR training is as good as, or even slightly better than, physical training at improving task performance. Moreover, participants in the VR condition reported an increase in self-efficacy and immersion, while marginally significant differences were observed in presence and temporal demand (retrieved from NASA-TLX measurements). Participants in the VR condition showed on average less arousal than those in the physical condition. Although correlation analyses between performance metrics and arousal levels did not yield any statistically significant results, a closer examination of EDA values revealed that participants with lower arousal levels during training, across conditions, demonstrated better improvements in performance than those with higher arousal. These findings demonstrate the effectiveness of VR in training and the potential of using arousal and training performance data for designing adaptive VR training systems. This paper also discusses implications for researchers who consider using biosensors and VR for motor skill experiments. Supplementary Information The online version contains supplementary material available at 10.1007/s10055-022-00699-3.
Affiliation(s)
- Unnikrishnan Radhakrishnan
- Department of Business Development and Technology, Aarhus University, Birk Centerpark 15, 7400 Herning, Denmark
- Francesco Chinello
- Department of Business Development and Technology, Aarhus University, Birk Centerpark 15, 7400 Herning, Denmark
- Konstantinos Koumaditis
- Department of Business Development and Technology, Aarhus University, Birk Centerpark 15, 7400 Herning, Denmark
14
Li M, Pan J, Gao Y, Shen Y, Luo F, Dai J, Hao A, Qin H. Neurophysiological and Subjective Analysis of VR Emotion Induction Paradigm. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2022; 28:3832-3842. [PMID: 36049001 DOI: 10.1109/tvcg.2022.3203099] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/15/2023]
Abstract
The ecological validity of emotion-inducing scenarios is essential for emotion research. In contrast to the classical passive induction paradigm, immersive VR fully engages the psychological and physiological components of the subject and is considered an ecologically valid paradigm for studying emotion. Several studies investigate the emotional responses to different VR tasks or games using subjective scales. However, little research treats VR itself as the eliciting material, especially when systematically analyzing emotional processes in VR from a neurophysiological perspective. To fill this gap and scientifically evaluate VR's ability to serve as an active method for emotion elicitation, we investigate the dynamic relationship between explicit information (subjective evaluations) and implicit information (objective neurophysiological data). A total of 28 participants are enlisted to watch eight VR videos while their SAM/IPQ scores and EEG data are recorded simultaneously. In ecologically valid scenarios, the subjective results demonstrate that VR has significant advantages for evoking emotion in the arousal-valence space. This conclusion is backed by our examination of objective neurophysiological evidence that VR videos effectively induce high-arousal emotions. In addition, we obtain features of critical channels and frequency oscillations associated with emotional valence, thereby validating previous research in more lifelike circumstances. In particular, we discover hemispheric asymmetry in the occipital region under high and low emotional arousal, which adds to our understanding of neural features and the dynamics of emotional arousal. As a result, we successfully integrate EEG and VR to demonstrate that VR is more pragmatic for evoking natural feelings and is beneficial for emotional research. Our research sets a precedent for new methodologies that use VR induction paradigms to acquire a more reliable explanation of affective computing.
15
Anders C, Arnrich B. Wearable electroencephalography and multi-modal mental state classification: A systematic literature review. Comput Biol Med 2022; 150:106088. [PMID: 36137314 DOI: 10.1016/j.compbiomed.2022.106088] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/10/2022] [Revised: 08/10/2022] [Accepted: 09/03/2022] [Indexed: 11/19/2022]
Abstract
BACKGROUND: Wearable multi-modal time-series classification applications outperform their best uni-modal counterparts and hold great promise. Electroencephalography is a modality that directly measures electrical correlates of brain activity. Due to varying noise sources, different key brain regions, key frequency bands, and signal characteristics such as non-stationarity, data pre-processing techniques and classification algorithms are task-dependent.

METHOD: Here, a systematic literature review on mental state classification for wearable electroencephalography is presented. Four search terms in different combinations were used for an in-title search, executed on 29 June 2022 across Google Scholar, PubMed, IEEEXplore, and ScienceDirect. The 76 most relevant publications were set into context as the current state of the art in mental state time-series classification.

RESULTS: Pre-processing techniques, features, and time-series classification models were analyzed. Across publications, a window length of one second was mainly chosen for classification, and spectral features were utilized the most. The achieved performance per time-series classification model is analyzed, finding that linear discriminant analysis, decision tree, and k-nearest neighbors models outperform support vector machines by a factor of up to 1.5. A historical analysis depicts future trends, while under-reported aspects relevant to practical applications are discussed.

CONCLUSIONS: Five main conclusions are given, covering utilization of the available area for electrode placement on the head, the most often or scarcely utilized features and time-series classification model architectures, baseline reporting practices, and the explainability and interpretability of deep learning. The importance of a 'test battery' assessing the influence of data pre-processing and multi-modality on time-series classification performance is emphasized.
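The review's two most common design choices, one-second windows and spectral features, can be sketched as follows. The sampling rate, band edges, and synthetic signal are illustrative assumptions, not values from the review.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # assumed EEG sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(window, fs=FS):
    """Spectral band-power features for one EEG window (Welch PSD)."""
    freqs, psd = welch(window, fs=fs, nperseg=len(window))
    df = freqs[1] - freqs[0]
    return {name: float(psd[(freqs >= lo) & (freqs < hi)].sum() * df)
            for name, (lo, hi) in BANDS.items()}

# Slice a signal into non-overlapping one-second windows and featurize each,
# mirroring the one-second window length reported across publications.
signal = np.random.default_rng(0).standard_normal(FS * 10)  # 10 s of fake EEG
windows = signal.reshape(-1, FS)                            # one-second windows
features = [band_powers(w) for w in windows]
print(len(features), sorted(features[0]))
```

The resulting feature dictionaries would then feed any of the reviewed classifiers (LDA, decision trees, kNN, SVM).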
Affiliation(s)
- Christoph Anders
- Hasso Plattner Institute, University of Potsdam, Potsdam, 14482, Brandenburg, Germany.
- Bert Arnrich
- Hasso Plattner Institute, University of Potsdam, Potsdam, 14482, Brandenburg, Germany.
16
Luong T, Holz C. Characterizing Physiological Responses to Fear, Frustration, and Insight in Virtual Reality. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2022; 28:3917-3927. [PMID: 36048988 DOI: 10.1109/tvcg.2022.3203113] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/15/2023]
Abstract
Physiological sensing often complements studies of human behavior in virtual reality (VR) to detect users' affective and cognitive states. Some psychological states, such as fear and frustration, can be particularly hard to differentiate from a physiological perspective as they are close in the arousal-valence emotional space. Moreover, it is largely unclear how users' physiological reactions are expressed in response to transient psychological states such as fear, frustration, and insight, especially since these are rich indicators for characterizing users' responses to dynamic systems but are hard to capture in highly interactive settings. We conducted a study (N = 24) to analyze participants' pulmonary, electrodermal, cardiac, and pupillary responses to moments of fear, frustration, and insight in immersive settings. Participants interacted in five VR environments, throughout which we measured their physiological reactions and analyzed the patterns we observed. We also measured subjective fear and frustration using questionnaires. We found differences between the pupillary, respiratory, and electrodermal responses to fear and frustration, as well as between the pupillary changes that followed fear in a horror game and those that followed fear in a vertigo experiment. We present the relationships between fear levels, frustration levels, and their physiological responses. To detect these affective events and states, we introduce user-independent binary classification models that achieved an average micro F1 score of 71% for detecting fear in a horror game, 75% for fear of vertigo, 76% for frustration, and 75% for insight, showing the promise of detecting these states from passive and objective signals.
17
Use of Differential Entropy for Automated Emotion Recognition in a Virtual Reality Environment with EEG Signals. Diagnostics (Basel) 2022; 12:diagnostics12102508. [PMID: 36292197 PMCID: PMC9601226 DOI: 10.3390/diagnostics12102508] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/03/2022] [Revised: 10/13/2022] [Accepted: 10/14/2022] [Indexed: 11/20/2022] Open
Abstract
Emotion recognition is one of the most important issues in human–computer interaction (HCI), neuroscience, and psychology fields. It is generally accepted that emotion recognition with neural data such as electroencephalography (EEG) signals, functional magnetic resonance imaging (fMRI), and near-infrared spectroscopy (NIRS) is better than other emotion detection methods such as speech, mimics, body language, facial expressions, etc., in terms of reliability and accuracy. In particular, EEG signals are bioelectrical signals that are frequently used because of the many advantages they offer in the field of emotion recognition. This study proposes an improved approach for EEG-based emotion recognition on a publicly available newly published dataset, VREED. Differential entropy (DE) features were extracted from four wavebands (theta 4–8 Hz, alpha 8–13 Hz, beta 13–30 Hz, and gamma 30–49 Hz) to classify two emotional states (positive/negative). Five classifiers, namely Support Vector Machine (SVM), k-Nearest Neighbor (kNN), Naïve Bayesian (NB), Decision Tree (DT), and Logistic Regression (LR) were employed with DE features for the automated classification of two emotional states. In this work, we obtained the best average accuracy of 76.22% ± 2.06 with the SVM classifier in the classification of two states. Moreover, we observed from the results that the highest average accuracy score was produced with the gamma band, as previously reported in studies in EEG-based emotion recognition.
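A minimal sketch of the pipeline this abstract describes: differential entropy per waveband followed by SVM classification. Under the Gaussian assumption common in EEG work, DE reduces to 0.5·log(2πeσ²). The sampling rate, filter order, and synthetic epochs below are assumptions; a real pipeline would load VREED recordings instead.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC

FS = 256  # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 49)}

def differential_entropy(x):
    """DE of a band-limited signal under a Gaussian assumption:
    0.5 * log(2 * pi * e * variance)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def de_features(epoch, fs=FS):
    """One DE value per waveband for a single EEG epoch."""
    feats = []
    for lo, hi in BANDS.values():
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        feats.append(differential_entropy(filtfilt(b, a, epoch)))
    return feats

# Hypothetical epochs with random positive/negative labels, standing in
# for labeled VREED trials.
rng = np.random.default_rng(1)
X = np.array([de_features(rng.standard_normal(FS * 2)) for _ in range(20)])
y = rng.integers(0, 2, size=20)
clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict(X[:3]))
```

Swapping `SVC` for `KNeighborsClassifier`, `GaussianNB`, `DecisionTreeClassifier`, or `LogisticRegression` reproduces the five-classifier comparison the abstract reports.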
18
Nalwaya A, Das K, Pachori RB. Automated Emotion Identification Using Fourier-Bessel Domain-Based Entropies. ENTROPY (BASEL, SWITZERLAND) 2022; 24:1322. [PMID: 37420342 DOI: 10.3390/e24101322] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/07/2022] [Revised: 09/09/2022] [Accepted: 09/16/2022] [Indexed: 07/09/2023]
Abstract
Human dependence on computers is increasing day by day; thus, human interaction with computers must be more dynamic and contextual rather than static or generalized. The development of such devices requires knowledge of the emotional state of the user interacting with them; for this purpose, an emotion recognition system is required. Physiological signals, specifically the electrocardiogram (ECG) and electroencephalogram (EEG), were studied here for the purpose of emotion recognition. This paper proposes novel entropy-based features in the Fourier-Bessel domain instead of the Fourier domain, where frequency resolution is twice that of the latter. Further, to represent such non-stationary signals, the Fourier-Bessel series expansion (FBSE) is used, which has non-stationary basis functions, making it more suitable than the Fourier representation. EEG and ECG signals are decomposed into narrow-band modes using the FBSE-based empirical wavelet transform (FBSE-EWT). The proposed entropies of each mode are computed to form the feature vector, which is then used to develop machine learning models. The proposed emotion detection algorithm is evaluated using the publicly available DREAMER dataset. The k-nearest neighbors (KNN) classifier provides accuracies of 97.84%, 97.91%, and 97.86% for the arousal, valence, and dominance classes, respectively. Finally, this paper concludes that the obtained entropy features are suitable for emotion recognition from the given physiological signals.
Affiliation(s)
- Aditya Nalwaya
- Department of Electrical Engineering, Indian Institute of Technology Indore, Indore 453552, India
- Kritiprasanna Das
- Department of Electrical Engineering, Indian Institute of Technology Indore, Indore 453552, India
- Ram Bilas Pachori
- Department of Electrical Engineering, Indian Institute of Technology Indore, Indore 453552, India
19
Presti P, Ruzzon D, Avanzini P, Caruana F, Rizzolatti G, Vecchiato G. Measuring arousal and valence generated by the dynamic experience of architectural forms in virtual environments. Sci Rep 2022; 12:13376. [PMID: 35927322 PMCID: PMC9352685 DOI: 10.1038/s41598-022-17689-9] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/16/2021] [Accepted: 07/29/2022] [Indexed: 11/13/2022] Open
Abstract
The built environment represents the stage surrounding our everyday life activities. To investigate how architectural design impacts individuals' affective states, we measured subjective judgments of perceived valence (pleasant and unpleasant) and arousal after the dynamic experience of a progressive change of macro visuospatial dimensions of virtual spaces. To this aim, we developed a parametric model that allowed us to create 54 virtual architectural designs characterized by a progressive change of sidewalls distance, ceiling and windows height, and color of the environment. Decreasing sidewalls distance, ceiling height variation, and increasing windows height significantly affected the participants' emotional state within virtual environments. Indeed, such architectural designs generated high arousing and unpleasant states according to subjective judgment. Overall, we observed that valence and arousal scores are affected by all the dynamic form factors which modulated the spaciousness of the surrounding. Showing that the dynamic experience of virtual environments enables the possibility of measuring the emotional impact of macro spatial architectural features, the present findings may lay the groundwork for future experiments investigating the effects that the architectural design has on individuals' mental state as a fundamental factor for the creation of future spaces.
Affiliation(s)
- Paolo Presti
- Institute of Neuroscience, National Research Council of Italy, 43125, Parma, Italy; Department of Medicine and Surgery, University of Parma, 43125, Parma, Italy
- Davide Ruzzon
- TUNED, Lombardini22, 20143, Milan, Italy; Dipartimento Culture del Progetto, IUAV, 30125, Venice, Italy
- Pietro Avanzini
- Institute of Neuroscience, National Research Council of Italy, 43125, Parma, Italy
- Fausto Caruana
- Institute of Neuroscience, National Research Council of Italy, 43125, Parma, Italy
- Giacomo Rizzolatti
- Institute of Neuroscience, National Research Council of Italy, 43125, Parma, Italy
- Giovanni Vecchiato
- Institute of Neuroscience, National Research Council of Italy, 43125, Parma, Italy.
20
Tao K, Huang Y, Shen Y, Sun L. Automated Stress Recognition Using Supervised Learning Classifiers by Interactive Virtual Reality Scenes. IEEE Trans Neural Syst Rehabil Eng 2022; 30:2060-2066. [PMID: 35857724 DOI: 10.1109/tnsre.2022.3192571] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
Virtual reality (VR) technology offers a great opportunity to explore stress disorder therapies. We created a VR stress training system, which incorporates three highly interactive stressful scenes to elicit stress, and demonstrate the concurrent variations between physiological data (heart rate, electrodermal activity, and eye-blink rate) and self-reported stress ratings through a self-designed customized perceived stress questionnaire (SSAI) and wearable devices. Several supervised learning models were rigorously applied to automate stress recognition. Our findings include evaluations of the VR system by computing Cronbach's alpha (α = 0.72) and the Kaiser-Meyer-Olkin (KMO) coefficient (η = 0.78) through a retrospective survey, which was subsequently confirmed as reliable on four aspects (sense of presence, sense of space, sense of immersion, and sense of reality) via factor analysis. Additionally, we demonstrate the effectiveness of physiology-based stress level classification (no stress, low stress, and high stress) and continuous SSAI score prediction, with accuracy reaching 0.742 using a bagging ensemble learning model and goodness-of-fit reaching 0.44 via multivariate stepwise regression. This study provides detailed insight into the effect of objective physiological measures on the validation of subjective self-ratings under a novel complex VR stress training system, which stimulates further investigation of stress disorder recognition and treatment.
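Cronbach's alpha, the reliability statistic reported above, can be computed directly from a respondents-by-items score matrix as α = k/(k-1) · (1 - Σ item variances / total-score variance). The sample scores below are hypothetical, not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical questionnaire: 6 respondents x 4 items (e.g., presence,
# space, immersion, and reality sub-scores on a Likert scale).
scores = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
    [4, 4, 5, 4],
])
print(round(cronbach_alpha(scores), 2))  # → 0.96
```

Values above roughly 0.7, such as the study's α = 0.72, are conventionally taken as acceptable internal consistency.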
21
Conrad CD, Aziz JR, Henneberry JM, Newman AJ. Do emotions influence safe browsing? Toward an electroencephalography marker of affective responses to cybersecurity notifications. Front Neurosci 2022; 16:922960. [PMID: 35911995 PMCID: PMC9330617 DOI: 10.3389/fnins.2022.922960] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/18/2022] [Accepted: 06/29/2022] [Indexed: 11/29/2022] Open
Abstract
Cybersecurity notifications play an important role in encouraging users to use computers safely. Emotional reactions to such notifications are known to positively influence users’ adherence to these notifications, though it is challenging for researchers to identify and quantify users’ emotional reactions. In this study, we explored electroencephalography (EEG) signals that were elicited by the presentation of various emotionally charged image stimuli provided by the International Affective Picture System (IAPS) and compared signals to those elicited by images of cybersecurity notifications and other computer-related stimuli. Participants provided behavioral assessments of valence and arousal elicited by the images which were used to cross-reference the results. We found that EEG amplitudes corresponding to the late positive potential (LPP) were elevated in reaction to images of cybersecurity notifications as well as IAPS images known to elicit strong positive and negative valence, when compared to neutral valence or other computer-related stimuli. These findings suggest that the LPP may account for emotional deliberation about cybersecurity notifications, which could be a useful measure when conducting future studies into the role such emotional reactions play in encouraging safe computer behavior.
Affiliation(s)
- Colin D. Conrad
- School of Information Management, Dalhousie University, Halifax, NS, Canada
- *Correspondence: Colin D. Conrad,
- Jasmine R. Aziz
- Department of Psychology and Neuroscience, Dalhousie University, Halifax, NS, Canada
- Aaron J. Newman
- Department of Psychology and Neuroscience, Dalhousie University, Halifax, NS, Canada
22
Baldini A, Frumento S, Menicucci D, Gemignani A, Scilingo EP, Greco A. Modeling subjective fear using skin conductance: a preliminary study in virtual reality. ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. ANNUAL INTERNATIONAL CONFERENCE 2022; 2022:3451-3454. [PMID: 36086358 DOI: 10.1109/embc48229.2022.9871557] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/15/2023]
Abstract
Reliably measuring fear perception could help evaluate the effectiveness of treatments for pathological conditions such as specific phobias or post-traumatic stress syndrome (e.g., exposure therapy). In this study, we developed a novel virtual reality (VR) scenario to induce fear and evaluate the related physiological response through analysis of the skin conductance (SC) signal. Eighteen subjects voluntarily experienced the fear VR scenario while their SC was recorded. After the experiment, each participant was asked to score the perceived subjective fear using a Likert scale from 1 to 10. We used the cvxEDA algorithm to process the collected SC signals and extract several features able to estimate the autonomic response to the fearful stimuli. Finally, the extracted features were linearly combined to model the subjective fear perception scores by means of LASSO linear regression. The sparsification imposed by the LASSO procedure to mitigate the overfitting risk identified an optimal linear model including only the standard deviation of the tonic SC component as a regressor (p = 0.007; R2 = 0.3337). The significant contribution of this feature to the model suggests that subjects experiencing more intense subjective fear have a more variable and unstable sympathetic tone.
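A minimal sketch of the LASSO step this abstract describes: regressing subjective fear scores on candidate skin-conductance features so that the sparsity penalty shrinks weak regressors toward zero. The feature names and data are hypothetical, constructed so that only the tonic SC standard deviation is predictive; a real pipeline would extract the features with cvxEDA from recorded SC signals.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n = 18  # same order as the study's sample size

# Hypothetical candidate EDA features (columns): tonic mean, tonic std,
# phasic peak rate, phasic peak amplitude.
tonic_std = rng.uniform(0.1, 1.0, n)
X = np.column_stack([
    rng.normal(size=n),  # tonic mean (irrelevant by construction)
    tonic_std,           # tonic std (predictive by construction)
    rng.normal(size=n),  # phasic peak rate (irrelevant)
    rng.normal(size=n),  # phasic peak amplitude (irrelevant)
])
fear = 3 + 5 * tonic_std + rng.normal(scale=0.3, size=n)  # 1-10 Likert-like

model = Lasso(alpha=0.1).fit(StandardScaler().fit_transform(X), fear)
print(np.round(model.coef_, 2))  # irrelevant features shrink toward zero
```

With standardized features, the surviving non-zero coefficient identifies the dominant regressor, mirroring how the study's sparsified model retained only the tonic SC standard deviation.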
23
Data Collection Framework for Context-Aware Virtual Reality Application Development in Unity: Case of Avatar Embodiment. SENSORS 2022; 22:s22124623. [PMID: 35746405 PMCID: PMC9228658 DOI: 10.3390/s22124623] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 05/09/2022] [Revised: 06/09/2022] [Accepted: 06/13/2022] [Indexed: 02/04/2023]
Abstract
Virtual Reality (VR) has been adopted as a leading technology for the metaverse, yet most previous VR systems provide one-size-fits-all experiences to users. Context-awareness in VR enables personalized experiences in the metaverse, such as improved embodiment and deeper integration of the real world and virtual worlds. Personalization requires context data from diverse sources. We proposed a reusable and extensible context data collection framework, ManySense VR, which unifies data collection from diverse sources for VR applications. ManySense VR was implemented in Unity based on extensible context data managers collecting data from data sources such as an eye tracker, electroencephalogram, pulse, respiration, galvanic skin response, facial tracker, and Open Weather Map. We used ManySense VR to build a context-aware embodiment VR scene where the user's avatar is synchronized with their bodily actions. The performance evaluation of ManySense VR showed good performance in processor usage, frame rate, and memory footprint. Additionally, we conducted a qualitative formative evaluation by interviewing five developers (two males and three females; mean age: 22) after they used and extended ManySense VR. The participants expressed advantages (e.g., ease-of-use, learnability, familiarity, quickness, and extensibility), disadvantages (e.g., inconvenient/error-prone data query method and lack of diversity in callback methods), future application ideas, and improvement suggestions that indicate potential and can guide future development. In conclusion, ManySense VR is an efficient tool for researchers and developers to easily integrate context data into their Unity-based VR applications for the metaverse.
24
Moinnereau MA, de Oliveira AA, Falk TH. Immersive media experience: a survey of existing methods and tools for human influential factors assessment. QUALITY AND USER EXPERIENCE 2022; 7:5. [PMID: 35729990 PMCID: PMC9198412 DOI: 10.1007/s41233-022-00052-1] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 11/23/2021] [Indexed: 06/15/2023]
Abstract
Virtual reality (VR) applications, especially those where the user is untethered from a computer, are becoming more prevalent as new hardware is developed, computational power and artificial intelligence algorithms become available, and wireless communication networks become faster and more reliable. In fact, recent projections show that by 2022 the number of VR users will double, suggesting the sector was not negatively affected by the worldwide COVID-19 pandemic. The success of any immersive communication system is heavily dependent on the user experience it delivers, so now more than ever it has become crucial to develop reliable models of immersive media experience (IMEx). In this paper, we survey the literature for existing methods and tools to assess human influential factors (HIFs) related to IMEx. In particular, subjective, behavioural, and psycho-physiological methods are covered. We describe tools available to monitor these HIFs, including the user's sense of presence and immersion, cybersickness, and mental/affective states, as well as their role in overall experience. Special focus is placed on psycho-physiological methods, as it was found that such in-depth evaluation was lacking from the existing literature. We conclude by touching on emerging applications involving multi-sensorial immersive media and provide suggestions for future research directions to fill existing gaps. It is hoped that this survey will be useful for researchers interested in building new immersive (adaptive) applications that maximize user experience.
Affiliation(s)
- Alcyr Alves de Oliveira
- Department of Psychology, Federal University of Health Sciences of Porto Alegre, Porto Alegre, Brazil
25
Dincelli E, Yayla A. Immersive virtual reality in the age of the Metaverse: A hybrid-narrative review based on the technology affordance perspective. JOURNAL OF STRATEGIC INFORMATION SYSTEMS 2022. [DOI: 10.1016/j.jsis.2022.101717] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
26
Khan A, Rasool S. Game induced emotion analysis using electroencephalography. Comput Biol Med 2022; 145:105441. [DOI: 10.1016/j.compbiomed.2022.105441] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/04/2022] [Revised: 03/15/2022] [Accepted: 03/20/2022] [Indexed: 01/10/2023]
27
Cardiac sympathetic-vagal activity initiates a functional brain-body response to emotional arousal. Proc Natl Acad Sci U S A 2022; 119:e2119599119. [PMID: 35588453 PMCID: PMC9173754 DOI: 10.1073/pnas.2119599119] [Citation(s) in RCA: 37] [Impact Index Per Article: 18.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/27/2023] Open
Abstract
We investigate the temporal dynamics of brain and cardiac activities in healthy subjects who underwent an emotional elicitation through videos. We demonstrate that, within the first few seconds, emotional stimuli modulate heartbeat activity, which in turn stimulates an emotion intensity (arousal)–specific cortical response. The emotional processing is then sustained by a bidirectional brain–heart interplay, where the perceived arousal level modulates the amplitude of ascending heart-to-brain neural information flow. These findings may constitute fundamental knowledge linking neurophysiology and psychiatric disorders, including the link between depressive symptoms and cardiovascular disorders.

A century-long debate on bodily states and emotions persists. While the involvement of bodily activity in emotion physiology is widely recognized, the specificity and causal role of such activity related to brain dynamics has not yet been demonstrated. We hypothesize that the peripheral neural control on cardiovascular activity prompts and sustains brain dynamics during an emotional experience, so these afferent inputs are processed by the brain by triggering a concurrent efferent information transfer to the body. To this end, we investigated the functional brain–heart interplay under emotion elicitation in publicly available data from 62 healthy subjects using a computational model based on synthetic data generation of electroencephalography and electrocardiography signals. Our findings show that sympathovagal activity plays a leading and causal role in initiating the emotional response, in which ascending modulations from vagal activity precede neural dynamics and correlate to the reported level of arousal. The subsequent dynamic interplay observed between the central and autonomic nervous systems sustains the processing of emotional arousal. These findings should be particularly revealing for the psychophysiology and neuroscience of emotions.
28
Continuous Emotion Recognition for Long-Term Behavior Modeling through Recurrent Neural Networks. TECHNOLOGIES 2022. [DOI: 10.3390/technologies10030059] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 02/06/2023]
Abstract
One’s internal state is mainly communicated through nonverbal cues, such as facial expressions, gestures and tone of voice, which in turn shape the corresponding emotional state. Hence, emotions can be effectively used, in the long term, to form an opinion of an individual’s overall personality. The latter can be capitalized on in many human–robot interaction (HRI) scenarios, such as in the case of an assisted-living robotic platform, where a human’s mood may entail the adaptation of a robot’s actions. To that end, we introduce a novel approach that gradually maps and learns the personality of a human, by conceiving and tracking the individual’s emotional variations throughout their interaction. The proposed system extracts the facial landmarks of the subject, which are used to train a suitably designed deep recurrent neural network architecture. The above architecture is responsible for estimating the two continuous coefficients of emotion, i.e., arousal and valence, following the broadly known Russell’s model. Finally, a user-friendly dashboard is created, presenting both the momentary and the long-term fluctuations of a subject’s emotional state. Therefore, we propose a handy tool for HRI scenarios, where robot’s activity adaptation is needed for enhanced interaction performance and safety.
|
29
|
Datasets for Automated Affect and Emotion Recognition from Cardiovascular Signals Using Artificial Intelligence: A Systematic Review. Sensors 2022; 22:2538. [PMID: 35408149 PMCID: PMC9002643 DOI: 10.3390/s22072538]
Abstract
Simple Summary: We reviewed the literature on publicly available datasets used to automatically recognise emotion and affect with artificial intelligence (AI) techniques, with particular interest in databases containing cardiovascular (CV) data, and we assessed the quality of the included papers. We searched the sources up to 31 August 2020. Each identification step was carried out independently by two reviewers to maintain the credibility of the review, and disagreements were resolved through discussion. Every action was first planned and described in a protocol posted on the Open Science Framework (OSF) platform. We selected 18 works providing datasets of CV signals for automated affect and emotion recognition. In total, data for 812 participants aged 17 to 47 were analysed. The most frequently recorded signal was electrocardiography, and the authors most often used video stimulation. Noticeably, much necessary information was missing from many of the works, resulting in mainly low quality ratings among the included papers. Researchers in this field should focus more on how they carry out and report experiments.
Abstract: Our review aimed to assess the current state and quality of publicly available datasets used for automated affect and emotion recognition (AAER) with artificial intelligence (AI), with an emphasis on cardiovascular (CV) signals. The quality of such datasets is essential for future work to build replicable systems. We investigated nine sources up to 31 August 2020, using a developed search strategy, and included studies considering the use of AI in AAER based on CV signals. Two independent reviewers performed the screening of identified records, full-text assessment, data extraction, and credibility assessment. All discrepancies were resolved by discussion. We descriptively synthesised the results and assessed their credibility. The protocol was registered on the Open Science Framework (OSF) platform. Eighteen records were selected from the 195 assessed (of 4649 initially identified), focusing on datasets containing CV signals for AAER. The included papers analysed and shared data of 812 participants aged 17 to 47. Electrocardiography was the most explored signal (83.33% of datasets), and video stimulation was used most frequently (52.38% of experiments). Despite these results, much information was not reported by researchers, and the quality of the analysed papers was mainly low. Researchers in the field should concentrate more on methodology.
|
30
|
Signal Quality Investigation of a New Wearable Frontal Lobe EEG Device. Sensors 2022; 22:1898. [PMID: 35271044 PMCID: PMC8914983 DOI: 10.3390/s22051898]
Abstract
The demand for non-laboratory, long-term EEG acquisition in scientific and clinical applications has created new requirements for wearable EEG devices. In this paper, a new wearable frontal EEG device called Mindeep is proposed, and a signal quality study is conducted comprising simulated signal tests and signal quality comparison experiments. Simulated signals with different frequencies and amplitudes were used to test the stability of Mindeep’s circuit, and high correlation coefficients (>0.9) showed that Mindeep has a stable and reliable hardware circuit. The signal quality comparison experiment between Mindeep and a gold-standard device, Neuroscan, included three tasks: (1) resting; (2) auditory oddball; and (3) attention. In the resting state, the average normalized cross-correlation coefficient between the EEG signals recorded by the two devices was around 0.72 ± 0.02, the Berger effect was observed (p < 0.01), and comparisons in the time and frequency domains illustrated the ability of Mindeep to record high-quality EEG signals. Significant differences between high and low tones in the auditory event-related potentials collected by Mindeep were observed in N2 and P2. The attention recognition accuracy of Mindeep reached 71.12% and 74.76% in the two attention tasks, based on EEG features and an XGBoost model, higher than that of Neuroscan (70.19% and 72.80%). The results validate the performance of Mindeep as a prefrontal EEG recording device with a wide range of potential applications in audiology, cognitive neuroscience, and everyday settings.
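The normalized cross-correlation used in device-comparison studies like this one can be computed directly. A minimal numpy sketch (the signals, sampling rate, and noise level below are synthetic illustrations, not the study's data):

```python
import numpy as np

def normalized_xcorr(x, y):
    """Zero-lag normalized cross-correlation of two equal-length signals,
    as used to compare EEG traces recorded by two devices. Returns a value
    in [-1, 1]; identical signals give 1.0."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    y = np.asarray(y, dtype=float) - np.mean(y)
    return float(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))

# A clean 10 Hz "alpha-like" trace vs. the same trace with additive noise,
# standing in for recordings from a reference and a wearable device.
fs = 250                       # sampling rate in Hz (illustrative)
t = np.arange(0, 4, 1 / fs)
clean = np.sin(2 * np.pi * 10 * t)
rng = np.random.default_rng(0)
noisy = clean + 0.3 * rng.standard_normal(t.size)

r_self = normalized_xcorr(clean, clean)   # identical signals
r_noisy = normalized_xcorr(clean, noisy)  # high, but below 1.0
```

A figure around 0.72, as reported in the abstract, would indicate substantial but imperfect agreement between the two devices at rest.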
|
31
|
Abstract
This paper presents a method to recognize emotional states through EEG analysis. The novelty of this work lies in its feature improvement strategy, based on multiclass genetic programming with multidimensional populations (M3GP), which builds features through an evolutionary technique that selects, combines, deletes, and constructs the features most suitable for easing the classification task of the learning method. In this way, the problem data can be mapped into a more favorable search space that best defines each class. After implementing M3GP, the results showed an increase of 14.76% in the recognition rate without changing any settings in the learning method. The tests were performed on a biometric EEG dataset (BED) designed to evoke emotions and record the cerebral cortex’s electrical response; this dataset was collected with a low-cost device, making the results more viable for practical application. The proposed methodology achieves a mean classification rate of 92.1% and simplifies the feature management process by increasing the separability of the spectral features.
|
32
|
Yu M, Xiao S, Hua M, Wang H, Chen X, Tian F, Li Y. EEG-based emotion recognition in an immersive virtual reality environment: From local activity to brain network features. Biomed Signal Process Control 2022. [DOI: 10.1016/j.bspc.2021.103349]
|
33
|
|
34
|
A Dataset for Emotion Recognition Using Virtual Reality and EEG (DER-VREEG): Emotional State Classification Using Low-Cost Wearable VR-EEG Headsets. Big Data and Cognitive Computing 2022. [DOI: 10.3390/bdcc6010016]
Abstract
Emotions are viewed as an important aspect of human interactions and conversations, and they enable effective and logical decision making. Emotion recognition uses low-cost wearable electroencephalography (EEG) headsets to collect brainwave signals and interprets these signals to provide information on a person’s mental state; implemented within virtual reality environments across different applications, it could narrow the gap between human and computer interaction and understanding, providing an immediate response to an individual’s mental state. This study aims to use a virtual reality (VR) headset to induce four classes of emotions (happy, scared, calm, and bored), to collect brainwave samples using a low-cost wearable EEG headset, and to run popular classifiers to compare the most feasible ones for this particular setup. Firstly, we attempt to build an immersive VR database that is accessible to the public and can potentially assist emotion recognition studies using virtual reality stimuli. Secondly, we use a low-cost wearable EEG headset that is compact and small, and can be attached to the scalp without hindrance, allowing participants freedom of movement to view their surroundings inside the immersive VR stimulus. Finally, we evaluate the emotion recognition system using popular machine learning algorithms, comparing them for both intra-subject and inter-subject classification. The results show that the prediction model for four-class emotion classification performed well, including on the more challenging inter-subject classification, with the support vector machine (SVM with class-weighted kernel) obtaining 85.01% classification accuracy. This shows that using fewer electrode channels, combined with proper parameter tuning and feature selection, can still deliver strong classification performance.
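Band-power features of the kind typically fed to such EEG emotion classifiers can be extracted with a simple FFT periodogram. A minimal numpy sketch (the band edges are conventional values and the synthetic signal is illustrative; this is not the study's pipeline):

```python
import numpy as np

def band_power(signal, fs, band):
    """Average power of `signal` within frequency `band` (lo, hi) in Hz,
    computed from a plain FFT periodogram."""
    freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    mask = (freqs >= band[0]) & (freqs < band[1])
    return float(psd[mask].mean())

# Conventional EEG bands (Hz); studies vary in the exact edges they use.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

fs = 256                              # sampling rate, Hz (illustrative)
t = np.arange(0, 2, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t)      # synthetic alpha-dominated trace

features = {name: band_power(eeg, fs, b) for name, b in BANDS.items()}
# A vector like list(features.values()), computed per channel and trial,
# would then be passed to a classifier such as a class-weighted SVM.
```

For the 10 Hz synthetic trace, the alpha-band power dominates the theta and beta bands, as expected.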
|
35
|
Raman R, Achuthan K, Nair VK, Nedungadi P. Virtual Laboratories: A historical review and bibliometric analysis of the past three decades. Education and Information Technologies 2022; 27:11055-11087. [PMID: 35502162 PMCID: PMC9046012 DOI: 10.1007/s10639-022-11058-9]
Abstract
Online and virtual teaching-learning became the panacea that most educational institutions adopted out of the dire need created by COVID-19. We provide a comprehensive bibliometric study of 9523 publications on virtual laboratories in higher education covering the years 1991 to 2021. Influential bibliometrics such as publications and citations, productive countries, contributing institutions, funders, journals, authors, and bibliographic couplings were studied using the Scientific Procedures and Rationales for Systematic Literature Reviews (SPAR-4-SLR) protocol. A new metric to complement citations, called Field Weighted Citation Impact, was introduced that accounts for differences in research behavior across disciplines. Findings show that 72% of the research was published between 2011 and 2021, most likely due to digitalization, with the highest number of publications in 2020-2021 highlighting the impact of the pandemic. The top contributing institutions were from the developed economies of Spain, Germany, and the United States. The citation impact of publications with international co-authors is the highest, underscoring the importance of co-authoring papers across countries. For the first time, Altmetrics were studied in the context of virtual labs; although a very low correlation was observed between citations and the Altmetric Attention Score, the overall percentage of publications with attention showed linear growth. Our work also highlights that virtual laboratories could play a significant role in achieving the United Nations Sustainable Development Goals, specifically SDG4 (Quality Education), which largely remains under-addressed.
Affiliation(s)
- Raghu Raman: Amrita School of Business, Amrita Vishwa Vidyapeetham, Amritapuri, India
- Krishnashree Achuthan: Center for Cybersecurity Systems and Networks, Amrita Vishwa Vidyapeetham, Amritapuri, India
- Vinith Kumar Nair: Amrita Center for Accreditations, Rankings & Eminence, Amrita Vishwa Vidyapeetham, Amritapuri, India
- Prema Nedungadi: Center for Research, Analytics and Technology in Education (CREATE) and School of Computing, Amrita Vishwa Vidyapeetham, Amritapuri, India
|
36
|
Wan C, Chen D, Yang J. Pulse rate estimation from forehead photoplethysmograph signal using RLS adaptive filtering with dynamical reference signal. Biomed Signal Process Control 2022. [DOI: 10.1016/j.bspc.2021.103189]
|
37
|
Wan C, Chen D, Huang Z, Luo X. A Wearable Head Mounted Display Bio-Signals Pad System for Emotion Recognition. Sensors 2021; 22:142. [PMID: 35009684 PMCID: PMC8749721 DOI: 10.3390/s22010142]
Abstract
Multimodal bio-signal acquisition based on wearable devices, using virtual reality (VR) as the stimulus source, is a promising technique in the emotion recognition research field. Numerous studies have shown that emotional states can be better evoked through Immersive Virtual Environments (IVE). The main goal of this paper is to provide researchers with a system for emotion recognition in VR environments. We present a wearable forehead bio-signal acquisition pad attached to a Head-Mounted Display (HMD), termed the HMD Bio Pad. This system can simultaneously record emotion-related two-channel electroencephalography (EEG), one-channel electrodermal activity (EDA), photoplethysmograph (PPG) and skin temperature (SKT) signals. In addition, we developed a human-computer interaction (HCI) interface with which researchers can carry out emotion recognition research using a VR HMD as the stimulus presentation device. To evaluate the proposed system, we conducted separate experiments to validate the quality of each bio-signal modality. The EEG signal was assessed with an eyes-blink task and an eyes-open/eyes-closed task: the eyes-blink task indicates that the proposed system achieves EEG signal quality comparable to a dedicated bio-signal measuring device, and the eyes-open/eyes-closed task shows that the system can efficiently record the alpha rhythm. We then used the signal-to-noise ratio (SNR) and the skin conductance response (SCR) to validate the EDA acquisition system: a filtered EDA signal, with a high mean SNR of 28.52 dB, is plotted on the HCI interface, and the stimulus-related SCR can be correctly extracted from the EDA signal. The SKT acquisition system was validated by the temperature change observed when subjects experienced unpleasant emotions. The pulse rate (PR) estimated from the PPG signal achieved a low mean average absolute error (AAE) of 1.12 beats per minute (BPM) over 8 recordings. In summary, the proposed HMD Bio Pad is a portable, comfortable and easy-to-wear device for recording bio-signals, and could contribute to emotion recognition research in VR environments.
Affiliation(s)
- Chunting Wan: School of Automation Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China; School of Electronic Engineering and Automation, Guilin University of Electronic Science and Technology, Guilin 541004, China
- Dongyi Chen: School of Automation Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
- Zhiqi Huang: School of Automation Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
- Xi Luo: School of Automation Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
|
38
|
Weber D, Hertweck S, Alwanni H, Fiederer LDJ, Wang X, Unruh F, Fischbach M, Latoschik ME, Ball T. A Structured Approach to Test the Signal Quality of Electroencephalography Measurements During Use of Head-Mounted Displays for Virtual Reality Applications. Front Neurosci 2021; 15:733673. [PMID: 34880720 PMCID: PMC8645583 DOI: 10.3389/fnins.2021.733673]
Abstract
Joint applications of virtual reality (VR) systems and electroencephalography (EEG) offer numerous new possibilities, ranging from behavioral science to therapy. VR systems allow for highly controlled experimental environments, while EEG offers a non-invasive window onto brain activity with millisecond-range temporal resolution. However, EEG measurements are highly susceptible to electromagnetic (EM) noise, and the influence of EM noise from head-mounted displays (HMDs) on EEG signal quality has not been conclusively investigated. In this paper, we propose a structured approach to test HMDs for EM noise potentially harmful to EEG measurements. The approach assesses the impact of HMDs on the frequency and time domains of EEG signals recorded in healthy subjects. The verification task compares conditions with and without an HMD during (i) an eyes-open vs. eyes-closed task and (ii) sensory-evoked brain activity. The approach is developed and tested to derive the potential effects of two commercial HMDs, the Oculus Rift and the HTC Vive Pro, on the quality of 64-channel EEG measurements. The results show that the HMDs consistently introduce artifacts, especially at the 50 Hz line hum and the 90 Hz HMD refresh rate and their harmonics. However, the frequency range typically most important in non-invasive EEG research and applications (<50 Hz) remained largely unaffected. Hence, our findings demonstrate that high-quality EEG recordings, at least in the frequency range up to 50 Hz, can be obtained with the two tested HMDs. As the number of commercially available HMDs is constantly rising, we strongly suggest thoroughly testing such devices upfront, since each HMD will most likely have its own EM footprint, and this article provides a structured approach for implementing such tests with arbitrary devices.
Affiliation(s)
- Desirée Weber: Neuromedical AI Lab, Department of Neurosurgery, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Stephan Hertweck: Neuromedical AI Lab, Department of Neurosurgery, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Hisham Alwanni: Neuromedical AI Lab, Department of Neurosurgery, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Lukas D. J. Fiederer: Neuromedical AI Lab, Department of Neurosurgery, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Xi Wang: Neuromedical AI Lab, Department of Neurosurgery, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Fabian Unruh: Human-Computer Interaction Group, University of Würzburg, Würzburg, Germany
- Martin Fischbach: Human-Computer Interaction Group, University of Würzburg, Würzburg, Germany
- Tonio Ball: Neuromedical AI Lab, Department of Neurosurgery, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
|
39
|
Uluer P, Kose H, Gumuslu E, Barkana DE. Experience with an Affective Robot Assistant for Children with Hearing Disabilities. Int J Soc Robot 2021; 15:643-660. [PMID: 34804256 PMCID: PMC8594648 DOI: 10.1007/s12369-021-00830-5]
Abstract
This study presents an assistive robotic system enhanced with emotion recognition capabilities for children with hearing disabilities. The system is designed and developed for audiometry tests and the rehabilitation of children in a clinical setting, and includes a social humanoid robot (Pepper), an interactive interface, gamified audiometry tests, a sensory setup, and a machine/deep learning based emotion recognition module. Three scenarios, a conventional setup, a tablet setup, and a robot+tablet setup, were evaluated with 16 children who have cochlear implants or hearing aids. Several machine learning techniques and deep learning models were used to classify the three test setups and the children’s emotions (pleasant, neutral, unpleasant) from physiological signals recorded by an E4 wristband. The results show that the signals collected during the tests can be separated successfully, and that the children’s positive and negative emotions are better distinguished when they interact with the robot than in the other two setups. In addition, the children’s objective and subjective evaluations, as well as their impressions of the robot and its emotional behaviors, are analyzed and discussed extensively.
Affiliation(s)
- Pinar Uluer: Department of Computer Engineering, Galatasaray University, Istanbul, Turkey; Department of AI and Data Engineering, Istanbul Technical University, Istanbul, Turkey
- Hatice Kose: Department of AI and Data Engineering, Istanbul Technical University, Istanbul, Turkey
- Elif Gumuslu: Department of Electrical and Electronics Engineering, Yeditepe University, Istanbul, Turkey
- Duygun Erol Barkana: Department of Electrical and Electronics Engineering, Yeditepe University, Istanbul, Turkey
|
40
|
Apicella A, Arpaia P, Mastrati G, Moccaldi N. EEG-based detection of emotional valence towards a reproducible measurement of emotions. Sci Rep 2021; 11:21615. [PMID: 34732756 PMCID: PMC8566577 DOI: 10.1038/s41598-021-00812-7]
Abstract
A methodological contribution to a reproducible measurement of emotions for an EEG-based system is proposed, with emotional valence detection as the use case. Valence detection occurs along the interval scale theorized by the Circumplex Model of emotions; the binary choice, positive vs negative valence, represents a first step towards the adoption of a metric scale with finer resolution. EEG signals were acquired through an 8-channel dry electrode cap. An implicit, more controlled EEG paradigm was employed to elicit emotional valence through the passive viewing of standardized visual stimuli (the OASIS dataset) in 25 volunteers without depressive disorders. Results from the Self-Assessment Manikin questionnaire confirmed the compatibility of the experimental sample with that of OASIS. Two strategies for feature extraction were compared: (i) one based on a priori knowledge (Hemispheric Asymmetry Theories), and (ii) an automated pipeline consisting of a custom 12-band Filter Bank and Common Spatial Pattern. An average within-subject accuracy of 96.1% was obtained with a shallow Artificial Neural Network, while k-Nearest Neighbors achieved a cross-subject accuracy of 80.2%.
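The Common Spatial Pattern step named in this entry finds spatial filters that maximize variance for one class while minimizing it for the other, via a whitening-plus-eigendecomposition of the two class covariance matrices. A minimal two-class numpy sketch on synthetic data (this illustrates the generic CSP technique, not the authors' 12-band filter-bank pipeline):

```python
import numpy as np

def csp_filters(X1, X2):
    """Common Spatial Pattern filters for two classes of EEG trials.
    X1, X2: arrays of shape (n_trials, n_channels, n_samples).
    Returns W (n_channels x n_channels); rows are ordered from filters
    maximizing class-1 variance to those maximizing class-2 variance."""
    def mean_cov(X):
        covs = [x @ x.T / np.trace(x @ x.T) for x in X]  # trace-normalized
        return np.mean(covs, axis=0)

    C1, C2 = mean_cov(X1), mean_cov(X2)
    # Whiten the composite covariance, then diagonalize class 1 in that space.
    s, U = np.linalg.eigh(C1 + C2)
    P = np.diag(1.0 / np.sqrt(s)) @ U.T
    d, V = np.linalg.eigh(P @ C1 @ P.T)
    order = np.argsort(d)[::-1]          # descending class-1 variance
    return V[:, order].T @ P

# Synthetic 4-channel trials: class 1 strong on channel 0, class 2 on channel 3.
rng = np.random.default_rng(0)
def trials(strong_ch, n=20, ch=4, T=200):
    X = 0.1 * rng.standard_normal((n, ch, T))
    X[:, strong_ch, :] += rng.standard_normal((n, T))
    return X

W = csp_filters(trials(0), trials(3))
# The first filter should load mainly on channel 0, the last on channel 3.
w_first, w_last = np.abs(W[0]), np.abs(W[-1])
```

Log-variances of trials projected through the first and last few filters would then serve as features for the classifiers mentioned in the abstract.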
Affiliation(s)
- Andrea Apicella: Laboratory of Augmented Reality for Health Monitoring (ARHeMLab), Department of Electrical Engineering and Information Technology, University of Naples Federico II, Naples, Italy
- Pasquale Arpaia: Laboratory of Augmented Reality for Health Monitoring (ARHeMLab), Department of Electrical Engineering and Information Technology, University of Naples Federico II, Naples, Italy; Interdepartmental Center for Research on Management and Innovation in Healthcare (CIRMIS), University of Naples Federico II, Naples, Italy
- Giovanna Mastrati: Laboratory of Augmented Reality for Health Monitoring (ARHeMLab), Department of Electrical Engineering and Information Technology, University of Naples Federico II, Naples, Italy
- Nicola Moccaldi: Laboratory of Augmented Reality for Health Monitoring (ARHeMLab), Department of Electrical Engineering and Information Technology, University of Naples Federico II, Naples, Italy
|
41
|
Chin ZY, Zhang Z, Wang C, Ang KK. An Affective Interaction System using Virtual Reality and Brain-Computer Interface. Annu Int Conf IEEE Eng Med Biol Soc 2021; 2021:6183-6186. [PMID: 34892528 DOI: 10.1109/embc46164.2021.9630045]
Abstract
Affective computing is a multidisciplinary area of research that allows computers to perform human emotion recognition, with potential applications in areas such as healthcare, gaming and intuitive human-computer interface design. This paper proposes an affective interaction system using a dry-electrode EEG-based Brain-Computer Interface and Virtual Reality (BCI-VR). The proposed BCI-VR system integrates existing low-cost consumer devices: an EEG headband with frontal and temporal dry electrodes for brain signal acquisition, and a low-cost VR headset that houses an Android smartphone. The smartphone runs in-house developed software that connects wirelessly to the headband, processes the acquired EEG signals, and displays VR content to elicit emotional responses. The proposed BCI-VR system was used to collect EEG data from 13 subjects while they watched VR content eliciting positive or negative emotional responses. EEG band-power features were extracted to train Linear Discriminant and Support Vector Machine classifiers, whose classification performance was then evaluated on this dataset and on a public dataset (SEED-IV). The results in classifying positive vs negative emotions in both datasets (~66% for the 2-class problem) show promise that positive and negative emotions can be detected by the proposed low-cost BCI-VR system, which yields nearly the same performance on the public dataset, collected with wet EEG electrodes. Hence, the results show the promise of the proposed BCI-VR system for future real-time affective interaction applications.
|
42
|
Gioia F, Pascali MA, Greco A, Colantonio S, Scilingo EP. Discriminating Stress From Cognitive Load Using Contactless Thermal Imaging Devices. Annu Int Conf IEEE Eng Med Biol Soc 2021; 2021:608-611. [PMID: 34891367 DOI: 10.1109/embc46164.2021.9630860]
Abstract
This study proposes long-wave infrared technology as a contactless alternative to wearable devices for stress detection. To this aim, we studied the change in facial thermal distribution of 17 healthy subjects in response to different stressors (Stroop Test, Mental Arithmetic Test). During the experimental sessions, the electrodermal activity (EDA) and the facial thermal response were recorded simultaneously from each subject. It is well known from the literature that EDA can be considered a reliable marker of psychological state variation, so we used it as a reference signal to validate the thermal results. Statistical analysis was performed to evaluate significant differences in the thermal features between stress and non-stress conditions, as well as between stress and cognitive load. Our results are in line with the outcomes of previous studies and show significant differences in temperature trends over time between stress and resting conditions. As a new result, we found that the mean temperature changes of some less-studied facial regions, e.g., the right cheek, not only significantly discriminate between resting and stressful conditions but also allow the type of stressor to be recognized. This outcome directs future studies to consider the thermal patterns of less-explored facial regions as possible correlates of mental states and, more importantly, suggests that different psychological states could potentially be discriminated in a contactless manner.
|
43
|
Impact of Outdoor Temperature Variations on Thermal State in Experiments Using Immersive Virtual Environment. Sustainability 2021. [DOI: 10.3390/su131910638]
Abstract
Recent studies have established immersive virtual environments (IVEs) as promising tools for studying human thermal states and human-building interactions. One advantage of using immersive virtual environments is that experiments and data collection can be conducted at any time of the year. However, previous studies have confirmed the potential impact of outdoor temperature variations, such as seasonal variations, on human thermal sensation, and to the best of our knowledge no study has examined the potential impact of such variations on experiments using IVEs. Thus, this study aimed to determine whether different outdoor temperature conditions affect the thermal states reported in IVE experiments. Experiments were conducted using a head-mounted display (HMD) in a climate chamber, and the data were analyzed under three temperature ranges. A total of seventy-two people participated in the experiments, conducted in two contrasting outdoor temperature conditions, i.e., cold and warm. In situ experiments conducted in two cases, cooling in warm outdoor conditions and heating in cold outdoor conditions, were used as a baseline. The baseline in situ experiments were then compared with IVE experiments conducted in four cases: cooling in warm and cold outdoor conditions, and heating in warm and cold outdoor conditions. Cooling in cold outdoor conditions and heating in warm outdoor conditions were included in the IVE experiments specifically to study the impact of outdoor temperature variations. Results showed that, under the studied experimental and outdoor temperature conditions, outdoor temperature variations in most cases did not affect the results of the IVE experiments; that is, the IVE experiments replicated for participants a thermal environment comparable to that of the in situ experiments. In addition, the participants' thermal sensation vote was found to be a reliable indicator between IVE and in situ settings in all studied conditions. The few significantly different cases were related to thermal comfort, thermal acceptability, and overall skin temperature.
|
44
|
Emotional Responses to the Visual Patterns of Urban Streets: Evidence from Physiological and Subjective Indicators. Int J Environ Res Public Health 2021; 18:9677. [PMID: 34574601 PMCID: PMC8467209 DOI: 10.3390/ijerph18189677]
Abstract
Despite recent progress in research on people's emotional responses to the environment, the emotional effects of the built (rather than natural) environment have not yet been thoroughly examined. In response to this knowledge gap, we recruited 26 participants and scrutinized their emotional responses to various urban street scenes through an immersive exposure experiment using virtual reality. We utilized new physiological monitoring technologies that enable synchronized observation of the participants' electroencephalography, electrodermal activity, and heart rate, together with subjective indicators. With a newly introduced measurement of the global visual patterns of the built environment, we built statistical models to examine people's emotional responses to the physical element configuration and color composition of street scenes. We found that more diverse and less fragmented scenes inspired positive emotional feelings. We also found (in)consistency among the physiological and subjective indicators, suggesting a potentially interesting neural-physiological interpretation of the classic form-function dichotomy in architecture. Beyond its practical implications for physical environment design, this study combined objective physiology-monitoring technology with questionnaire-based research techniques to demonstrate a better approach to quantifying environment-emotion relationships.
|
45
|
Tian F, Hua M, Zhang W, Li Y, Yang X. Emotional arousal in 2D versus 3D virtual reality environments. PLoS One 2021; 16:e0256211. [PMID: 34499667 PMCID: PMC8428725 DOI: 10.1371/journal.pone.0256211] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Received: 12/07/2020] [Accepted: 08/02/2021] [Indexed: 11/18/2022]
Abstract
Previous studies have suggested that virtual reality (VR) can elicit emotions in different visual modes using 2D or 3D headsets. However, the effects on emotional arousal by using these two visual modes have not been comprehensively investigated, and the underlying neural mechanisms are not yet clear. This paper presents a cognitive psychological experiment that was conducted to analyze how these two visual modes impact emotional arousal. Forty volunteers were recruited and were randomly assigned to two groups. They were asked to watch a series of positive, neutral and negative short VR videos in 2D and 3D. Multichannel electroencephalograms (EEG) and skin conductance responses (SCR) were recorded simultaneously during their participation. The results indicated that emotional stimulation was more intense in the 3D environment due to the improved perception of the environment; greater emotional arousal was generated; and higher beta (21-30 Hz) EEG power was identified in 3D than in 2D. We also found that both hemispheres were involved in stereo vision processing and that brain lateralization existed in the processing.
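The abstract reports higher beta-band (21-30 Hz) EEG power in 3D than in 2D. As a sketch of how band power can be estimated for one EEG channel, the code below uses a plain discrete Fourier transform in place of the Welch periodogram a real pipeline would use (e.g. `scipy.signal.welch`); the 25 Hz test tone and 128 Hz sampling rate are assumptions made for the example:

```python
# Sketch: sum the power-spectrum bins that fall inside a frequency band.
# Naive DFT, suitable only for short signals and relative comparisons.
import math

def band_power(signal, fs, lo, hi):
    n = len(signal)
    total = 0.0
    for k in range(n // 2 + 1):
        f = k * fs / n                      # frequency of bin k in Hz
        if lo <= f <= hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n)
                     for t in range(n))
            im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n)
                      for t in range(n))
            total += (re * re + im * im) / n  # periodogram bin power
    return total

fs = 128  # Hz, hypothetical sampling rate
sig = [math.sin(2 * math.pi * 25 * t / fs) for t in range(256)]  # 25 Hz tone

beta = band_power(sig, fs, 21, 30)   # the tone lies in the beta band
alpha = band_power(sig, fs, 8, 12)   # band without the tone
print(beta > alpha)  # True: the 25 Hz component dominates beta power
```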
Affiliation(s)
- Feng Tian
- Shanghai Film Academy, Shanghai University, Shanghai, China
- Minlei Hua
- Shanghai Film Academy, Shanghai University, Shanghai, China
- Wenrui Zhang
- Shanghai Film Academy, Shanghai University, Shanghai, China
- Yingjie Li
- Shanghai Institute for Advanced Communication and Data Science, Shanghai University, Shanghai, China
- School of Communication and Information Engineering, Shanghai University, Shanghai, China
- Xiaoli Yang
- Department of Electrical and Computer Engineering, Purdue University Northwest, Hammond, Indiana, United States of America
|
46
|
Rdest M, Janas D. Carbon Nanotube Wearable Sensors for Health Diagnostics. Sensors (Basel) 2021; 21:5847. [PMID: 34502734 PMCID: PMC8433779 DOI: 10.3390/s21175847] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Received: 07/30/2021] [Revised: 08/20/2021] [Accepted: 08/26/2021] [Indexed: 11/25/2022]
Abstract
This perspective article highlights a recent surge of interest in the application of textiles containing carbon nanotube (CNT) sensors for human health monitoring. Modern life puts more and more pressure on humans, which translates into an increased number of various health disorders. Unfortunately, this effect either decreases the quality of life or shortens it prematurely. A possible solution to this problem is to employ sensors that monitor various body functions and indicate the likelihood of an upcoming disease at an early stage. A broad spectrum of materials is currently under investigation for this purpose, some of which have already entered the market. Among the most promising materials in this field are CNTs: they are flexible and highly electrically conductive, and their conductivity can be modulated by several forms of stimulation. The article begins by illustrating techniques for building wearable sensors from CNTs. Then, their application potential for tracking various health parameters is presented. Finally, the article ends with a summary of the field's progress and a vision of the key directions for bringing this concept into everyday use.
Affiliation(s)
- Monika Rdest
- Department of Materials Science and Metallurgy, University of Cambridge, 27 Charles Babbage Rd., Cambridge CB3 0FS, UK
- Dawid Janas
- Department of Organic Chemistry, Bioorganic Chemistry and Biotechnology, Silesian University of Technology, B. Krzywoustego 4, 44-100 Gliwice, Poland
|
47
|
Marín-Morales J, Higuera-Trujillo JL, Guixeres J, Llinares C, Alcañiz M, Valenza G. Heart rate variability analysis for the assessment of immersive emotional arousal using virtual reality: Comparing real and virtual scenarios. PLoS One 2021; 16:e0254098. [PMID: 34197553 PMCID: PMC8248697 DOI: 10.1371/journal.pone.0254098] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Received: 10/19/2020] [Accepted: 06/19/2021] [Indexed: 11/18/2022]
Abstract
Many affective computing studies have developed automatic emotion recognition models, mostly using emotional images, audio and videos. In recent years, virtual reality (VR) has also been used as a method to elicit emotions in laboratory environments. However, there is still a need to analyse the validity of VR in order to extrapolate the results it produces and to assess the similarities and differences in physiological responses provoked by real and virtual environments. We investigated the cardiovascular oscillations of 60 participants during a free exploration of a real museum and its virtualisation viewed through a head-mounted display. The differences between the heart rate variability features in the high and low arousal stimuli conditions were analysed through statistical hypothesis testing, and automatic arousal recognition models were developed across the real and the virtual conditions using a support vector machine algorithm with recursive feature selection. The subjects' self-assessments suggested that both museums elicited low and high arousal levels. In addition, the real museum showed arousal-related differences in cardiovascular responses, particularly in vagal activity, and arousal recognition reached 72.92% accuracy. However, we did not find the same arousal-based autonomic nervous system change pattern during the virtual museum exploration. The results showed that, while the direct virtualisation of a real environment might be self-reported as evoking psychological arousal, it does not necessarily evoke the same cardiovascular changes as a real arousing elicitation. These findings contribute to the understanding of the use of VR in emotion recognition research; further research is needed to study arousal and emotion elicitation in immersive VR.
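The recognition pipeline described above pairs a support vector machine with recursive feature selection. The sketch below implements only the recursive-elimination loop, substituting a crude class-separation score for the SVM weight ranking so it stays dependency-free (with scikit-learn one would wrap `sklearn.svm.SVC` in `sklearn.feature_selection.RFE`); the HRV-like feature matrix is synthetic, not the study's data:

```python
# Sketch of recursive feature elimination (RFE): repeatedly drop the
# feature ranked weakest until the requested number remains.
import statistics

def separation(values, labels):
    """|difference of class means| / overall std: a crude feature score."""
    a = [v for v, y in zip(values, labels) if y == 0]
    b = [v for v, y in zip(values, labels) if y == 1]
    spread = statistics.pstdev(values) or 1.0
    return abs(statistics.mean(a) - statistics.mean(b)) / spread

def rfe(X, y, n_keep):
    """Return the indices of the n_keep most discriminative columns."""
    kept = list(range(len(X[0])))
    while len(kept) > n_keep:
        scores = {j: separation([row[j] for row in X], y) for j in kept}
        kept.remove(min(scores, key=scores.get))  # eliminate the weakest
    return sorted(kept)

# Hypothetical HRV-like features: column 0 separates the two arousal
# classes strongly, column 1 not at all, column 2 moderately.
X = [[0.1, 5.0, 0.4], [0.2, 5.1, 0.5], [0.9, 5.0, 0.7], [1.0, 5.1, 0.8]]
y = [0, 0, 1, 1]
print(rfe(X, y, 2))  # -> [0, 2]: the non-discriminative column is dropped
```

In the scikit-learn version, the per-feature ranking comes from the magnitude of the fitted linear-SVM coefficients rather than a hand-rolled score; the elimination loop is the same idea.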
Affiliation(s)
- Javier Marín-Morales
- Instituto de Investigación e Innovación en Bioingeniería, Universitat Politècnica de València, València, Spain
- Juan Luis Higuera-Trujillo
- Instituto de Investigación e Innovación en Bioingeniería, Universitat Politècnica de València, València, Spain
- Jaime Guixeres
- Instituto de Investigación e Innovación en Bioingeniería, Universitat Politècnica de València, València, Spain
- Carmen Llinares
- Instituto de Investigación e Innovación en Bioingeniería, Universitat Politècnica de València, València, Spain
- Mariano Alcañiz
- Instituto de Investigación e Innovación en Bioingeniería, Universitat Politècnica de València, València, Spain
- Gaetano Valenza
- Bioengineering and Robotics Research Centre E Piaggio & Department of Information Engineering, University of Pisa, Pisa, Italy
|
48
|
Site Experience Enhancement and Perspective in Cultural Heritage Fruition—A Survey on New Technologies and Methodologies Based on a “Four-Pillars” Approach. Future Internet 2021. [DOI: 10.3390/fi13040092] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Indexed: 01/10/2023]
Abstract
This paper deals with innovative fruition modalities for cultural heritage sites. Based on two ongoing experiments, four pillars are considered, namely User Localization, Multimodal Interaction, User Understanding and Gamification. A survey of the existing literature addressing one or more issues related to the four pillars is proposed, aiming to highlight how these contributions can be exploited for cultural heritage. From this perspective, it is discussed how a cultural site can be enriched, extended and transformed into an intelligent multimodal environment. This new augmented environment can focus on visitors, analyze their activity and behavior, and make their experience more satisfying, fulfilling and unique. After an in-depth overview of the existing technologies and methodologies for the fruition of sites of cultural interest, the two experiments are described in detail and the authors' vision of the future is proposed.
|
49
|
Pham M, Do HM, Su Z, Bishop A, Sheng W. Negative Emotion Management Using a Smart Shirt and a Robot Assistant. IEEE Robot Autom Lett 2021. [DOI: 10.1109/lra.2021.3067867] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Indexed: 01/10/2023]
|
50
|
Higuera-Trujillo JL, Llinares C, Macagno E. The Cognitive-Emotional Design and Study of Architectural Space: A Scoping Review of Neuroarchitecture and Its Precursor Approaches. Sensors (Basel) 2021; 21:2193. [PMID: 33801037 PMCID: PMC8004070 DOI: 10.3390/s21062193] [Citation(s) in RCA: 18] [Impact Index Per Article: 6.0] [Received: 12/22/2020] [Revised: 03/15/2021] [Accepted: 03/17/2021] [Indexed: 12/24/2022]
Abstract
Humans respond cognitively and emotionally to the built environment. The modern possibility of recording the neural activity of subjects during exposure to environmental situations, using neuroscientific techniques and virtual reality, provides a promising framework for future design and studies of the built environment. The resulting discipline is termed "neuroarchitecture". Given neuroarchitecture's transdisciplinary nature, its progress needs to be reviewed in a contextualised way, together with its precursor approaches. The present article presents a scoping review, which maps out the broad areas on which the new discipline is based. The limitations, controversies, benefits, impact on the professional sectors involved, and potential of neuroarchitecture and its precursor approaches are critically addressed.
Affiliation(s)
- Juan Luis Higuera-Trujillo
- Institute for Research and Innovation in Bioengineering (i3B), Universitat Politècnica de València, 46022 Valencia, Spain
- Escuela de Arquitectura, Arte y Diseño (EAAD), Tecnologico de Monterrey, Monterrey 72453, Mexico
- Carmen Llinares
- Institute for Research and Innovation in Bioengineering (i3B), Universitat Politècnica de València, 46022 Valencia, Spain
- Eduardo Macagno
- Division of Biological Sciences, University of California San Diego, La Jolla, CA 92093-0116, USA
|