1. Yamasaki D, Nagai M. Emotion-gaze interaction affects time-to-collision estimates, but not preferred interpersonal distance towards looming faces. Front Psychol 2024;15:1414702. PMID: 39323584; PMCID: PMC11423545; DOI: 10.3389/fpsyg.2024.1414702.
Abstract
Estimating the time until impending collision (time-to-collision, TTC) of approaching or looming individuals and maintaining a comfortable distance from others (interpersonal distance, IPD) are commonly required in daily life and contribute to survival and social goals. Despite accumulating evidence that facial expressions and gaze direction interactively influence face processing, it remains unclear how these facial features affect the spatiotemporal processing of looming faces. We examined whether facial expressions (fearful vs. neutral) and gaze direction (direct vs. averted) interact on the judgments of TTC and IPD for looming faces, based on the shared signal hypothesis that fear signals the existence of threats in the environment when coupled with averted gaze. Experiment 1 demonstrated that TTC estimates were reduced for fearful faces compared to neutral ones only when the concomitant gaze was averted. In Experiment 2, the emotion-gaze interaction was not observed in the IPD regulation, which is arguably sensitive to affective responses to faces. The results suggest that fearful-averted faces modulate the cognitive extrapolation process of looming motion by communicating environmental threats rather than by altering subjective fear or perceived emotional intensity of faces. The TTC-specific effect may reflect an enhanced defensive response to unseen threats implied by looming fearful-averted faces. Our findings provide insight into how the visual system processes facial features to ensure bodily safety and comfortable interpersonal communication in dynamic environments.
Affiliation(s)
- Daiki Yamasaki: Research Organization of Open Innovation and Collaboration, Ritsumeikan University, Osaka, Japan; Japan Society for the Promotion of Science, Tokyo, Japan
- Masayoshi Nagai: College of Comprehensive Psychology, Ritsumeikan University, Osaka, Japan
2. Denzer S, Diezig S, Achermann P, Mast FW, Koenig T. Electrophysiological (EEG) microstates during dream-like bizarre experiences in a naturalistic scenario using immersive virtual reality. Eur J Neurosci 2024. PMID: 39258353; DOI: 10.1111/ejn.16530.
Abstract
Monitoring the reality status of conscious experience is essential for a human being to interact successfully with the external world. Despite its importance for everyday functioning, reality monitoring can systematically become erroneous, for example, while dreaming or during hallucinatory experiences. To investigate brain processes associated with reality monitoring occurring online during an experience, i.e., perceptual reality monitoring, we assessed EEG microstates in healthy, young participants. In a within-subjects design, we compared the experience of reality when being confronted with dream-like bizarre elements versus realistic elements in an otherwise highly naturalistic real-world scenario in immersive virtual reality. Dream-like bizarreness induced changes in the subjective experience of reality and bizarreness, and led to an increase in the contribution of a specific microstate labelled C'. Microstate C' was related to the suspension of disbelief, i.e. the suppression of bizarre mismatches. Together with the functional interpretation of microstate C' as reported by previous studies, the findings of this study point to the importance of prefrontal meta-conscious control processes in perceptual reality monitoring.
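The microstate "contribution" measure behind this finding can be illustrated with a small sketch: each EEG sample is labelled by the template map with the highest absolute spatial correlation (microstate analysis is polarity-invariant), and a microstate's contribution is its share of labelled time. The templates, data, and labels below are made-up toy values, not the study's actual maps.

```python
# Toy illustration of microstate "contribution": label each sample by the
# best-fitting template map (ignoring polarity) and report time coverage.
# Templates/data/labels are hypothetical, not the study's maps.
import numpy as np

def microstate_coverage(data, templates, labels):
    """data: (n_samples, n_channels); templates: (n_maps, n_channels)."""
    # z-score maps across channels so correlation reduces to a dot product
    z = lambda a: (a - a.mean(axis=1, keepdims=True)) / a.std(axis=1, keepdims=True)
    corr = z(data) @ z(templates).T / data.shape[1]   # (n_samples, n_maps)
    winners = np.abs(corr).argmax(axis=1)             # polarity-invariant fit
    return {lab: np.mean(winners == i) for i, lab in enumerate(labels)}

templates = np.array([[1.0, -1.0, 0.5, -0.5],   # map "A" (hypothetical)
                      [0.5, 0.5, -1.0, -1.0]])  # map "C'" (hypothetical)
data = np.array([[2.0, -2.0, 1.0, -1.0],    # fits A
                 [-1.0, 1.0, -0.5, 0.5],    # inverted A (still labelled A)
                 [1.0, 1.0, -2.0, -2.0],    # fits C'
                 [0.9, 1.1, -2.0, -2.1]])   # fits C'
coverage = microstate_coverage(data, templates, ["A", "C'"])
print(coverage)
```

In practice the templates come from clustering the participant's own topographies; this sketch only shows the labelling and coverage step.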
Affiliation(s)
- Simone Denzer: Institute of Psychology, University of Bern, Bern, Switzerland; Graduate School for Health Sciences, University of Bern, Bern, Switzerland
- Sarah Diezig: Graduate School for Health Sciences, University of Bern, Bern, Switzerland; Translational Research Center, University Hospital of Psychiatry, Bern, Switzerland
- Peter Achermann: Institute of Pharmacology and Toxicology, University of Zurich, Zurich, Switzerland
- Fred W Mast: Institute of Psychology, University of Bern, Bern, Switzerland
- Thomas Koenig: Translational Research Center, University Hospital of Psychiatry, Bern, Switzerland
3. Sylvester S, Sagehorn M, Gruber T, Atzmueller M, Schöne B. SHAP value-based ERP analysis (SHERPA): Increasing the sensitivity of EEG signals with explainable AI methods. Behav Res Methods 2024;56:6067-6081. PMID: 38453828; PMCID: PMC11335964; DOI: 10.3758/s13428-023-02335-7.
Abstract
Conventionally, event-related potential (ERP) analysis relies on the researcher to identify the sensors and time points where an effect is expected. However, this approach is prone to bias and may limit the ability to detect unexpected effects or to investigate the full range of the electroencephalography (EEG) signal. Data-driven approaches circumvent this limitation; however, the multiple comparison problem and the statistical correction thereof affect both the sensitivity and specificity of the analysis. In this study, we present SHERPA, a novel approach based on explainable artificial intelligence (XAI) designed to provide the researcher with a straightforward and objective method to find relevant latency ranges and electrodes. SHERPA comprises a convolutional neural network (CNN) for classifying the conditions of the experiment and SHapley Additive exPlanations (SHAP) as a post hoc explainer to identify the important temporal and spatial features. A classical EEG face perception experiment was employed to validate the approach by comparing it to the established researcher- and data-driven approaches. SHERPA identified an occipital cluster close to the temporal coordinates expected for the N170 effect. Most importantly, SHERPA allows quantifying the relevance of an ERP for a psychological mechanism by calculating an "importance score". On this basis, SHERPA suggests the presence of a negative selection process at the early and later stages of processing. In conclusion, our new method offers not only an analysis approach suitable in situations with limited prior knowledge of the effect in question but also increased sensitivity capable of distinguishing neural processes with high precision.
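The attribution principle behind SHAP can be made concrete with a toy example. This is not the authors' SHERPA implementation (which pairs a CNN with the shap library on real EEG features); it computes exact Shapley values for a hand-made scoring function over three hypothetical electrode/latency "features", with absent features masked to a baseline of zero.

```python
# Exact Shapley-value attribution for a toy classifier: the weighted average,
# over all coalitions, of each feature's marginal contribution. Feature names
# and the scoring function are illustrative, not from the study.
from itertools import combinations
from math import factorial

FEATURES = ["P8_150-200ms", "Pz_150-200ms", "Fz_300-400ms"]  # hypothetical windows

def model_score(present, x):
    """Score using only features in `present`; absent ones fall back to 0
    (a stand-in for SHAP's background/baseline value)."""
    masked = [xi if name in present else 0.0 for name, xi in zip(FEATURES, x)]
    # toy "classifier": a fixed linear readout with one interaction term
    return 2.0 * masked[0] - 1.0 * masked[1] + 0.5 * masked[0] * masked[2]

def shapley_values(x):
    n = len(FEATURES)
    phi = {}
    for f in FEATURES:
        others = [g for g in FEATURES if g != f]
        total = 0.0
        for k in range(n):                       # coalition sizes 0..n-1
            for coal in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                gain = model_score(set(coal) | {f}, x) - model_score(set(coal), x)
                total += weight * gain
        phi[f] = total
    return phi

phi = shapley_values([1.0, 1.0, 1.0])
# By the efficiency property, the values sum to f(all) - f(none) = 1.5
print(phi)
```

The exact computation is exponential in the number of features; SHAP's practical explainers approximate it, which is what makes the approach feasible for CNN inputs.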
Affiliation(s)
- Sophia Sylvester: Institute of Computer Science, Osnabrück University, Osnabrück, Germany; Department of Mental Health, Norwegian University of Science and Technology, Trondheim, Norway
- Merle Sagehorn: Institute of Psychology, Osnabrück University, Osnabrück, Germany
- Thomas Gruber: Institute of Psychology, Osnabrück University, Osnabrück, Germany
- Martin Atzmueller: Institute of Computer Science, Osnabrück University, Osnabrück, Germany; German Research Center for Artificial Intelligence (DFKI), Osnabrück, Germany
- Benjamin Schöne: Institute of Psychology, Osnabrück University, Osnabrück, Germany; Department of Psychology, Norwegian University of Science and Technology, Trondheim, Norway
4. Karpov G, Lin MH, Headley DB, Baker TE. Oscillatory correlates of threat imminence during virtual navigation. Psychophysiology 2024;61:e14551. PMID: 38516942; DOI: 10.1111/psyp.14551.
Abstract
The Predatory Imminence Continuum Theory proposes that defensive behaviors depend on the proximity of a threat. While the neural mechanisms underlying this proposal are well studied in animal models, they remain poorly understood in humans. To address this issue, we recorded EEG from 24 (15 female) young adults engaged in a first-person virtual reality Risk-Reward interaction task. On each trial, participants were placed in a virtual room and presented with either a threat or reward conditioned stimulus (CS) in the same room location (proximal) or a different room location (distal). Behaviorally, all participants learned to avoid the threat-CS, with most using the optimal behavior to actively avoid the proximal threat-CS (88% accuracy) and passively avoid the distal threat-CS (69% accuracy). Similarly, participants learned to actively approach the distal reward-CS (82% accuracy) and to remain passive to the proximal reward-CS (72% accuracy). At an electrophysiological level, we observed a general increase in theta power (4-8 Hz) over the right posterior channel P8 across all conditions, with the proximal threat-CS evoking the largest theta response. By contrast, distal cues induced two bursts of gamma (30-60 Hz) power over midline-parietal channel Pz (200 ms post-cue) and right frontal channel Fp2 (300 ms post-cue). Interestingly, the first burst of gamma power was sensitive to the distal threat-CS and the second burst at channel Fp2 was sensitive to the distal reward-CS. Together, these findings demonstrate that oscillatory processes differentiate between spatial proximity information during threat and reward encoding, likely optimizing the selection of the appropriate behavioral response.
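The band-power quantities reported above (theta 4-8 Hz, gamma 30-60 Hz) can be sketched with a plain FFT on a synthetic channel. This illustrates the general technique only; the authors' analysis was time-resolved and channel-specific, and the sampling rate and signal here are made up.

```python
# Minimal band-power extraction: mean squared spectral amplitude in a band,
# demonstrated on a synthetic theta-dominated "channel". Illustrative only.
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean squared spectral amplitude within [lo, hi] Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

fs = 250                         # sampling rate (Hz), typical for EEG
t = np.arange(0, 2, 1 / fs)      # one 2 s epoch
# synthetic channel: strong 6 Hz theta component plus weak 40 Hz gamma
x = 2.0 * np.sin(2 * np.pi * 6 * t) + 0.3 * np.sin(2 * np.pi * 40 * t)

theta = band_power(x, fs, 4, 8)
gamma = band_power(x, fs, 30, 60)
print(theta > gamma)  # the theta-dominated epoch yields larger theta power
```

Time-resolved variants (as in the study's post-cue gamma bursts) apply the same idea within short sliding windows or via wavelet convolution.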
Affiliation(s)
- Galit Karpov: Center for Molecular and Behavioral Neuroscience, Rutgers State University, Newark, New Jersey, USA
- Mei-Heng Lin: Center for Molecular and Behavioral Neuroscience, Rutgers State University, Newark, New Jersey, USA
- Drew B Headley: Center for Molecular and Behavioral Neuroscience, Rutgers State University, Newark, New Jersey, USA
- Travis E Baker: Center for Molecular and Behavioral Neuroscience, Rutgers State University, Newark, New Jersey, USA
5. Sagehorn M, Johnsdorf M, Kisker J, Gruber T, Schöne B. Electrophysiological correlates of face and object perception: A comparative analysis of 2D laboratory and virtual reality conditions. Psychophysiology 2024;61:e14519. PMID: 38219244; DOI: 10.1111/psyp.14519.
Abstract
Human face perception is a specialized visual process with inherent social significance. The neural mechanisms reflecting this intricate cognitive process have evolved in spatially complex and emotionally rich environments. Previous research using VR to transfer an established face perception paradigm to realistic conditions has shown that the functional properties of face-sensitive neural correlates typically observed in the laboratory are attenuated outside the original modality. The present study builds on these results by comparing the perception of persons and objects under conventional laboratory (PC) and realistic conditions in VR. Adhering to established paradigms, the PC and VR modalities both featured images of persons and cars alongside standard control images. To investigate the individual stages of realistic face processing, response times, the typical face-sensitive N170 component, and relevant subsequent components (L1, L2; pre-, post-response) were analyzed within and between modalities. The between-modality comparison of response times and component latencies revealed generally faster processing under realistic conditions. However, the obtained N170 latency and amplitude differences showed reduced discriminative capacity under realistic conditions during this early stage. These findings suggest that the effects commonly observed in the lab are specific to monitor-based presentations. Analyses of later and response-locked components showed that specific neural mechanisms for identification and evaluation are employed when perceiving the stimuli under realistic conditions, reflected in discernible amplitude differences in response to faces and objects beyond the basic perceptual features. Conversely, the results do not provide evidence for comparable stimulus-specific perceptual processing pathways when viewing pictures of the stimuli under conventional laboratory conditions.
Affiliation(s)
- Merle Sagehorn: Experimental Psychology I, Institute of Psychology, Osnabrück University, Osnabrück, Germany
- Marike Johnsdorf: Experimental Psychology I, Institute of Psychology, Osnabrück University, Osnabrück, Germany
- Joanna Kisker: Experimental Psychology I, Institute of Psychology, Osnabrück University, Osnabrück, Germany
- Thomas Gruber: Experimental Psychology I, Institute of Psychology, Osnabrück University, Osnabrück, Germany
- Benjamin Schöne: Experimental Psychology I, Institute of Psychology, Osnabrück University, Osnabrück, Germany; Department of Psychology, Norwegian University of Science and Technology, Trondheim, Norway
6. Wu YT, Baillet S, Lamontagne A. Brain mechanisms involved in the perception of emotional gait: A combined magnetoencephalography and virtual reality study. PLoS One 2024;19:e0299103. PMID: 38551903; PMCID: PMC10980214; DOI: 10.1371/journal.pone.0299103.
Abstract
Brain processes associated with emotion perception from biological motion have been largely investigated using point-light displays that are devoid of pictorial information and not representative of everyday life. In this study, we investigated the brain signals evoked when perceiving emotions arising from body movements of virtual pedestrians walking in a community environment. Magnetoencephalography was used to record brain activation in 21 healthy young adults discriminating the emotional gaits (neutral, angry, happy) of virtual male/female pedestrians. Event-related responses in the posterior superior temporal sulcus (pSTS), fusiform body area (FBA), extrastriate body area (EBA), amygdala (AMG), and lateral occipital cortex (Occ) were examined. Brain signals were characterized by an early positive peak (P1; ~200 ms) and a late positive potential component (LPP) comprising an early (400-600 ms), middle (600-1000 ms), and late phase (1000-1500 ms). Generalized estimating equations revealed that P1 amplitude was unaffected by emotion and gender of pedestrians. LPP amplitude showed a significant emotion x phase interaction in all regions of interest, revealing i) an emotion-dependent modulation starting in pSTS and Occ, followed by AMG, FBA, and EBA, and ii) generally enhanced responses for angry vs. other gait stimuli in the middle LPP phase. LPP also showed a gender x phase interaction in pSTS and Occ, as gender affected the time course of the response to emotional gait. Present findings show that brain activation within areas associated with biological motion, form, and emotion processing is modulated by emotional gait stimuli rendered by virtual simulations representative of everyday life.
Affiliation(s)
- Yu-Tzu Wu: School of Physical and Occupational Therapy, McGill University, Montreal, Quebec, Canada; Feil and Oberfeld Research Centre, Jewish Rehabilitation Hospital-Centre Intégré de Santé et de Services Sociaux de Laval, Centre for Interdisciplinary Research in Rehabilitation of Greater Montreal, Montreal, Quebec, Canada
- Sylvain Baillet: McConnell Brain Imaging Centre, Montreal Neurological Institute-Hospital, Montreal, Quebec, Canada
- Anouk Lamontagne: School of Physical and Occupational Therapy, McGill University, Montreal, Quebec, Canada; Feil and Oberfeld Research Centre, Jewish Rehabilitation Hospital-Centre Intégré de Santé et de Services Sociaux de Laval, Centre for Interdisciplinary Research in Rehabilitation of Greater Montreal, Montreal, Quebec, Canada
7. Sarzedas J, Lima CF, Roberto MS, Scott SK, Pinheiro AP, Conde T. Blindness influences emotional authenticity perception in voices: Behavioral and ERP evidence. Cortex 2024;172:254-270. PMID: 38123404; DOI: 10.1016/j.cortex.2023.11.005.
Abstract
The ability to distinguish spontaneous from volitional emotional expressions is an important social skill. How do blind individuals perceive emotional authenticity? Unlike sighted individuals, they cannot rely on facial and body language cues, relying instead on vocal cues alone. Here, we combined behavioral and ERP measures to investigate authenticity perception in laughter and crying in individuals with early- or late-blindness onset. Early-blind, late-blind, and sighted control participants (n = 17 per group, N = 51) completed authenticity and emotion discrimination tasks while EEG data were recorded. The stimuli consisted of laughs and cries that were either spontaneous or volitional. The ERP analysis focused on the N1, P2, and late positive potential (LPP). Behaviorally, early-blind participants showed intact authenticity perception, but late-blind participants performed worse than controls. There were no group differences in the emotion discrimination task. In brain responses, all groups were sensitive to laughter authenticity at the P2 stage, and to crying authenticity at the early LPP stage. Nevertheless, only early-blind participants were sensitive to crying authenticity at the N1 and middle LPP stages, and to laughter authenticity at the early LPP stage. Furthermore, early-blind and sighted participants were more sensitive than late-blind ones to crying authenticity at the P2 and late LPP stages. Altogether, these findings suggest that early blindness relates to facilitated brain processing of authenticity in voices, both at early sensory and late cognitive-evaluative stages. Late-onset blindness, in contrast, relates to decreased sensitivity to authenticity at behavioral and brain levels.
Affiliation(s)
- João Sarzedas: CICPSI, Faculdade de Psicologia, Universidade de Lisboa, Lisboa, Portugal
- César F Lima: Centro de Investigação e Intervenção Social (CIS-IUL), Instituto Universitário de Lisboa (ISCTE-IUL), Lisboa, Portugal; Institute of Cognitive Neuroscience, University College London, London, UK
- Magda S Roberto: CICPSI, Faculdade de Psicologia, Universidade de Lisboa, Lisboa, Portugal
- Sophie K Scott: Institute of Cognitive Neuroscience, University College London, London, UK
- Ana P Pinheiro: CICPSI, Faculdade de Psicologia, Universidade de Lisboa, Lisboa, Portugal
- Tatiana Conde: CICPSI, Faculdade de Psicologia, Universidade de Lisboa, Lisboa, Portugal
8. Martarelli CS, Chiquet S, Ertl M. Keeping track of reality: embedding visual memory in natural behaviour. Memory 2023;31:1295-1305. PMID: 37727126; DOI: 10.1080/09658211.2023.2260148.
Abstract
Since immersive virtual reality (IVR) emerged as a research method in the 1980s, the focus has been on the similarities between IVR and actual reality. In this vein, it has been suggested that IVR methodology might fill the gap between laboratory studies and real life. IVR allows for high internal validity (i.e., a high degree of experimental control and experimental replicability), as well as high external validity by letting participants engage with the environment in an almost natural manner. Despite internal validity being crucial to experimental designs, external validity also matters in terms of the generalizability of results. In this paper, we first highlight and summarise the similarities and differences between IVR, desktop situations (both non-immersive VR and computer experiments), and reality. In the second step, we propose that IVR is a promising tool for visual memory research in terms of investigating the representation of visual information embedded in natural behaviour. We encourage researchers to carry out experiments on both two-dimensional computer screens and in immersive virtual environments to investigate visual memory and validate and replicate the findings. IVR is valuable because of its potential to improve theoretical understanding and increase the psychological relevance of the findings.
Affiliation(s)
- Sandra Chiquet: Faculty of Psychology, UniDistance Suisse, Brig, Switzerland
- Matthias Ertl: Department of Psychology, University of Bern, Bern, Switzerland
9. Monachesi B, Deruti A, Grecucci A, Vaes J. Electrophysiological, emotional and behavioural responses of female targets of sexual objectification. Sci Rep 2023;13:5777. PMID: 37031255; PMCID: PMC10082788; DOI: 10.1038/s41598-023-32379-w.
Abstract
Sexual objectification and the interiorized objectifying gaze (self-objectification) are dangerous phenomena for women's psychological wellness. However, their specific effects on women's socio-affective reactions are still poorly understood, and their neural activity has never been explored before. In the present study, we investigated women's emotional and electrophysiological responses during simulated computer-based objectifying social interactions, and we examined consequent punishing behaviours towards the perpetrator using the ultimatum game. Behavioural results (N = 36) showed that during objectifying encounters women generally felt angrier/disgusted and tended to punish the perpetrator in later interactions. However, the more the women self-objectified, the more they felt ashamed (p = 0.011) and tended to punish the perpetrators less (p = 0.008). At a neural level (N = 32), objectifying interactions modulated female participants' neural signal elicited during the processing of the perpetrator, increasing early (N170) and later (EPN, LPP) ERP components. In addition, only the amplitude of the LPP positively correlated with shame (p = 0.006) and the level of self-objectification (p = 0.018). This finding provides first evidence for the specific time-course of sexual objectification, self-objectification and its associated shame response, and proves that emotional and social consequences of sexual objectification in women may depend on their tendency to self-objectify.
Affiliation(s)
- Bianca Monachesi: Department of Psychology and Cognitive Science, University of Trento, Trento, Italy
- Alice Deruti: Department of Psychology and Cognitive Science, University of Trento, Trento, Italy
- Alessandro Grecucci: Department of Psychology and Cognitive Science, University of Trento, Trento, Italy; Centre for Medical Sciences, CISMed, University of Trento, Trento, Italy
- Jeroen Vaes: Department of Psychology and Cognitive Science, University of Trento, Trento, Italy
10. Sagehorn M, Johnsdorf M, Kisker J, Sylvester S, Gruber T, Schöne B. Real-life relevant face perception is not captured by the N170 but reflected in later potentials: A comparison of 2D and virtual reality stimuli. Front Psychol 2023;14:1050892. PMID: 37057177; PMCID: PMC10086431; DOI: 10.3389/fpsyg.2023.1050892.
Abstract
The perception of faces is one of the most specialized visual processes in the human brain and has been investigated by means of the early event-related potential component N170. However, face perception has mostly been studied in conventional laboratory, i.e., monitor setups, offering rather distal presentation of faces as planar 2D images. Increasing spatial proximity through Virtual Reality (VR) makes it possible to present 3D, real-life-sized persons at personal distance to participants, thus creating a feeling of social involvement and adding a self-relevant value to the presented faces. The present study compared the perception of persons under conventional laboratory conditions (PC) with realistic conditions in VR. Paralleling standard designs, pictures of unknown persons and standard control images were presented in a PC and a VR modality. To investigate how the mechanisms of face perception differ under realistic conditions from those under conventional laboratory conditions, the typical face-specific N170 and subsequent components were analyzed in both modalities. Consistent with previous laboratory research, the N170 lost discriminatory power when translated to realistic conditions, as it only discriminated faces and controls under laboratory conditions. Most interestingly, analysis of the later component [230-420 ms] revealed more differentiated face-specific processing in VR, as indicated by distinctive, stimulus-specific topographies. Complemented by source analysis, the results on later latencies show that face-specific neural mechanisms are applied only under realistic conditions. (A video abstract is available in the Supplementary material and via YouTube: https://youtu.be/TF8wiPUrpSY.)
Affiliation(s)
- Merle Sagehorn: Experimental Psychology I, Institute of Psychology, Osnabrück University, Osnabrück, Germany
- Marike Johnsdorf: Experimental Psychology I, Institute of Psychology, Osnabrück University, Osnabrück, Germany
- Joanna Kisker: Experimental Psychology I, Institute of Psychology, Osnabrück University, Osnabrück, Germany
- Sophia Sylvester: Semantic Information Systems Research Group, Institute of Computer Science, Osnabrück University, Osnabrück, Germany
- Thomas Gruber: Experimental Psychology I, Institute of Psychology, Osnabrück University, Osnabrück, Germany
- Benjamin Schöne: Experimental Psychology I, Institute of Psychology, Osnabrück University, Osnabrück, Germany
11. Andreatta M, Winkler MH, Collins P, Gromer D, Gall D, Pauli P, Gamer M. VR for Studying the Neuroscience of Emotional Responses. Curr Top Behav Neurosci 2023;65:161-187. PMID: 36592276; DOI: 10.1007/7854_2022_405.
Abstract
Emotions are frequently considered as the driving force of behavior, and psychopathology is often characterized by aberrant emotional responding. Emotional states are reflected on a cognitive-verbal, physiological-humoral, and motor-behavioral level but to date, human research lacks an experimental protocol for a comprehensive and ecologically valid characterization of such emotional states. Virtual reality (VR) might help to overcome this situation by allowing researchers to study mental processes and behavior in highly controlled but reality-like laboratory settings. In this chapter, we first elucidate the role of presence and immersion as requirements for eliciting emotional states in a virtual environment and discuss different VR methods for emotion induction. We then consider the organization of emotional states on a valence continuum (i.e., from negative to positive) and on this basis discuss the use of VR to study threat processing and avoidance as well as reward processing and approach behavior. Although the potential of VR has not been fully realized in laboratory and clinical settings yet, this technological tool can open up new avenues to better understand the neurobiological mechanisms of emotional responding in healthy and pathological conditions.
Affiliation(s)
- Marta Andreatta: Department of Psychology, Educational Sciences, and Child Studies, Erasmus University Rotterdam, Rotterdam, The Netherlands
- Markus H Winkler: Department of Psychology, University of Wuerzburg, Wuerzburg, Germany
- Peter Collins: Department of Psychology, University of Wuerzburg, Wuerzburg, Germany
- Daniel Gromer: Department of Psychology, University of Wuerzburg, Wuerzburg, Germany
- Dominik Gall: Department of Psychology, University of Wuerzburg, Wuerzburg, Germany
- Paul Pauli: Department of Psychology, University of Wuerzburg, Wuerzburg, Germany
- Matthias Gamer: Department of Psychology, University of Wuerzburg, Wuerzburg, Germany
12. Fear memory in humans is consolidated over time independently of sleep. Cogn Affect Behav Neurosci 2023;23:100-113. PMID: 36241964; PMCID: PMC9925495; DOI: 10.3758/s13415-022-01037-5.
Abstract
Fear memories can be altered after acquisition by processes such as fear memory consolidation or fear extinction, even without further exposure to the fear-eliciting stimuli, but the factors contributing to these processes are not well understood. Sleep is known to consolidate, strengthen, and change newly acquired declarative and procedural memories. However, evidence on the role of time and sleep in the consolidation of fear memories is inconclusive. We used highly sensitive electrophysiological measures to examine the development of fear-conditioned responses over time and sleep in humans. We assessed event-related brain potentials (ERPs) in 18 healthy, young individuals during fear conditioning before and after a 2-hour afternoon nap or a corresponding wake interval in a counterbalanced within-subject design. The procedure involved pairing a neutral tone (CS+) with a highly unpleasant sound. As a control, another neutral tone (CS-) was paired with a neutral sound. Fear responses were examined before the interval during a habituation phase and an acquisition phase, as well as after the interval during an extinction phase and a reacquisition phase. Differential fear conditioning during acquisition was evidenced by a more negative slow ERP component (stimulus-preceding negativity) developing before the unconditioned stimulus (loud noise). This differential fear response was even stronger after the interval during reacquisition compared with initial acquisition, but the effect was similarly pronounced after sleep and wakefulness. These findings suggest that fear memories are consolidated over time, with this effect being independent of intervening sleep.
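The ERP measure described above rests on the basic operation of averaging time-locked epochs per condition and comparing mean amplitude in an analysis window. A generic sketch on synthetic data; the sampling rate, trial counts, window, and effect size are illustrative, not the study's exact analysis.

```python
# Basic ERP workflow: average single-trial epochs per condition, then compare
# mean amplitude in a pre-US window. Synthetic data with a seeded RNG.
import numpy as np

rng = np.random.default_rng(0)
fs = 100                                  # sampling rate (Hz)
t = np.arange(-0.2, 1.0, 1 / fs)          # epoch: -200 ms to 1000 ms

def make_epochs(n_trials, slow_negativity):
    """Synthetic single trials: noise plus an optional negative pre-US drift."""
    noise = rng.normal(0, 5, size=(n_trials, len(t)))
    signal = -3.0 * np.clip(t, 0, None) if slow_negativity else 0.0
    return noise + signal

cs_plus = make_epochs(40, slow_negativity=True)    # tone paired with loud noise
cs_minus = make_epochs(40, slow_negativity=False)  # control tone

window = (t >= 0.5) & (t < 1.0)                    # late pre-US analysis window
erp_diff = cs_plus.mean(axis=0)[window].mean() - cs_minus.mean(axis=0)[window].mean()
print(erp_diff < 0)   # CS+ average is more negative in the window
```

Averaging across trials suppresses the zero-mean noise, which is what lets a slow, small-amplitude component like the stimulus-preceding negativity emerge from single-trial EEG.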
13
Ocklenburg S, Peterburs J. Monitoring Brain Activity in VR: EEG and Neuroimaging. Curr Top Behav Neurosci 2023; 65:47-71. [PMID: 37306852 DOI: 10.1007/7854_2023_423] [Citation(s) in RCA: 0]
Abstract
Virtual reality (VR) is increasingly used in neuroscientific research to increase ecological validity without sacrificing experimental control, to provide a richer visual and multisensory experience, and to foster immersion and presence in study participants, which leads to increased motivation and affective experience. But the use of VR, particularly when coupled with neuroimaging or neurostimulation techniques such as electroencephalography (EEG), functional magnetic resonance imaging (fMRI), or transcranial magnetic stimulation (TMS), also yields some challenges. These include intricacies of the technical setup, increased noise in the data due to movement, and a lack of standard protocols for data collection and analysis. This chapter examines current approaches to recording, pre-processing, and analyzing electrophysiological (stationary and mobile EEG), as well as neuroimaging data recorded during VR engagement. It also discusses approaches to synchronizing these data with other data streams. In general, previous research has used a range of different approaches to technical setup and data processing, and detailed reporting of procedures is urgently needed in future studies to ensure comparability and replicability. More support for open-source VR software as well as the development of consensus and best practice papers on issues such as the handling of movement artifacts in mobile EEG-VR will be essential steps in ensuring the continued success of this exciting and powerful technique in neuroscientific research.
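The data-stream synchronization the chapter discusses can be sketched in a few lines (a generic illustration, not a protocol from the chapter; the timestamps and drift values below are invented): if the VR engine and the EEG recorder both log a set of shared trigger events, a least-squares linear fit recovers the offset and drift between their clocks, after which any VR event can be placed on the EEG timeline.

```python
import numpy as np

# Hypothetical timestamps (in seconds) of the same five triggers as logged
# by the VR engine and by the EEG amplifier; the EEG clock is assumed to
# run with a constant offset plus a slight linear drift.
vr_triggers = np.array([1.0, 11.0, 21.0, 31.0, 41.0])
eeg_triggers = vr_triggers * 1.0002 + 5.25

# Least-squares fit of the linear clock mapping vr_time -> eeg_time
slope, intercept = np.polyfit(vr_triggers, eeg_triggers, 1)

def vr_to_eeg(t_vr):
    """Map a VR event timestamp onto the EEG recording's clock."""
    return slope * t_vr + intercept

# A stimulus logged by the VR engine at 25 s can now be epoched in the EEG
event_eeg_time = vr_to_eeg(25.0)
```

With real hardware the shared triggers would come from a common TTL line or a streaming protocol; the linear fit then absorbs both the constant offset and the clock drift between devices.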
Affiliation(s)
- Sebastian Ocklenburg
- Department of Psychology, Faculty for Life Sciences, MSH Medical School Hamburg, Hamburg, Germany.
- ICAN Institute for Cognitive and Affective Neuroscience, MSH Medical School Hamburg, Hamburg, Germany.
- Faculty of Psychology, Institute of Cognitive Neuroscience, Biopsychology, Ruhr University Bochum, Bochum, Germany.
- Jutta Peterburs
- Institute of Systems Medicine & Department of Human Medicine, MSH Medical School Hamburg, Hamburg, Germany
14
Stolz C, Pickering AD, Mueller EM. Dissociable feedback valence effects on frontal midline theta during reward gain versus threat avoidance learning. Psychophysiology 2022; 60:e14235. [PMID: 36529988 DOI: 10.1111/psyp.14235] [Citation(s) in RCA: 0]
Abstract
While frontal midline theta (FMθ) has been associated with threat processing, with cognitive control in the context of anxiety, and with reinforcement learning, most reinforcement learning studies on FMθ have used reward-related rather than threat-related stimuli as reinforcers. Accordingly, the role of FMθ in threat-related reinforcement learning is largely unknown. Here, n = 23 human participants underwent one reward- and one punishment-based reversal learning task, which differed only in the kind of reinforcer that feedback was tied to (monetary gain vs. loud noise burst, respectively). In addition to single-trial EEG, we assessed single-trial feedback expectations based on both a reinforcement learning computational model and trial-by-trial subjective feedback expectation ratings. While participants' performance and feedback expectations were comparable between the reward and punishment tasks, FMθ was more reliably amplified to negative vs. positive feedback in the reward task than in the punishment task. Regressions with feedback valence, computationally derived expectations, and self-reported expectations as predictors and FMθ as criterion further revealed that trial-by-trial variations in FMθ specifically relate to reward-related feedback valence and not to threat-related feedback or to violated expectations/prediction errors. These findings suggest that FMθ as measured in reinforcement learning tasks may be less sensitive to the processing of events with direct relevance for fear and anxiety.
Affiliation(s)
- Christopher Stolz
- Department of Psychology, University of Marburg, Marburg, Germany
- Leibniz Institute for Neurobiology (LIN), Magdeburg, Germany
- Department of Psychology, Goldsmiths, University of London, London, UK
- Erik M. Mueller
- Department of Psychology, University of Marburg, Marburg, Germany
15
Tanzilli A, Trentini C, Grecucci A, Carone N, Ciacchella C, Lai C, Sabogal-Rueda MD, Lingiardi V. Therapist reactions to patient personality: A pilot study of clinicians’ emotional and neural responses using three clinical vignettes from In Treatment series. Front Hum Neurosci 2022; 16:1037486. [DOI: 10.3389/fnhum.2022.1037486] [Citation(s) in RCA: 0]
Abstract
Introduction: Therapists’ responses to patients play a crucial role in psychotherapy and are considered a key component of the patient–clinician relationship, which promotes successful treatment outcomes. To date, no empirical research has investigated therapist response patterns to patients with different personality disorders from a neuroscience perspective.
Methods: In the present study, psychodynamic therapists (N = 14) were asked to complete a battery of instruments (including the Therapist Response Questionnaire) after watching three videos showing clinical interactions between a therapist and three patients with narcissistic, histrionic/borderline, and depressive personality disorders, respectively. Subsequently, participants’ high-density electroencephalography (hdEEG) was recorded as they passively viewed pictures of the patients’ faces, which were selected from still images of the previously shown videos. Supervised machine learning (ML) was used to evaluate whether: (1) therapists’ responses predicted which patient they observed during the EEG task and whether specific clinician reactions were involved in distinguishing between patients with different personality disorders (using pairwise comparisons); and (2) therapists’ event-related potentials (ERPs) predicted which patient they observed during the laboratory experiment and whether distinct ERP components allowed this forecast.
Results: Therapists showed distinct patterns of criticized/devalued and sexualized reactions to visual depictions of patients with different personality disorders, at statistically systematic and clinically meaningful levels. Moreover, therapists’ late positive potentials (LPPs) in the hippocampus were able to determine which patient they observed during the EEG task, with high accuracy.
Discussion: These results, albeit preliminary, shed light on the role played by therapists’ memory processes in psychotherapy. Clinical and neuroscience implications of the empirical investigation of therapist responses are discussed.
16
Stegmann Y, Andreatta M, Pauli P, Keil A, Wieser MJ. Investigating sustained attention in contextual threat using steady-state VEPs evoked by flickering video stimuli. Psychophysiology 2022; 60:e14229. [PMID: 36416714 DOI: 10.1111/psyp.14229] [Citation(s) in RCA: 1]
Abstract
Anxiety is characterized by anxious anticipation and heightened vigilance to uncertain threat. However, if threat is not reliably indicated by a specific cue, the context in which threat was previously experienced becomes its best predictor, leading to anxiety. A suitable means to induce anxiety experimentally is context conditioning: In one context (CTX+), an unpredictable aversive stimulus (US) is repeatedly presented, in contrast to a second context (CTX-), in which no US is ever presented. In this EEG study, we investigated attentional mechanisms during acquisition and extinction learning in 38 participants, who underwent a context conditioning protocol. Flickering video stimuli (32 s clips depicting virtual offices representing CTX+/-) were used to evoke steady-state visual evoked potentials (ssVEPs) as an index of visuocortical engagement with the contexts. Analyses of the electrocortical responses suggest a successful induction of the ssVEP signal by video presentation in flicker mode. Furthermore, we found clear indices of context conditioning and extinction learning on a subjective level, while cortical processing of the CTX+ was unexpectedly reduced during video presentation. The differences between CTX+ and CTX- diminished during extinction learning. Together, these results indicate that the dynamic sensory input of the video presentation leads to disruptions in the ssVEP signal, which is greater for motivationally significant, threatening contexts.
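The frequency-tagging logic behind ssVEPs can be illustrated with a minimal sketch (synthetic data and an invented 15 Hz flicker rate, not the study's stimuli or pipeline): because the visuocortical response follows the flicker periodicity, its strength can be read out as the spectral amplitude at the tagging frequency.

```python
import numpy as np

def amplitude_at(signal, fs, freq):
    """Spectral amplitude at a given frequency via the discrete Fourier transform."""
    spectrum = np.abs(np.fft.rfft(signal)) / signal.size
    freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
    return spectrum[np.argmin(np.abs(freqs - freq))]

fs, tag = 500, 15.0                       # assumed sampling and flicker rates (Hz)
t = np.arange(0, 8, 1 / fs)               # one 8-s segment of a longer clip
rng = np.random.default_rng(1)
# Synthetic 'ssVEP': an oscillation at the tagging frequency plus noise
eeg = 0.5 * np.sin(2 * np.pi * tag * t) + 0.1 * rng.standard_normal(t.size)

amp_tag = amplitude_at(eeg, fs, tag)      # response at the flicker frequency
amp_off = amplitude_at(eeg, fs, tag + 3)  # control: a non-tagged frequency
```

In an actual analysis the tagged amplitude would be compared across conditions (e.g., CTX+ vs. CTX-) rather than against a neighboring frequency, and segment length is chosen so the tagging frequency falls exactly on an FFT bin.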
Affiliation(s)
- Yannik Stegmann
- Department of Psychology (Biological Psychology, Clinical Psychology, and Psychotherapy), University of Würzburg, Würzburg, Germany
- Marta Andreatta
- Department of Psychology (Biological Psychology, Clinical Psychology, and Psychotherapy), University of Würzburg, Würzburg, Germany
- Department of Psychology, Education, and Child Studies, Erasmus University Rotterdam, Rotterdam, Netherlands
- Paul Pauli
- Department of Psychology (Biological Psychology, Clinical Psychology, and Psychotherapy), University of Würzburg, Würzburg, Germany
- Center for Mental Health, Medical Faculty, University of Würzburg, Würzburg, Germany
- Andreas Keil
- Center for the Study of Emotion and Attention, University of Florida, Gainesville, Florida, USA
- Matthias J. Wieser
- Department of Psychology, Education, and Child Studies, Erasmus University Rotterdam, Rotterdam, Netherlands
17
No trait anxiety influences on early and late differential neuronal responses to aversively conditioned faces across three different tasks. Cogn Affect Behav Neurosci 2022; 22:1157-1171. [PMID: 35352267 PMCID: PMC9458573 DOI: 10.3758/s13415-022-00998-x] [Citation(s) in RCA: 0]
Abstract
The human brain's ability to quickly detect dangerous stimuli is crucial in selecting appropriate responses to possible threats. Trait anxiety has been suggested to moderate these processes at certain processing stages. To dissociate such different information-processing stages, research using classical conditioning has begun to examine event-related potentials (ERPs) in response to fear-conditioned (CS+) faces. However, the impact of trait anxiety on ERPs to fear-conditioned faces depending on specific task conditions is unknown. In this preregistered study, we measured ERPs to faces paired with aversive loud screams (CS+) or neutral sounds (CS-) in a large sample (N = 80) under three different task conditions. Participants had to discriminate face-irrelevant perceptual information, the gender of the faces, or the CS category. Results showed larger amplitudes in response to aversively conditioned faces for all examined ERPs, whereas interactions with the attended feature occurred for the P1 and the early posterior negativity (EPN). For the P1, larger CS+ effects were observed during the perceptual distraction task, while the EPN was increased for CS+ faces when deciding about the CS association. Remarkably, we found no significant correlations between ERPs and trait anxiety. Thus, fear conditioning potentiates all ERP amplitudes, with some processing stages further modulated by the task. However, the finding that these ERP differences were not affected by individual differences in trait anxiety does not support theoretical accounts assuming increased threat processing or reduced threat discrimination depending on trait anxiety.
18
Aspiotis V, Miltiadous A, Kalafatakis K, Tzimourta KD, Giannakeas N, Tsipouras MG, Peschos D, Glavas E, Tzallas AT. Assessing Electroencephalography as a Stress Indicator: A VR High-Altitude Scenario Monitored through EEG and ECG. Sensors (Basel) 2022; 22:5792. [PMID: 35957348 PMCID: PMC9371026 DOI: 10.3390/s22155792] [Citation(s) in RCA: 4]
Abstract
Over the last decade, virtual reality (VR) has become an increasingly accessible commodity. Head-mounted display (HMD) immersive technologies allow researchers to simulate experimental scenarios that would be unfeasible or risky in real life. An example is extreme-heights exposure simulations, which can be utilized in research on stress system mobilization. Until recently, electroencephalography (EEG)-related research was focused on mental stress prompted by social or mathematical challenges, with only a few studies employing HMD VR techniques to induce stress. In this study, we combine a state-of-the-art EEG wearable device and an electrocardiography (ECG) sensor with a VR headset to provoke stress in a high-altitude scenario while monitoring EEG and ECG biomarkers in real time. A robust signal-cleaning pipeline is implemented to preprocess the EEG data, which are infiltrated by movement-related noise. Statistical and correlation analyses are employed to explore the relationship between these biomarkers and stress. The participant pool is divided into two groups based on their heart rate increase, and statistically significant EEG biomarker differences emerged between them. Finally, occipital-region band-power changes and occipital asymmetry alterations were found to be associated with height-related stress, and brain activation in the beta and gamma bands correlated with the results of the self-reported Perceived Stress Scale questionnaire.
Affiliation(s)
- Vasileios Aspiotis
- Human Computer Interaction Laboratory (HCILab), Department of Informatics and Telecommunications, University of Ioannina, Kostakioi, 47100 Arta, Greece
- Faculty of Medicine, University of Ioannina, 45110 Ioannina, Greece
- Andreas Miltiadous
- Human Computer Interaction Laboratory (HCILab), Department of Informatics and Telecommunications, University of Ioannina, Kostakioi, 47100 Arta, Greece
- Konstantinos Kalafatakis
- Human Computer Interaction Laboratory (HCILab), Department of Informatics and Telecommunications, University of Ioannina, Kostakioi, 47100 Arta, Greece
- Institute of Health Science Education, Barts and the London School of Medicine & Dentistry, Queen Mary University of London (Malta Campus), VCT 2520 Victoria, Malta
- Katerina D. Tzimourta
- Human Computer Interaction Laboratory (HCILab), Department of Informatics and Telecommunications, University of Ioannina, Kostakioi, 47100 Arta, Greece
- Department of Electrical and Computer Engineering, Faculty of Engineering, University of Western Macedonia, 50100 Kozani, Greece
- Nikolaos Giannakeas
- Human Computer Interaction Laboratory (HCILab), Department of Informatics and Telecommunications, University of Ioannina, Kostakioi, 47100 Arta, Greece
- Markos G. Tsipouras
- Department of Electrical and Computer Engineering, Faculty of Engineering, University of Western Macedonia, 50100 Kozani, Greece
- Dimitrios Peschos
- Faculty of Medicine, University of Ioannina, 45110 Ioannina, Greece
- Euripidis Glavas
- Human Computer Interaction Laboratory (HCILab), Department of Informatics and Telecommunications, University of Ioannina, Kostakioi, 47100 Arta, Greece
- Alexandros T. Tzallas
- Human Computer Interaction Laboratory (HCILab), Department of Informatics and Telecommunications, University of Ioannina, Kostakioi, 47100 Arta, Greece
19
Aksoy M, Ufodiama CE, Bateson AD, Martin S, Asghar AUR. A comparative experimental study of visual brain event-related potentials to a working memory task: virtual reality head-mounted display versus a desktop computer screen. Exp Brain Res 2021; 239:3007-3022. [PMID: 34347129 PMCID: PMC8536609 DOI: 10.1007/s00221-021-06158-w] [Citation(s) in RCA: 5]
Abstract
Virtual reality head-mounted display (VR HMD) systems are increasingly utilised in combination with electroencephalography (EEG) in the experimental study of cognitive tasks. The aim of our investigation was to determine the similarities/differences between VR HMD and a desktop computer screen (CS) in response to an n-back working memory task by comparing visual electrophysiological event-related potential (ERP) waveforms (N1/P1/P3 components). The same protocol was undertaken for VR HMD and CS with participants wearing the same EEG headcap. ERP waveforms obtained in the VR HMD environment followed a similar time course to those acquired with the CS. The P3 mean and peak amplitudes obtained in VR HMD were not significantly different from those obtained with the CS. In contrast, the N1 component showed significantly higher mean and peak amplitudes at the frontal electrodes in the VR HMD environment compared to the CS. Significantly higher P1 mean and peak amplitudes were found at the occipital region compared to the temporal region for VR HMD. Our results show that successful acquisition of ERP components to a working memory task is achievable by combining VR HMD with EEG. In addition, the higher-amplitude N1/P1 components seen in VR HMD indicate the potential utility of this VR modality in the investigation of early ERPs. In conclusion, the combination of VR HMD with EEG/ERP would be a useful approach to advance the study of cognitive function in experimental brain research.
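The ERP measures compared in this study rest on time-locked averaging, which can be sketched generically (the component shape, amplitudes, trial count, and latency window below are idealized assumptions, not values from the paper): averaging many stimulus-locked epochs cancels background EEG while the evoked component survives.

```python
import numpy as np

fs, n_trials = 250, 60                       # assumed sampling rate (Hz) and trial count
t = np.arange(-0.1, 0.6, 1 / fs)             # epoch window around stimulus onset (s)
rng = np.random.default_rng(2)

# Idealized P3: a Gaussian positivity peaking ~350 ms after stimulus onset
p3 = 5e-6 * np.exp(-((t - 0.35) ** 2) / (2 * 0.05 ** 2))

# Single trials = evoked component + background 'EEG' noise
trials = p3 + 2e-6 * rng.standard_normal((n_trials, t.size))
erp = trials.mean(axis=0)                    # averaging recovers the ERP

# Mean amplitude in a 300-400 ms window, one common P3 quantification
p3_mean_amp = erp[(t >= 0.3) & (t <= 0.4)].mean()
baseline = erp[t <= 0.0].mean()              # pre-stimulus reference
```

The study also reports peak amplitudes; the sketch computes only the mean-amplitude variant, which behaves similarly on this synthetic ERP.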
Affiliation(s)
- Murat Aksoy
- Centre for Anatomical and Human Sciences, Hull York Medical School, University of Hull, Hull, HU6 7RX, UK
- Chiedu E Ufodiama
- Centre for Anatomical and Human Sciences, Hull York Medical School, University of Hull, Hull, HU6 7RX, UK
- Anthony D Bateson
- Department of Engineering, Faculty of Science and Engineering, University of Hull, Cottingham Road, Hull, HU6 7RX, UK
- Stewart Martin
- School of Education and Social Sciences, University of Hull, Cottingham Road, Hull, HU6 7RX, UK
- Aziz U R Asghar
- Centre for Anatomical and Human Sciences, Hull York Medical School, University of Hull, Hull, HU6 7RX, UK
20
Edwards DJ, Trujillo LT. An Analysis of the External Validity of EEG Spectral Power in an Uncontrolled Outdoor Environment during Default and Complex Neurocognitive States. Brain Sci 2021; 11:330. [PMID: 33808022 PMCID: PMC7998369 DOI: 10.3390/brainsci11030330] [Citation(s) in RCA: 2]
Abstract
Traditionally, quantitative electroencephalography (QEEG) studies collect data within controlled laboratory environments that limit the external validity of scientific conclusions. To probe these validity limits, we used a mobile EEG system to record electrophysiological signals from human participants while they were located within a controlled laboratory environment and an uncontrolled outdoor environment exhibiting several moderate background influences. Participants performed two tasks during these recordings, one engaging brain activity related to several complex cognitive functions (number sense, attention, memory, executive function) and the other engaging two default brain states. We computed EEG spectral power over three frequency bands (theta: 4-7 Hz, alpha: 8-13 Hz, low beta: 14-20 Hz) where EEG oscillatory activity is known to correlate with the neurocognitive states engaged by these tasks. Null hypothesis significance testing yielded significant EEG power effects typical of the neurocognitive states engaged by each task, but only a beta-band power difference between the two background recording environments during the default brain state. Bayesian analysis showed that the remaining environment null effects were unlikely to reflect measurement insensitivities. This overall pattern of results supports the external validity of laboratory EEG power findings for complex and default neurocognitive states engaged within moderately uncontrolled environments.
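The band-power measures analyzed above can be illustrated with a generic periodogram sketch (synthetic data; real QEEG pipelines typically apply artifact rejection first and often use Welch averaging): spectral power is summarized within each canonical frequency band.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Mean periodogram power within [low, high] Hz."""
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * signal.size)
    freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

# Synthetic 'EEG': a dominant 10 Hz (alpha) rhythm buried in noise
fs = 256
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)

# The same three bands examined in the study
bands = {"theta": (4, 7), "alpha": (8, 13), "low_beta": (14, 20)}
powers = {name: band_power(eeg, fs, lo, hi) for name, (lo, hi) in bands.items()}
```

For this alpha-dominated signal, `powers["alpha"]` exceeds the theta and low-beta estimates, mirroring the kind of band-wise contrasts the study tests across tasks and environments.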
Affiliation(s)
- Dalton J. Edwards
- Department of Neuroscience, School of Behavioral and Brain Sciences, The University of Texas at Dallas, Dallas, TX 75080-3021, USA
- Department of Psychology, Texas State University, San Marcos, TX 78666, USA
- Logan T. Trujillo
- Department of Psychology, Texas State University, San Marcos, TX 78666, USA
21
Marín-Morales J, Llinares C, Guixeres J, Alcañiz M. Emotion Recognition in Immersive Virtual Reality: From Statistics to Affective Computing. Sensors (Basel) 2020; 20:5163. [PMID: 32927722 PMCID: PMC7570837 DOI: 10.3390/s20185163] [Citation(s) in RCA: 36]
Abstract
Emotions play a critical role in our daily lives, so the understanding and recognition of emotional responses is crucial for human research. Affective computing research has mostly used non-immersive two-dimensional (2D) images or videos to elicit emotional states. However, immersive virtual reality, which allows researchers to simulate environments in controlled laboratory conditions with high levels of sense of presence and interactivity, is becoming more popular in emotion research. Moreover, its synergy with implicit measurements and machine-learning techniques has the potential for impact across many research areas, opening new opportunities for the scientific community. This paper presents a systematic review of the emotion recognition research undertaken with physiological and behavioural measures using head-mounted displays as elicitation devices. The results highlight the evolution of the field, give a clear perspective using aggregated analysis, reveal the current open issues, and provide guidelines for future research.
Collapse
Affiliation(s)
- Javier Marín-Morales
- Instituto de Investigación e Innovación en Bioingeniería, Universitat Politècnica de València, 46022 València, Spain
22
Schubring D, Kraus M, Stolz C, Weiler N, Keim DA, Schupp H. Virtual Reality Potentiates Emotion and Task Effects of Alpha/Beta Brain Oscillations. Brain Sci 2020; 10:537. [PMID: 32784990 PMCID: PMC7465872 DOI: 10.3390/brainsci10080537] [Citation(s) in RCA: 11]
Abstract
Technological progress has increased research on neuropsychological emotion and attention with virtual reality (VR). However, direct comparisons between conventional two-dimensional (2D) and VR stimulation are lacking. Thus, the present study compared electroencephalography (EEG) correlates of an explicit task and implicit emotional attention between 2D and VR stimulation. Participants (n = 16) viewed angry and neutral faces of equal size and distance in both 2D and VR, while they were asked to count one of the two facial expressions. For the main effects of emotion (angry vs. neutral) and task (target vs. nontarget), established event-related potentials (ERPs), namely the late positive potential (LPP) and the target P300, were replicated. VR stimulation led to overall larger ERPs than 2D but did not interact with the emotion or task effects. In the frequency domain, alpha/beta activity was larger in VR than in 2D stimulation already in the baseline period. Of note, while alpha/beta event-related desynchronization (ERD) for the emotion and task conditions was seen in both VR and 2D stimulation, these effects were significantly stronger in VR than in 2D. These results suggest that the enhanced immersion in the stimulus materials enabled by VR technology can potentiate induced brain-oscillation effects for implicit emotion and explicit task processing.
Affiliation(s)
- David Schubring
- Department of Psychology, University of Konstanz, 78457 Konstanz, Germany
- Matthias Kraus
- Department of Computer and Information Science, University of Konstanz, 78457 Konstanz, Germany
- Christopher Stolz
- Department of Psychology, University of Marburg, 35032 Marburg, Germany
- Niklas Weiler
- Department of Computer and Information Science, University of Konstanz, 78457 Konstanz, Germany
- Daniel A. Keim
- Department of Computer and Information Science, University of Konstanz, 78457 Konstanz, Germany
- Harald Schupp
- Department of Psychology, University of Konstanz, 78457 Konstanz, Germany
23
Neural indices of orienting, discrimination, and conflict monitoring after contextual fear and safety learning. Cogn Affect Behav Neurosci 2020; 20:917-927. [PMID: 32720204 DOI: 10.3758/s13415-020-00810-8] [Citation(s) in RCA: 4]
Abstract
Investigations of fear conditioning have recently begun to evaluate contextual factors that affect attention-related processes. However, much of the extant literature does not evaluate how contextual fear learning influences neural indicators of attentional processes during goal-directed activity. The current study evaluated how early attention for task-relevant stimuli and conflict monitoring were affected when presented within task-irrelevant safety and threat contexts after fear learning. Participants (N = 72) completed a Flanker task with modified context before and after context-dependent fear learning. Flanker stimuli were presented in the same threat and safety contexts utilized in the fear learning task while EEG was collected. Results indicated increased early attention (N1) to flankers appearing in threat contexts and later increased neural indicators (P2) of attention to flankers appearing in safety contexts. Results of this study indicate that contextual fear learning modulates early attentional processes for task-relevant stimuli that appear in the context of safety and threat. Theoretical and clinical implications are discussed.
24
Burt AL, Crewther DP. The 4D Space-Time Dimensions of Facial Perception. Front Psychol 2020; 11:1842. [PMID: 32849084 PMCID: PMC7399249 DOI: 10.3389/fpsyg.2020.01842] [Citation(s) in RCA: 4]
Abstract
Facial information is a powerful channel for human-to-human communication. Faces can be characterized as biological objects that are four-dimensional (4D) patterns: they have concurrently a spatial structure and surface as well as temporal dynamics. The spatial characteristics of facial objects comprise a volume and surface in three dimensions (3D), namely breadth, height and, importantly, depth. The temporal properties of facial objects are defined by how a 3D facial structure and surface evolve dynamically over time, where time is referred to as the fourth dimension (4D). Our entire perception of another’s face, whether social, affective or cognitive, is therefore built on a combination of 3D and 4D visual cues. Counterintuitively, over the past few decades of experimental research in psychology, facial stimuli have largely been captured, reproduced and presented to participants in two dimensions (2D), while remaining largely static. The following review aims to advance and update facial researchers on the recent revolution in computer-generated, realistic 4D facial models produced from real-life human subjects. We delve in-depth to summarize recent studies that have utilized facial stimuli possessing 3D structural and surface cues (geometry, surface and depth) and 4D temporal cues (3D structure plus dynamic viewpoint and movement). In sum, we have found that higher-order perceptions such as identity, gender, ethnicity, emotion and personality are critically influenced by 4D characteristics. In future, it is recommended that facial stimuli incorporate the 4D space-time perspective with the proposed time-resolved methods.
Affiliation(s)
- Adelaide L Burt
- Centre for Human Psychopharmacology, Swinburne University of Technology, Melbourne, VIC, Australia
- David P Crewther
- Centre for Human Psychopharmacology, Swinburne University of Technology, Melbourne, VIC, Australia
25
Rischer KM, Savallampi M, Akwaththage A, Salinas Thunell N, Lindersson C, MacGregor O. In context: emotional intent and temporal immediacy of contextual descriptions modulate affective ERP components to facial expressions. Soc Cogn Affect Neurosci 2020; 15:551-560. [PMID: 32440673 PMCID: PMC7328032 DOI: 10.1093/scan/nsaa071] [Citation(s) in RCA: 4]
Abstract
In this study, we explored how contextual information about threat dynamics affected the electrophysiological correlates of face perception. Forty-six healthy native Swedish speakers read verbal descriptions signaling an immediate vs delayed intent to escalate or deescalate an interpersonal conflict. Each verbal description was followed by a face with an angry or neutral expression, for which participants rated valence and arousal. Affective ratings confirmed that the emotional intent expressed in the descriptions modulated emotional reactivity to the facial stimuli in the expected direction. The electrophysiological data showed that compared to neutral faces, angry faces resulted in enhanced early and late event-related potentials (VPP, P300 and LPP). Additionally, emotional intent and temporal immediacy modulated the VPP and P300 similarly across angry and neutral faces, suggesting that they influence early face perception independently of facial affect. By contrast, the LPP amplitude to faces revealed an interaction between facial expression and emotional intent. Deescalating descriptions eliminated the LPP differences between angry and neutral faces. Together, our results suggest that information about a person’s intentions modulates the processing of facial expressions.
Affiliation(s)
- Katharina M Rischer
- Department of Behavioural and Cognitive Sciences, Research Institute of Health and Behaviour, University of Luxembourg, 4366 Esch-sur-Alzette, Luxembourg
- Mattias Savallampi
- Department of Clinical and Experimental Medicine (IKE), Center for Social and Affective Neuroscience (CSAN), Linköping University, 581 83 Linköping, Sweden
- Anushka Akwaththage
- Department of Cognitive Neuroscience and Philosophy, School of Bioscience, University of Skövde, 541 28 Skövde, Sweden
- Nicole Salinas Thunell
- Department of Cognitive Neuroscience and Philosophy, School of Bioscience, University of Skövde, 541 28 Skövde, Sweden
- Carl Lindersson
- Department of Cognitive Neuroscience and Philosophy, School of Bioscience, University of Skövde, 541 28 Skövde, Sweden
- Oskar MacGregor
- Department of Cognitive Neuroscience and Philosophy, School of Bioscience, University of Skövde, 541 28 Skövde, Sweden