1
Rutkowska JM, Ghilardi T, Vacaru SV, van Schaik JE, Meyer M, Hunnius S, Oostenveld R. Optimal processing of surface facial EMG to identify emotional expressions: A data-driven approach. Behav Res Methods 2024; 56:7331-7344. [PMID: 38773029] [PMCID: PMC11362446] [DOI: 10.3758/s13428-024-02421-4]
Abstract
Surface facial electromyography (EMG) is commonly used to detect emotions from subtle facial expressions. Although there are established procedures for collecting EMG data and some aspects of their processing, there is little agreement among researchers about the optimal way to process the EMG signal, so that the study-unrelated variability (noise) is removed, and the emotion-related variability is best detected. The aim of the current paper was to establish an optimal processing pipeline for EMG data for identifying emotional expressions in facial muscles. We identified the most common processing steps from existing literature and created 72 processing pipelines that represented all the different processing choices. We applied these pipelines to a previously published dataset from a facial mimicry experiment, where 100 adult participants observed happy and sad facial expressions, whilst the activity of their facial muscles, zygomaticus major and corrugator supercilii, was recorded with EMG. We used a resampling approach and subsets of the original data to investigate the effect and robustness of different processing choices on the performance of a logistic regression model that predicted the mimicked emotion (happy/sad) from the EMG signal. In addition, we used a random forest model to identify the most important processing steps for the sensitivity of the logistic regression model. Three processing steps were found to be most impactful: baseline correction, standardisation within muscles, and standardisation within subjects. The chosen feature of interest and the signal averaging had little influence on the sensitivity to the effect. We recommend an optimal processing pipeline, share our code and data, and provide a step-by-step walkthrough for researchers.
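The impact of baseline correction, the most influential processing step above, can be illustrated with a minimal, self-contained sketch. This is not the authors' released pipeline code: the toy EMG trials, the per-trial drift, the 0.4 effect size, and the simple separability score are all invented stand-ins for the real dataset and logistic-regression sensitivity analysis.

```python
import random
import statistics as stats

random.seed(1)

BASELINE_END, N_SAMPLES = 50, 200
labels = [0] * 20 + [1] * 20  # 0 = sad trials, 1 = happy trials

# Toy rectified-EMG trials: a slow per-trial offset (e.g. electrode drift)
# contaminates both windows; the emotion effect appears only post-baseline.
trials = []
for lab in labels:
    drift = random.gauss(0.0, 0.5)
    pre = [drift + random.gauss(1.0, 0.3) for _ in range(BASELINE_END)]
    post = [drift + random.gauss(1.0 + 0.4 * lab, 0.3)
            for _ in range(N_SAMPLES - BASELINE_END)]
    trials.append(pre + post)

def feature(trial, baseline_correct):
    """Mean response-window amplitude, optionally baseline-corrected."""
    base = stats.mean(trial[:BASELINE_END]) if baseline_correct else 0.0
    return stats.mean(trial[BASELINE_END:]) - base

def separability(baseline_correct):
    """Effect-size-like score separating happy from sad trial features."""
    feats = [feature(t, baseline_correct) for t in trials]
    happy = [f for f, l in zip(feats, labels) if l]
    sad = [f for f, l in zip(feats, labels) if not l]
    return abs(stats.mean(happy) - stats.mean(sad)) / stats.stdev(feats)

d_corrected = separability(True)   # drift cancels out
d_raw = separability(False)        # drift noise dilutes the emotion effect
```

With the drift removed by baseline correction, the happy/sad separation is markedly larger, which mirrors why this step mattered most for the sensitivity of the downstream classifier.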
Affiliation(s)
- J M Rutkowska
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
- Department of Psychology, University of Zurich, Zurich, Switzerland
- Jacobs Center for Productive Youth Development, University of Zurich, Zurich, Switzerland
- T Ghilardi
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
- Centre for Brain and Cognitive Development, Department of Psychological Sciences, Birkbeck, University of London, London, UK
- S V Vacaru
- Vrije Universiteit Amsterdam, Amsterdam, the Netherlands
- Department of Psychology, New York University - Abu Dhabi, Abu Dhabi, United Arab Emirates
- J E van Schaik
- Behavioral Science Institute, Radboud University, Nijmegen, The Netherlands
- M Meyer
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
- S Hunnius
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
- R Oostenveld
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
- NatMEG, Karolinska Institutet, Stockholm, Sweden
2
Aktar E, Venetikidi M, Bockstaele BV, Giessen DVD, Pérez-Edgar K. Pupillary Responses to Dynamic Negative Versus Positive Facial Expressions of Emotion in Children and Parents: Links to Depression and Anxiety. Dev Psychobiol 2024; 66:e22522. [PMID: 38967122] [DOI: 10.1002/dev.22522]
Abstract
Witnessing emotional expressions in others triggers physiological arousal in humans. The current study focused on pupil responses to emotional expressions in a community sample as a physiological index of arousal and attention. We explored the associations between parents' and offspring's responses to dynamic facial expressions of emotion, as well as the links between pupil responses and anxiety/depression. Children (N = 90, MAge = 10.13, range = 7.21-12.94, 47 girls) participated in this lab study with one of their parents (47 mothers). Pupil responses were assessed in a computer task with dynamic happy, angry, fearful, and sad expressions, while participants verbally labeled the emotion displayed on the screen as quickly as possible. Parents and children reported anxiety and depression symptoms in questionnaires. Both parents and children showed stronger pupillary responses to negative versus positive expressions, and children's responses were overall stronger than those of parents. We also found links between the pupil responses of parents and children to negative, especially to angry faces. Child pupil responses were related to their own and their parents' anxiety levels and to their parents' (but not their own) depression. We conclude that child pupils are sensitive to individual differences in parents' pupils and emotional dispositions in community samples.
Affiliation(s)
- Evin Aktar
- Department of Clinical Psychology, Leiden University, Leiden, The Netherlands
- Marianna Venetikidi
- Department of Clinical Psychology, Leiden University, Leiden, The Netherlands
- Bram van Bockstaele
- Research Institute Child Development and Education, University of Amsterdam, Amsterdam, The Netherlands
- Danielle van der Giessen
- Research Institute Child Development and Education, University of Amsterdam, Amsterdam, The Netherlands
- Koraly Pérez-Edgar
- Child Study Center, The Pennsylvania State University, University Park, Pennsylvania, USA
3
Botta A, Zhao M, Samogin J, Pelosin E, Bonassi G, Lagravinese G, Mantini D, Avenanti A, Avanzino L. Early modulations of neural oscillations during the processing of emotional body language. Psychophysiology 2024; 61:e14436. [PMID: 37681463] [DOI: 10.1111/psyp.14436]
Abstract
The processing of threat-related emotional body language (EBL) has been shown to engage sensorimotor cortical areas early on and induce freezing in the observers' motor system, particularly when observing fearful EBL. To provide insights into the interplay between somatosensory and motor areas during observation of EBL, here, we used high-density electroencephalography (hd-EEG) in healthy humans while they observed EBL stimuli involving fearful and neutral expressions. To capture early sensorimotor brain response, we focused on P100 fronto-central event-related potentials (ERPs) and event-related desynchronization/synchronization (ERD/ERS) in the mu-alpha (8-13 Hz) and lower beta (13-20 Hz) bands over the primary motor (M1) and somatosensory (S1) cortices. Source-level ERP and ERD/ERS analyses were conducted using eLORETA. Results revealed higher P100 amplitudes in motor and premotor channels for 'Neutral' compared with 'Fear'. Additionally, analysis of ERD/ERS showed increased beta band desynchronization in M1 for 'Neutral', and the opposite pattern in S1. Source-level estimation showed significant differences between conditions mainly observed in the beta band over sensorimotor areas. These findings provide high-temporal resolution evidence suggesting that seeing fearful EBL induces early activation of somatosensory areas, which in turn could suppress M1 activity. These findings highlight early dynamics within the observer's sensorimotor system and hint at a sensorimotor mechanism supporting freezing during the processing of EBL.
Affiliation(s)
- Mingqi Zhao
- Movement Control and Neuroplasticity Research Group, KU Leuven, Leuven, Belgium
- Jessica Samogin
- Movement Control and Neuroplasticity Research Group, KU Leuven, Leuven, Belgium
- Elisa Pelosin
- IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, and Maternal Child Health (DINOGMI), University of Genoa, Genoa, Italy
- Gaia Bonassi
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, and Maternal Child Health (DINOGMI), University of Genoa, Genoa, Italy
- Giovanna Lagravinese
- IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, and Maternal Child Health (DINOGMI), University of Genoa, Genoa, Italy
- Dante Mantini
- Movement Control and Neuroplasticity Research Group, KU Leuven, Leuven, Belgium
- Alessio Avenanti
- Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Campus Cesena, Alma Mater Studiorum Università di Bologna, Cesena, Italy
- Centro de Investigación en Neuropsicología y Neurociencias Cognitivas, Universidad Católica del Maule, Talca, Chile
- Laura Avanzino
- IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Department of Experimental Medicine (DIMES), Section of Human Physiology, University of Genoa, Genoa, Italy
4
Day SE, Krumhuber EG, Shore DM. The reciprocal relationship between smiles and situational contexts. Cogn Emot 2023; 37:1230-1247. [PMID: 37776238] [DOI: 10.1080/02699931.2023.2258488]
Abstract
Smiles provide information about a social partner's affect and intentions during social interaction. Although always encountered within a specific situation, the influence of contextual information on smile evaluation has not been widely investigated. Moreover, little is known about the reciprocal effect of smiles on evaluations of their accompanying situations. In this research, we assessed how different smile types and situational contexts affected participants' social evaluations. In Study 1, 85 participants rated reward, affiliation, and dominance smiles embedded within either enjoyable, polite, or negative (unpleasant) situations. Context had a strong effect on smile ratings, such that smiles in enjoyable situations were rated as more genuine and joyful, as well as indicating less superiority than those in negative situations. In Study 2, 200 participants evaluated the situations that these smiles were perceived within (rather than the smiles themselves). Although situations paired with reward (vs. affiliation) smiles tended to be rated more positively, this effect was absent for negative situations. Ultimately, the findings point toward a reciprocal relationship between smiles and contexts, whereby the face influences evaluations of the situation and vice versa.
Affiliation(s)
- Samuel E Day
- Department of Experimental Psychology, University of Oxford, Oxford, UK
- Eva G Krumhuber
- Department of Experimental Psychology, University College London, London, UK
- Danielle M Shore
- Department of Experimental Psychology, University of Oxford, Oxford, UK
5
Presti P, Galasso GM, Ruzzon D, Avanzini P, Caruana F, Rizzolatti G, Vecchiato G. Architectural experience influences the processing of others' body expressions. Proc Natl Acad Sci U S A 2023; 120:e2302215120. [PMID: 37782807] [PMCID: PMC10576150] [DOI: 10.1073/pnas.2302215120]
Abstract
The interplay between space and cognition is a crucial issue in neuroscience, leading to the development of multiple research fields. However, the relationship between architectural space and the movement and interactions of its inhabitants has too often been neglected, failing to provide a unifying view of architecture's capacity to modulate social cognition broadly. We bridge this gap by asking participants to judge avatars' emotional expression (high vs. low arousal) at the end of their promenade inside high- or low-arousing architectures. Stimuli were presented in virtual reality to ensure a dynamic, naturalistic experience. High-density electroencephalography (EEG) was recorded to assess the neural responses to the avatar's presentation. Observing highly aroused avatars increased Late Positive Potentials (LPP), in line with previous evidence. Strikingly, 250 ms before the occurrence of the LPP, P200 amplitude increased due to the experience of low-arousing architectures, reflecting greater early attention during the processing of body expressions. In addition, participants stared longer at the avatar's head and judged the observed posture as more arousing. Source localization highlighted a contribution of the dorsal premotor cortex to both P200 and LPP. In conclusion, the immersive and dynamic architectural experience modulates human social cognition. In addition, the motor system plays a role in processing architecture and body expressions, suggesting that the interplay between space and social cognition is rooted in overlapping neural substrates. This study demonstrates that the manipulation of mere architectural space is sufficient to influence human social cognition.
Affiliation(s)
- Paolo Presti
- Institute of Neuroscience, National Research Council of Italy, Parma 43125, Italy
- Department of Medicine and Surgery, University of Parma, Parma 43125, Italy
- Gaia Maria Galasso
- Department of Medicine and Surgery, University of Parma, Parma 43125, Italy
- Davide Ruzzon
- Dipartimento di Culture del Progetto, IUAV University, Venice 30135, Italy
- TUNED, Lombardini22 s.p.a., Milan 20143, Italy
- Pietro Avanzini
- Institute of Neuroscience, National Research Council of Italy, Parma 43125, Italy
- Fausto Caruana
- Institute of Neuroscience, National Research Council of Italy, Parma 43125, Italy
- Giacomo Rizzolatti
- Institute of Neuroscience, National Research Council of Italy, Parma 43125, Italy
- Giovanni Vecchiato
- Institute of Neuroscience, National Research Council of Italy, Parma 43125, Italy
- Department of Medicine and Surgery, University of Parma, Parma 43125, Italy
6
Mulder MJ, Prummer F, Terburg D, Kenemans JL. Drift-diffusion modeling reveals that masked faces are preconceived as unfriendly. Sci Rep 2023; 13:16982. [PMID: 37813970] [PMCID: PMC10562405] [DOI: 10.1038/s41598-023-44162-y]
Abstract
During the COVID-19 pandemic, the use of face masks has become a daily routine. Studies have shown that face masks increase the ambiguity of facial expressions which not only affects (the development of) emotion recognition, but also interferes with social interaction and judgement. To disambiguate facial expressions, we rely on perceptual (stimulus-driven) as well as preconceptual (top-down) processes. However, it is unknown which of these two mechanisms accounts for the misinterpretation of masked expressions. To investigate this, we asked participants (N = 136) to decide whether ambiguous (morphed) facial expressions, with or without a mask, were perceived as friendly or unfriendly. To test for the independent effects of perceptual and preconceptual biases we fitted a drift-diffusion model (DDM) to the behavioral data of each participant. Results show that face masks induce a clear loss of information leading to a slight perceptual bias towards friendly choices, but also a clear preconceptual bias towards unfriendly choices for masked faces. These results suggest that, although face masks can increase the perceptual friendliness of faces, people have the prior preconception to interpret masked faces as unfriendly.
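The dissociation the model exploits can be sketched in a few lines. In a drift-diffusion model, a perceptual (stimulus-driven) bias maps onto the drift rate v, while a preconceptual (top-down) bias maps onto the starting point z of the evidence-accumulation process. The simulation below is a hypothetical illustration, not the paper's fitting procedure, and the parameter values are arbitrary.

```python
import random

random.seed(2)

def ddm_choice(v, z, a=1.0, dt=0.001, noise=1.0):
    """One trial: evidence starts at z*a and drifts with rate v plus
    Gaussian noise until it hits the upper (friendly) or lower
    (unfriendly) decision bound."""
    x = z * a
    while 0.0 < x < a:
        x += v * dt + noise * random.gauss(0.0, dt ** 0.5)
    return "friendly" if x >= a else "unfriendly"

def p_unfriendly(v, z, n=2000):
    """Monte Carlo estimate of the probability of an unfriendly choice."""
    return sum(ddm_choice(v, z) == "unfriendly" for _ in range(n)) / n

# No bias: about half the choices are unfriendly.
p_neutral = p_unfriendly(v=0.0, z=0.50)
# Starting point shifted toward the unfriendly bound (preconceptual bias):
# unfriendly choices become more frequent even with identical evidence.
p_start_bias = p_unfriendly(v=0.0, z=0.35)
```

Fitting z and v separately per participant is what lets the DDM attribute the unfriendly preconception of masked faces to the starting point rather than to the evidence itself.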
Affiliation(s)
- Martijn J Mulder
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Franziska Prummer
- School of Computing and Communications, Lancaster University, Lancaster, UK
- David Terburg
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- J Leon Kenemans
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
7
Dapor C, Sperandio I, Meconi F. Fading boundaries between the physical and the social world: Insights and novel techniques from the intersection of these two fields. Front Psychol 2023; 13:1028150. [PMID: 36861005] [PMCID: PMC9969107] [DOI: 10.3389/fpsyg.2022.1028150]
Abstract
This review focuses on the subtle interactions between sensory input and social cognition in visual perception. We suggest that body indices, such as gait and posture, can mediate such interactions. Recent trends in cognitive research are trying to overcome approaches that define perception as stimulus-centered and are pointing toward a more embodied agent-dependent perspective. According to this view, perception is a constructive process in which sensory inputs and motivational systems contribute to building an image of the external world. A key notion emerging from new theories on perception is that the body plays a critical role in shaping our perception. Depending on our arm's length, height and capacity of movement, we create our own image of the world based on a continuous compromise between sensory inputs and expected behavior. We use our bodies as natural "rulers" to measure both the physical and the social world around us. We point out the necessity of an integrative approach in cognitive research that takes into account the interplay between social and perceptual dimensions. To this end, we review long-established and novel techniques aimed at measuring bodily states and movements, and their perception, with the assumption that only by combining the study of visual perception and social cognition can we deepen our understanding of both fields.
Affiliation(s)
- Cecilia Dapor
- Department of Psychology and Cognitive Science, University of Trento, Rovereto, Italy
8
Seinfeld S, Hortensius R, Arroyo-Palacios J, Iruretagoyena G, Zapata LE, de Gelder B, Slater M, Sanchez-Vives MV. Domestic Violence From a Child Perspective: Impact of an Immersive Virtual Reality Experience on Men With a History of Intimate Partner Violent Behavior. J Interpers Violence 2023; 38:2654-2682. [PMID: 35727942] [DOI: 10.1177/08862605221106130]
Abstract
Domestic violence has long-term negative consequences on children. In this study, men with a history of partner aggression and a control group of non-offenders were embodied in a child's body from a first-person perspective in virtual reality (VR). From this perspective, participants witnessed a scene of domestic violence where a male avatar assaulted a female avatar. We evaluated the impact of the experience on emotion recognition skills and heart rate deceleration responses. We found that the experience mainly impacted the recognition of angry facial expressions. The results also indicate that males with a history of partner aggression had larger physiological responses during an explicit violent event (when the virtual abuser threw a telephone) compared with controls, while their physiological reactions were less pronounced when the virtual abuser invaded the victim's personal space. We show that embodiment from a child's perspective during a conflict situation in VR impacts emotion recognition, physiological reactions, and attitudes towards violence. We provide initial evidence of the potential of VR in the rehabilitation and neuropsychological assessment of males with a history of domestic violence, especially in relation to children.
Affiliation(s)
- Sofia Seinfeld
- Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain
- EVENT Lab, Department of Clinical Psychology and Psychobiology, University of Barcelona, Barcelona, Spain
- Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
- Ruud Hortensius
- Brain and Emotion Laboratory, Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
- Department of Psychology, Utrecht University, Utrecht, Netherlands
- Jorge Arroyo-Palacios
- EVENT Lab, Department of Clinical Psychology and Psychobiology, University of Barcelona, Barcelona, Spain
- Guillermo Iruretagoyena
- EVENT Lab, Department of Clinical Psychology and Psychobiology, University of Barcelona, Barcelona, Spain
- Luis E Zapata
- Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain
- Beatrice de Gelder
- Brain and Emotion Laboratory, Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
- Mel Slater
- EVENT Lab, Department of Clinical Psychology and Psychobiology, University of Barcelona, Barcelona, Spain
- Institute of Neurosciences of the University of Barcelona, Barcelona, Spain
- Maria V Sanchez-Vives
- Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain
- EVENT Lab, Department of Clinical Psychology and Psychobiology, University of Barcelona, Barcelona, Spain
- Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
9
Zijlstra TW, van Berlo E, Kret ME. Attention Towards Pupil Size in Humans and Bonobos (Pan paniscus). Affect Sci 2022; 3:761-771. [PMID: 36519142] [PMCID: PMC9743857] [DOI: 10.1007/s42761-022-00146-1]
Abstract
Previous work has established that humans have an attentional bias towards emotional signals, and there is some evidence that this phenomenon is shared with bonobos, our closest relatives. Although many emotional signals are explicit and overt, implicit cues such as pupil size also contain emotional information for observers. Pupil size can impact social judgment and foster trust and social support, and is automatically mimicked, suggesting a communicative role. While an attentional bias towards more obvious emotional expressions has been shown, it is unclear whether this also extends to a more subtle implicit cue, like changes in pupil size. Therefore, the current study investigated whether attention is biased towards pupils of differing sizes in humans and bonobos. A total of 150 human participants (141 female), with a mean age of 19.13 (ranging from 18 to 32 years old), completed an online dot-probe task. Four female bonobos (6 to 17 years old) completed the dot-probe task presented via a touch screen. We used linear mixed multilevel models to examine the effect of pupil size on reaction times. In humans, our analysis showed a small but significant attentional bias towards dilated pupils compared to intermediate-sized pupils, and towards intermediate-sized pupils compared to small pupils. Our analysis did not show a significant effect in bonobos. These results suggest that the attentional bias towards emotions in humans can be extended to a subtle, unconsciously produced signal, namely changes in pupil size. Due to methodological differences between the two experiments, more research is needed before drawing a conclusion regarding bonobos.
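The dot-probe logic reduces to a reaction-time comparison: responses should be faster when the probe replaces the stimulus that had already captured attention. A toy sketch with simulated reaction times follows; the 25 ms effect and all values are invented for illustration and are not the study's data.

```python
import random
import statistics as stats

random.seed(3)

# Simulated probe-response times in ms for 80 trials per condition.
# Congruent: the probe appears where the dilated-pupil image was shown.
congruent = [random.gauss(380.0, 30.0) for _ in range(80)]
# Incongruent: the probe appears behind the other (small-pupil) image.
incongruent = [random.gauss(405.0, 30.0) for _ in range(80)]

# A positive bias score means faster responses at the dilated-pupil
# location, i.e. an attentional bias towards dilated pupils.
bias_ms = stats.mean(incongruent) - stats.mean(congruent)
```

In the study this comparison is made with linear mixed models over trials and participants rather than a raw mean difference, but the sign of the effect carries the same interpretation.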
Affiliation(s)
- T. W. Zijlstra
- Cognitive Psychology Unit, Institute of Psychology, Leiden University, Leiden, the Netherlands
- Leiden Institute for Brain and Cognition (LIBC), Leiden, the Netherlands
- E. van Berlo
- Cognitive Psychology Unit, Institute of Psychology, Leiden University, Leiden, the Netherlands
- Leiden Institute for Brain and Cognition (LIBC), Leiden, the Netherlands
- Institute for Biodiversity and Ecosystem Dynamics, University of Amsterdam, Amsterdam, the Netherlands
- M. E. Kret
- Cognitive Psychology Unit, Institute of Psychology, Leiden University, Leiden, the Netherlands
- Leiden Institute for Brain and Cognition (LIBC), Leiden, the Netherlands
10
Calbi M, Montalti M, Pederzani C, Arcuri E, Umiltà MA, Gallese V, Mirabella G. Emotional body postures affect inhibitory control only when task-relevant. Front Psychol 2022; 13:1035328. [PMID: 36405118] [PMCID: PMC9669573] [DOI: 10.3389/fpsyg.2022.1035328]
Abstract
A classical theoretical frame to interpret motor reactions to emotional stimuli is that such stimuli, particularly those threat-related, are processed preferentially, i.e., they are capable of capturing and grabbing attention automatically. Research has recently challenged this view, showing that the task relevance of emotional stimuli is crucial to having a reliable behavioral effect. Such evidence indicated that emotional facial expressions do not automatically influence motor responses in healthy young adults, but they do so only when intrinsically pertinent to the ongoing subject's goals. Given the theoretical relevance of these findings, it is essential to assess their generalizability to different, socially relevant emotional stimuli such as emotional body postures. To address this issue, we compared the performance of 36 right-handed participants in two different versions of a Go/No-go task. In the Emotional Discrimination task, participants were required to withhold their responses at the display of emotional body postures (fearful or happy) and to move at the presentation of neutral postures. Differently, in the control task, the same images were shown, but participants had to respond according to the color of the actor/actress' t-shirt, disregarding the emotional content. Results showed that participants made more commission errors (instances in which they moved even though the No-go signal was presented) for happy than fearful body postures in the Emotional Discrimination task. However, this difference disappeared in the control task. Such evidence indicates that, like facial emotion, emotional body expressions do not influence motor control automatically, but only when they are task-relevant.
Affiliation(s)
- Marta Calbi
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Lab Neuroscience & Humanities, University of Parma, Parma, Italy
- Department of Philosophy, State University of Milan, Milan, Italy
- Martina Montalti
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Lab Neuroscience & Humanities, University of Parma, Parma, Italy
- Department of Clinical and Experimental Sciences, University of Brescia, Brescia, Italy
- Carlotta Pederzani
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Edoardo Arcuri
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Lab Neuroscience & Humanities, University of Parma, Parma, Italy
- Maria Alessandra Umiltà
- Lab Neuroscience & Humanities, University of Parma, Parma, Italy
- Department of Food and Drug Sciences, University of Parma, Parma, Italy
- Vittorio Gallese
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Lab Neuroscience & Humanities, University of Parma, Parma, Italy
- Giovanni Mirabella
- Department of Clinical and Experimental Sciences, University of Brescia, Brescia, Italy
- IRCCS Neuromed, Pozzilli, Italy
11
Albohn DN, Brandenburg JC, Kveraga K, Adams RB. The shared signal hypothesis: Facial and bodily expressions of emotion mutually inform one another. Atten Percept Psychophys 2022; 84:2271-2280. [PMID: 36045309] [PMCID: PMC9509690] [DOI: 10.3758/s13414-022-02548-6]
Abstract
Decades of research show that contextual information from the body, visual scene, and voices can facilitate judgments of facial expressions of emotion. To date, most research suggests that bodily expressions of emotion offer context for interpreting facial expressions, but not vice versa. The present research aimed to investigate the conditions under which mutual processing of facial and bodily displays of emotion facilitate and/or interfere with emotion recognition. In the current two studies, we examined whether body and face emotion recognition are enhanced through integration of shared emotion cues, and/or hindered through mixed signals (i.e., interference). We tested whether faces and bodies facilitate or interfere with emotion processing by pairing briefly presented (33 ms), backward-masked presentations of faces with supraliminally presented bodies (Experiment 1) and vice versa (Experiment 2). Both studies revealed strong support for integration effects, but not interference. Integration effects are most pronounced for low-emotional clarity facial and bodily expressions, suggesting that when more information is needed in one channel, the other channel is recruited to disentangle any ambiguity. That this occurs for briefly presented, backward-masked presentations reveals low-level visual integration of shared emotional signal value.
Affiliation(s)
- Daniel N Albohn
- Booth School of Business, The University of Chicago, Chicago, IL, USA
- Joseph C Brandenburg
- Department of School Psychology, The Pennsylvania State University, University Park, PA, USA
- Kestutis Kveraga
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Boston, MA, USA
- Department of Radiology, Harvard Medical School, Boston, MA, USA
- Reginald B Adams
- Department of Psychology, The Pennsylvania State University, University Park, PA, USA
12
Karim AKMR, Proulx MJ, de Sousa AA, Likova LT. Do we enjoy what we sense and perceive? A dissociation between aesthetic appreciation and basic perception of environmental objects or events. Cogn Affect Behav Neurosci 2022; 22:904-951. [PMID: 35589909] [PMCID: PMC10159614] [DOI: 10.3758/s13415-022-01004-0]
Abstract
This integrative review rearticulates the notion of human aesthetics by critically appraising the conventional definitions, offering a new, more comprehensive definition, and identifying the fundamental components associated with it. It intends to advance a holistic understanding of the notion by differentiating aesthetic perception from basic perceptual recognition, and by characterizing these concepts from the perspective of information processing in both visual and nonvisual modalities. To this end, we analyze the dissociative nature of information processing in the brain, introducing a novel local-global integrative model that differentiates aesthetic processing from basic perceptual processing. This model builds on the current state of the art in visual aesthetics as well as newer propositions about nonvisual aesthetics. It comprises two analytic channels: an aesthetics-only channel and a perception-to-aesthetics channel. The aesthetics-only channel primarily involves restricted local processing for quality or richness (e.g., attractiveness, beauty/prettiness, elegance, sublimeness, catchiness, hedonic value) analysis, whereas the perception-to-aesthetics channel involves global/extended local processing for basic feature analysis, followed by restricted local processing for quality or richness analysis. We contend that aesthetic processing operates independently of basic perceptual processing, but not independently of cognitive processing. We further conjecture that there might be a common faculty, labeled the aesthetic cognition faculty, in the human brain for all sensory aesthetics, although other parts of the brain can also be activated because of basic sensory processing prior to aesthetic processing, particularly during the operation of the second channel. This generalized model can account not only for simple and pure aesthetic experiences but for partial and complex aesthetic experiences as well.
Affiliation(s)
- A K M Rezaul Karim
- Department of Psychology, University of Dhaka, Dhaka, 1000, Bangladesh.
- Envision Research Institute, 610 N. Main St., Wichita, KS, USA.
- The Smith-Kettlewell Eye Research Institute, 2318 Fillmore St., San Francisco, CA, USA.
- Lora T Likova
- The Smith-Kettlewell Eye Research Institute, 2318 Fillmore St., San Francisco, CA, USA
|
13
|
Folz J, Fiacchino D, Nikolić M, van Steenbergen H, Kret ME. Reading Your Emotions in My Physiology? Reliable Emotion Interpretations in Absence of a Robust Physiological Resonance. Affect Sci 2022; 3:480-497. [PMID: 35282156 PMCID: PMC8901434 DOI: 10.1007/s42761-021-00083-5] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 12/16/2020] [Accepted: 09/12/2021] [Indexed: 11/26/2022]
Abstract
Affective states are expressed in an individual’s physical appearance, ranging from facial expressions and body postures to indicators of physiological arousal (e.g., a blush). Confirming the claimed communicative function of these markers, humans are capable of distinguishing between a variety of discrete emotion displays. In an attempt to explain the underlying mechanism, characteristic bodily changes within the observer, including physiological arousal and mimicry, have been suggested to facilitate the interpretation of an expression. The current study aims to create a holistic picture of emotion perception by (1) using three different sources of emotional information (prototypical facial expressions, bodily expressions, and subtle facial cues) and (2) measuring changes in multiple physiological signals (facial electromyography, skin conductance level, skin temperature, and pupil size). While participants clearly discriminated between perceived emotional expressions, there was no overall one-to-one correspondence with their physiological responses. Some specific but robust effects were observed. Angry facial expressions were consistently responded to with a peak in skin conductance level. Furthermore, sad body expressions were associated with a drop in skin temperature. In addition to being the best recognized expression, viewing happy faces elicited congruent facial muscle responses, which supports the potential role of embodied simulation in emotion recognition. Lastly, tears were not only rated as highly emotionally intense but also evoked a peak in skin conductance level in the observer. The absence of distinct physiological responses to other expressions could be explained by the limited functionality of affect sharing in a non-interactive experimental context. Consequently, emotional alignment in body and mind might especially take place in real social situations, which should be considered in future research.
Affiliation(s)
- Julia Folz
- Department of Cognitive Psychology, Institute of Psychology, Leiden University, Leiden, 2333 AK The Netherlands
- Leiden Institute for Brain and Cognition (LIBC), Leiden University, Leiden, 2300 RC The Netherlands
- Donatella Fiacchino
- Department of Cognitive Psychology, Institute of Psychology, Leiden University, Leiden, 2333 AK The Netherlands
- Milica Nikolić
- Department of Cognitive Psychology, Institute of Psychology, Leiden University, Leiden, 2333 AK The Netherlands
- Leiden Institute for Brain and Cognition (LIBC), Leiden University, Leiden, 2300 RC The Netherlands
- Research Institute of Child Development and Education, University of Amsterdam, Amsterdam, 1018 WS The Netherlands
- Henk van Steenbergen
- Department of Cognitive Psychology, Institute of Psychology, Leiden University, Leiden, 2333 AK The Netherlands
- Leiden Institute for Brain and Cognition (LIBC), Leiden University, Leiden, 2300 RC The Netherlands
- Mariska E. Kret
- Department of Cognitive Psychology, Institute of Psychology, Leiden University, Leiden, 2333 AK The Netherlands
- Leiden Institute for Brain and Cognition (LIBC), Leiden University, Leiden, 2300 RC The Netherlands
|
14
|
Mattavelli S, Brambilla M, Kret ME. It Is Written in the Eyes: Inferences From Pupil Size and Gaze Orientation Shape Interpersonal Liking. Soc Cogn 2022. [DOI: 10.1521/soco.2022.40.1.88] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022] Open
Abstract
Research has shown that pupil size shapes interpersonal impressions: Individuals with dilated pupils tend to be perceived more positively than those with constricted pupils. Untested so far is the role of cognitive processes in shaping the effects of pupil size. Two preregistered studies investigated whether the effect of pupil size was qualified by the partner's attention allocation, inferred from gaze orientation. In Experiment 1 (N = 50), partners with dilated pupils were more liked when gazing toward the participant, but less liked when gazing toward a disliked other. Experiment 2 (N = 50) unveiled the underlying mechanism of the pupil-gaze interplay. Pupillary changes led to inferences about the feelings held by the partner toward the gazed-at target: Larger pupils signaled positive feelings. Crucially, target identity moderated the response of the participants (i.e., liking toward the partner). This work shows the importance of considering the interplay of affective and cognitive eye signals when studying person perception.
|
15
|
Lange EB, Fünderich J, Grimm H. Multisensory integration of musical emotion perception in singing. Psychol Res 2022; 86:2099-2114. [PMID: 35001181 PMCID: PMC9470688 DOI: 10.1007/s00426-021-01637-9] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/15/2021] [Accepted: 12/16/2021] [Indexed: 11/25/2022]
Abstract
We investigated how visual and auditory information contribute to emotion communication during singing. Classically trained singers applied two different facial expressions (expressive/suppressed) to pieces from their song and opera repertoire. Recordings of the singers were evaluated by laypersons or experts, presented to them in three different modes: auditory, visual, and audio–visual. A manipulation check confirmed that the singers succeeded in manipulating the face while keeping the sound highly expressive. Analyses focused on whether the visual difference or the auditory concordance between the two versions determined perception of the audio–visual stimuli. When evaluating expressive intensity or emotional content, a clear effect of visual dominance emerged. Experts made more use of the visual cues than laypersons. Consistency measures between uni-modal and multimodal presentations did not explain the visual dominance. The evaluation of seriousness served as a control: The uni-modal stimuli were rated as expected, but multisensory evaluations converged without visual dominance. Our study demonstrates that long-term knowledge and task context affect multisensory integration. Even though singers’ orofacial movements are dominated by sound production, their facial expressions can communicate emotions composed into the music, and observers do not simply rely on the audio information instead. Studies such as ours are important for understanding multisensory integration in applied settings.
Affiliation(s)
- Elke B Lange
- Department of Music, Max Planck Institute for Empirical Aesthetics (MPIEA), Grüneburgweg 14, 60322, Frankfurt/M., Germany.
- Jens Fünderich
- Department of Music, Max Planck Institute for Empirical Aesthetics (MPIEA), Grüneburgweg 14, 60322, Frankfurt/M., Germany
- University of Erfurt, Erfurt, Germany
- Hartmut Grimm
- Department of Music, Max Planck Institute for Empirical Aesthetics (MPIEA), Grüneburgweg 14, 60322, Frankfurt/M., Germany
|
16
|
All that meets the eye: The contribution of reward processing and pupil mimicry on pupillary reactions to facial trustworthiness. Curr Psychol 2021. [DOI: 10.1007/s12144-021-02486-w] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
|
17
|
Aktar E, Nimphy CA, Kret ME, Pérez‐Edgar K, Bögels SM, Raijmakers MEJ. Pupil responses to dynamic negative facial expressions of emotion in infants and parents. Dev Psychobiol 2021; 63:e22190. [PMID: 34674251 PMCID: PMC9291579 DOI: 10.1002/dev.22190] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/27/2020] [Revised: 07/24/2021] [Accepted: 08/01/2021] [Indexed: 11/09/2022]
Abstract
Observing others' emotions triggers physiological arousal in infants as well as in adults, reflected in dilated pupil sizes. This study is the first to examine parents' and infants' pupil responses to dynamic negative emotional facial expressions. Moreover, the links between pupil responses and negative emotional dispositions were explored among infants and parents. Infants' and one of their parent's pupil responses to negative versus neutral faces were measured via eye tracking in 222 infants (5- to 7-month-olds, n = 77, 11- to 13-month-olds, n = 78, and 17- to 19-month-olds, n = 67) and 229 parents. One parent contributed to the pupil data, whereas both parents were invited to fill in questionnaires on their own and their infant's negative emotional dispositions. Infants did not differentially respond to negative expressions, while parents showed stronger pupil responses to negative versus neutral expressions. There was a positive association between infants' and their parent's mean pupil responses and significant links between mothers' and fathers' stress levels and their infants' pupil responses. We conclude that a direct association between pupil responses in parents and offspring is observable already in infancy in typical development. Stress in parents is related to their infants' pupillary arousal to negative emotions.
Affiliation(s)
- Evin Aktar
- Department of Psychology, Clinical Psychology Unit, Leiden University, Leiden, Netherlands
- Leiden Institute for Brain and Cognition (LIBC), Leiden University, Leiden, Netherlands
- Department of Child Development and Education, University of Amsterdam, Amsterdam, The Netherlands
- Cosima A. Nimphy
- Department of Psychology, Clinical Psychology Unit, Leiden University, Leiden, Netherlands
- Leiden Institute for Brain and Cognition (LIBC), Leiden University, Leiden, Netherlands
- Mariska E. Kret
- Leiden Institute for Brain and Cognition (LIBC), Leiden University, Leiden, Netherlands
- Department of Psychology, Cognitive Psychology Unit, Leiden University, Leiden, Netherlands
- Koraly Pérez‐Edgar
- Department of Psychology, Child Study Center, The Pennsylvania State University, Pennsylvania, USA
- Susan M. Bögels
- Department of Child Development and Education, University of Amsterdam, Amsterdam, The Netherlands
- Department of Psychology, Developmental Psychology, University of Amsterdam, Amsterdam, Netherlands
- Maartje E. J. Raijmakers
- Department of Psychology, Developmental Psychology, University of Amsterdam, Amsterdam, Netherlands
- Department of Educational Studies, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
|
18
|
Emotional Processing and Experience in Amyotrophic Lateral Sclerosis: A Systematic and Critical Review. Brain Sci 2021; 11:1356. [PMID: 34679420 PMCID: PMC8534224 DOI: 10.3390/brainsci11101356] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/12/2021] [Revised: 10/10/2021] [Accepted: 10/12/2021] [Indexed: 11/16/2022] Open
Abstract
Even though a growing literature describes changes in emotional processing in Amyotrophic Lateral Sclerosis (ALS), efforts to summarize the relevant findings have been lacking in the field. A systematic literature review was performed to provide a critical and up-to-date account of emotional abilities in ALS. References were identified by searches of PubMed, Web of Science and Scopus (1980–2021, English literature), with the following key terms: (“Amyotrophic Lateral Sclerosis” or “Primary Lateral Sclerosis” or “Motor Neuron”) and “Emotion*” and (“Processing” or “Attribution” or “Elaboration” or “Perception” or “Recognition”). Studies concerning only caregivers, pseudobulbar affect, and social cognition were excluded. Forty-one articles were included, all concerning ALS, and seven topics were identified: Emotion recognition, Emotional responsiveness, Emotional reactivity, Faces approachability rating, Valence rating, Memory for emotional materials and Alexithymia. The majority of these aspects have only been sparsely addressed. The evidence confirms altered emotional processing in ALS. The most consistent findings concern the recognition of facial expressions of negative emotions, but alterations in subjective responsiveness to emotional stimuli (arousal, valence and approachability), in psychophysiological and cerebral reactivity, and in emotional memory, together with alexithymia traits, were also reported. Given this evidence, emotional abilities should be included in clinical assessment and therapeutic interventions.
|
19
|
Wang X, Han S. Processing of facial expressions of same-race and other-race faces: distinct and shared neural underpinnings. Soc Cogn Affect Neurosci 2021; 16:576-592. [PMID: 33624818 PMCID: PMC8138088 DOI: 10.1093/scan/nsab027] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/04/2020] [Revised: 12/23/2020] [Accepted: 02/24/2021] [Indexed: 11/29/2022] Open
Abstract
People understand others’ emotions quickly from their facial expressions. However, facial expressions of ingroup and outgroup members may signal different social information and thus be mediated by distinct neural activities. We investigated whether there are distinct neuronal responses to fearful and happy expressions of same-race (SR) and other-race (OR) faces. We recorded electroencephalogram from Chinese adults when viewing an adaptor face (with fearful/neutral expressions in Experiment 1 but happy/neutral expressions in Experiment 2) and a target face (with fearful expressions in Experiment 1 but happy expressions in Experiment 2) presented in rapid succession. We found that both fearful and happy (vs neutral) adaptor faces increased the amplitude of a frontocentral positivity (P2). However, a fearful but not happy (vs neutral) adaptor face decreased the P2 amplitudes to target faces, and this repetition suppression (RS) effect occurred when adaptor and target faces were of the same race but not when of different races. RS was observed on two late parietal/central positive activities to fearful/happy target faces, which, however, occurred regardless of whether adaptor and target faces were of the same or different races. Our findings suggest that early affective processing of fearful expressions may engage distinct neural activities for SR and OR faces.
Affiliation(s)
- Xuena Wang
- School of Psychological and Cognitive Sciences, PKU-IDG/McGovern Institute for Brain Research, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100080, China
- Shihui Han
- School of Psychological and Cognitive Sciences, PKU-IDG/McGovern Institute for Brain Research, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100080, China
|
20
|
Li X. Recognition Characteristics of Facial and Bodily Expressions: Evidence From ERPs. Front Psychol 2021; 12:680959. [PMID: 34290653 PMCID: PMC8287205 DOI: 10.3389/fpsyg.2021.680959] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/25/2021] [Accepted: 06/07/2021] [Indexed: 11/24/2022] Open
Abstract
In the natural environment, facial and bodily expressions influence each other. Previous research has shown that bodily expressions significantly influence the perception of facial expressions. However, little is known about the cognitive processing of facial and bodily emotional expressions and its temporal characteristics. Therefore, this study presented facial and bodily expressions, both separately and together, to examine the electrophysiological mechanisms of emotion recognition using event-related potentials (ERPs). Participants assessed the emotions of facial and bodily expressions that varied by valence (positive/negative) and consistency (matching/non-matching emotions). The results showed that bodily expressions induced a more positive P1 component and a shortened latency, whereas facial expressions triggered a more negative N170 and a prolonged latency. Of the N2 and P3 components, N2 was more sensitive to inconsistent emotional information and P3 was more sensitive to consistent emotional information. The cognitive processing of facial and bodily expressions had distinctive integrating features, with the interaction occurring at an early stage (N170). The results of the study highlight the importance of facial and bodily expressions in the cognitive processing of emotion recognition.
Affiliation(s)
- Xiaoxiao Li
- Academy of Psychology and Behavior, Tianjin Normal University, Tianjin, China
|
21
|
Kret ME, van Berlo E. Attentional Bias in Humans Toward Human and Bonobo Expressions of Emotion. Evol Psychol 2021; 19:14747049211032816. [PMID: 34318723 PMCID: PMC10358346 DOI: 10.1177/14747049211032816] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/23/2020] [Accepted: 06/29/2021] [Indexed: 11/16/2022] Open
Abstract
Correctly recognizing and efficiently attending to emotional situations are highly valuable skills for social species such as humans and bonobos, humans' closest living relatives. In the current study, we investigated whether humans perceive a range of emotional situations differently when these involved other humans compared to bonobos. A large group of children and adults participated in an emotion perception task and rated scenes showing either bonobos or humans in situations depicting distressed or aggressive behavior, yawning, scratching, grooming, playing, sex scenes or neutral situations. A new group of people performed a dot-probe task to assess attentional biases toward these materials. The main finding is that humans perceive emotional scenes showing people similarly to emotional scenes showing bonobos, a result reflecting a shared evolutionary origin of emotional expressions. Other results show that children interpreted bonobos' bared-teeth displays as a positive signal. This signal is related to the human smile, but is frequently seen in distressed situations, as was the case in the current experiment. Children may still need to learn to use contextual cues when judging an ambiguous expression as positive or negative. Further, the sex scenes were rated very positively, especially by male participants. Even though men rated these more positively than women did, their attention was captured similarly, surpassing all other emotion categories. Finally, humans' attention was captured more by human yawns than by bonobo yawns, which may be related to the highly contagious nature of yawns, especially when shown by close others. The current research adds to earlier work showing morphological, behavioral and genetic parallels between humans and bonobos by showing that their emotional expressions have a common origin too.
Affiliation(s)
- Mariska E. Kret
- Leiden University, Institute of Psychology, Cognitive Psychology Unit, CoPAN lab, Wassenaarseweg, Leiden, Zuid-Holland, The Netherlands
- Evy van Berlo
- Leiden University, Institute of Psychology, Cognitive Psychology Unit, CoPAN lab, Wassenaarseweg, Leiden, Zuid-Holland, The Netherlands
|
22
|
Kret ME, Maitner AT, Fischer AH. Interpreting Emotions From Women With Covered Faces: A Comparison Between a Middle Eastern and Western-European Sample. Front Psychol 2021; 12:620632. [PMID: 34025499 PMCID: PMC8137903 DOI: 10.3389/fpsyg.2021.620632] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/23/2020] [Accepted: 04/01/2021] [Indexed: 11/13/2022] Open
Abstract
While new regulations obligate or recommend people to wear medical masks in public places to prevent further spread of the Covid-19 virus, there are still open questions as to what face coverage does to social-emotional communication. Previous research on the effects of wearing veils or face-covering niqabs showed that covering of the mouth led to the attribution of negative emotions and to the perception of less intense positive emotions. The current study compares a sample from the Netherlands with a sample from the United Arab Emirates on their perception of emotions from faces covered by a niqab, covered by censoring black bars, or uncovered. The results show that covering the mouth area leads to greater anxiety in participants in both countries. Furthermore, although participants did not report greater decoding difficulties for faces that were covered as compared to fully visible, results show that face coverage did influence emotion perception. Specifically, happiness and anger were perceived as being less intense. Further, face coverage by a niqab, as compared to black bars, yielded lower emotional intensity ratings. We conclude that face coverage in particular can modulate the perception of emotions, but that affective contextual cues may play a role as well.
Affiliation(s)
- Mariska E Kret
- Cognitive Psychology Unit, Institute of Psychology, Leiden University, Leiden, Netherlands
- Leiden Institute for Brain and Cognition (LIBC), Leiden, Netherlands
- Angela T Maitner
- Department of International Studies, American University of Sharjah, Sharjah, United Arab Emirates
- Agneta H Fischer
- Social Psychology Department, University of Amsterdam, Amsterdam, Netherlands
|
23
|
Mazzoni N, Ricciardelli P, Actis-Grosso R, Venuti P. Difficulties in Recognising Dynamic but not Static Emotional Body Movements in Autism Spectrum Disorder. J Autism Dev Disord 2021; 52:1092-1105. [PMID: 33866488 PMCID: PMC8854267 DOI: 10.1007/s10803-021-05015-7] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 04/06/2021] [Indexed: 01/03/2023]
Abstract
In this study, we investigated whether difficulties in body motion (BM) perception may lead to a deficit in emotion recognition in Autism Spectrum Disorder (ASD). To this aim, individuals with high-functioning ASD were asked to recognise fearful, happy, and neutral BM depicted as static images or as dynamic point-light and full-light displays. Results showed slower response times in participants with ASD only when recognising dynamic stimuli, but no group differences in accuracy. This suggests that (i) a deficit in the action chaining mechanism in ASD may prevent the automatic and rapid recognition of dynamic BM, and (ii) individuals with ASD and high cognitive resources can develop alternative, but equally successful, strategies to recognise emotional body expressions. Implications for treatment are discussed.
Affiliation(s)
- Noemi Mazzoni
- OFDLab - Department of Psychology and Cognitive Science, University of Trento, Via Matteo del Ben, 5B, 38068 Rovereto, Italy
- Paola Ricciardelli
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
- Milan Centre for Neuroscience, University of Milano-Bicocca, Piazza dell’Ateneo Nuovo 1, 20126 Milan, Italy
- Rossana Actis-Grosso
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
- Milan Centre for Neuroscience, University of Milano-Bicocca, Piazza dell’Ateneo Nuovo 1, 20126 Milan, Italy
- Paola Venuti
- OFDLab - Department of Psychology and Cognitive Science, University of Trento, Via Matteo del Ben, 5B, 38068 Rovereto, Italy
|
24
|
Durbin KA, Rastegar S, Knight BG. Effects of age and mood on emotional face processing differ depending on the intensity of the facial expression. Aging Neuropsychol Cogn 2020; 27:902-917. [PMID: 31809671 PMCID: PMC7274884 DOI: 10.1080/13825585.2019.1700900] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/04/2019] [Accepted: 11/27/2019] [Indexed: 10/25/2022]
Abstract
Research suggests that mood can moderate age differences in recognizing facial emotion. In this study, we examined how an anxious versus calm mood state affected younger and older adults' processing of emotional faces. Older adults had greater difficulty identifying negative emotions, particularly when emotions were displayed at a low intensity level. However, an anxious mood did not affect age differences in emotional face recognition. In contrast, age, emotional intensity, and current mood state all affected the perceived intensity of emotion. The effects of age and mood on perceived emotional intensity were only observed for low intensity facial expressions. When induced into an anxious mood, younger adults perceived threatening emotions (i.e., fear, anger) as more emotionally intense, whereas older adults perceived anger and happiness to be more intense. These findings emphasize the need to consider both internal and external factors when investigating the effects of age on emotional face processing.
Affiliation(s)
- Sarah Rastegar
- Department of Psychology, University of Southern California
- Bob G. Knight
- Department of Psychology, University of Southern California
- School of Psychology and Counseling, University of Southern Queensland
|
25
|
Visual exploration of emotional body language: a behavioural and eye-tracking study. Psychol Res 2020; 85:2326-2339. [PMID: 32920675 DOI: 10.1007/s00426-020-01416-y] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/24/2020] [Accepted: 09/01/2020] [Indexed: 10/23/2022]
Abstract
Bodily postures are essential to correctly comprehend others' emotions and intentions. Nonetheless, very few studies have focused on the pattern of eye movements implicated in the recognition of emotional body language (EBL), although those that have demonstrate significant differences between emotions. A yet unanswered question regards the presence of the "left-gaze bias" (i.e. the tendency to look first, to make more fixations and to spend more looking time on the left side of centrally presented stimuli) while scanning bodies. Hence, the present study aims at exploring both the presence of a left-gaze bias and the modulation of EBL visual exploration mechanisms, by investigating the fixation patterns (number of fixations and latency of the first fixation) of participants while they judged the emotional intensity of static bodily postures (Angry, Happy and Neutral, without the head). While results on the latency of first fixations demonstrate for the first time the presence of the left-gaze bias while scanning bodies, suggesting that it could be related to the stronger expressiveness of the left hand (from the observer's point of view), results on the number of fixations only partially fulfil our hypothesis. Moreover, an opposite viewing pattern between Angry and Happy bodily postures emerged. In sum, the present results, by integrating the spatial and temporal dimensions of gaze exploration patterns, shed new light on EBL visual exploration mechanisms.
|
26
|
Borgomaneri S, Vitale F, Avenanti A. Early motor reactivity to observed human body postures is affected by body expression, not gender. Neuropsychologia 2020; 146:107541. [DOI: 10.1016/j.neuropsychologia.2020.107541] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/09/2020] [Revised: 06/06/2020] [Accepted: 06/19/2020] [Indexed: 12/30/2022]
|
27
|
Woody ML, Vaughn-Coaxum RA, Siegle GJ, Price RB. Time course of pupillary response to threat words before and after attention bias modification for transdiagnostic anxiety disorders: A randomized controlled trial. Brain Behav 2020; 10:e01664. [PMID: 32633901 PMCID: PMC7428474 DOI: 10.1002/brb3.1664] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/27/2020] [Revised: 04/06/2020] [Accepted: 04/08/2020] [Indexed: 11/10/2022] Open
Abstract
INTRODUCTION: Altered attention to threatening stimuli at initial and sustained stages of processing may be dissociable dimensions that influence the development and maintenance of transdiagnostic symptoms of anxiety, such as vigilance, and possibly require distinct intervention. Attention bias modification (ABM) interventions were created to implicitly train attention away from threatening stimuli and have shown efficacy in treating anxiety. ABM alters neurocognitive functioning during initial stages of threat processing, but less is known regarding effects of ABM on neural indices of threat processing at sustained (i.e., intermediate and late) stages, or if ABM-related neural changes relate to symptom response. The current study utilized pupillary response as a temporally sensitive and cost-effective peripheral marker of neurocognitive response to ABM.
MATERIALS AND METHODS: In a randomized controlled trial, 79 patients with transdiagnostic anxiety provided baseline data, 70 were randomized to receive eight sessions of twice-weekly ABM (n = 49) or sham training (n = 21), and 65 completed their assigned treatment condition and returned for post-training assessment.
RESULTS: Among ABM, but not sham, patients, pupillary response to threat words during initial and intermediate stages decreased from pre- to post-training. Pre- to post-training reductions in intermediate and late pupillary response to threat were positively correlated with reductions in patient-reported vigilance among ABM, but not sham, patients.
CONCLUSIONS: All measured stages of threat processing had relevance in understanding the neural mechanisms of ABM, with overlapping yet dissociable roles exhibited within a single neurophysiological marker across an initial-intermediate-late time continuum. Pupillometry may be well suited to measure both target engagement and treatment outcome following ABM.
Affiliation(s)
- Mary L Woody
- Department of Psychiatry, University of Pittsburgh, Pittsburgh, PA, USA
- Greg J Siegle
- Department of Psychiatry, University of Pittsburgh, Pittsburgh, PA, USA
- Rebecca B Price
- Department of Psychiatry, University of Pittsburgh, Pittsburgh, PA, USA
28
Emotional expressions in human and non-human great apes. Neurosci Biobehav Rev 2020; 115:378-395. [DOI: 10.1016/j.neubiorev.2020.01.027] [Citation(s) in RCA: 26] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/01/2019] [Revised: 01/17/2020] [Accepted: 01/22/2020] [Indexed: 11/23/2022]
29
Pazhoohi F, Grammer K, Macedo AF, Arantes J. The effect of women’s leg posture on gazing behavior and perceived attractiveness. CURRENT PSYCHOLOGY 2020. [DOI: 10.1007/s12144-018-9821-y] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
30
Williams JHG, Huggins CF, Zupan B, Willis M, Van Rheenen TE, Sato W, Palermo R, Ortner C, Krippl M, Kret M, Dickson JM, Li CSR, Lowe L. A sensorimotor control framework for understanding emotional communication and regulation. Neurosci Biobehav Rev 2020; 112:503-518. [PMID: 32070695 PMCID: PMC7505116 DOI: 10.1016/j.neubiorev.2020.02.014] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/21/2018] [Revised: 01/22/2020] [Accepted: 02/11/2020] [Indexed: 12/12/2022]
Abstract
Our research team was asked to consider the relationship of the neuroscience of sensorimotor control to the language of emotions and feelings. Actions are the principal means for the communication of emotions and feelings in both humans and other animals, and the allostatic mechanisms controlling action also apply to the regulation of emotional states by the self and others. We consider how motor control of hierarchically organised, feedback-based, goal-directed action has evolved in humans, within a context of consciousness, appraisal and cultural learning, to serve emotions and feelings. In our linguistic analysis, we found that many emotion and feelings words could be assigned to stages in the sensorimotor learning process, but the assignment was often arbitrary. The embodied nature of emotional communication means that action words are frequently used, but that the meanings or senses of the word depend on its contextual use, just as the relationship of an action to an emotion is also contextually dependent.
Affiliation(s)
- Justin H G Williams
- University of Aberdeen, Institute of Medical Sciences, Foresterhill, AB25 2ZD, Scotland, United Kingdom
- Charlotte F Huggins
- University of Aberdeen, Institute of Medical Sciences, Foresterhill, AB25 2ZD, Scotland, United Kingdom
- Barbra Zupan
- Central Queensland University, School of Health, Medical and Applied Sciences, Bruce Highway, Rockhampton, QLD 4702, Australia
- Megan Willis
- Australian Catholic University, School of Psychology, ARC Centre for Excellence in Cognition and its Disorders, Sydney, NSW 2060, Australia
- Tamsyn E Van Rheenen
- University of Melbourne, Melbourne Neuropsychiatry Centre, Department of Psychiatry, 161 Barry Street, Carlton, VIC 3053, Australia
- Wataru Sato
- Kyoto University, Kokoro Research Centre, 46 Yoshidashimoadachicho, Sakyo Ward, Kyoto, 606-8501, Japan
- Romina Palermo
- University of Western Australia, School of Psychological Science, Perth, WA, 6009, Australia
- Catherine Ortner
- Thompson Rivers University, Department of Psychology, 805 TRU Way, Kamloops, BC V2C 0C8, Canada
- Martin Krippl
- Otto von Guericke University Magdeburg, Faculty of Natural Sciences, Department of Psychology, Universitätsplatz 2, Magdeburg, 39106, Germany
- Mariska Kret
- Leiden University, Cognitive Psychology, Pieter de la Court, Waassenaarseweg 52, Leiden, 2333 AK, the Netherlands
- Joanne M Dickson
- Edith Cowan University, Psychology Department, School of Arts and Humanities, 270 Joondalup Dr, Joondalup, WA 6027, Australia
- Chiang-Shan R Li
- Yale University, Connecticut Mental Health Centre, S112, 34 Park Street, New Haven, CT 06519-1109, USA
- Leroy Lowe
- Neuroqualia, Room 229A, Forrester Hall, 36 Arthur Street, Truro, Nova Scotia, B2N 1X5, Canada
31
Ross P, Atkinson AP. Expanding Simulation Models of Emotional Understanding: The Case for Different Modalities, Body-State Simulation Prominence, and Developmental Trajectories. Front Psychol 2020; 11:309. [PMID: 32194476 PMCID: PMC7063097 DOI: 10.3389/fpsyg.2020.00309] [Citation(s) in RCA: 22] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/18/2019] [Accepted: 02/10/2020] [Indexed: 12/14/2022] Open
Abstract
Recent models of emotion recognition suggest that when people perceive an emotional expression, they partially activate the respective emotion in themselves, providing a basis for the recognition of that emotion. Much of the focus of these models and of their evidential basis has been on sensorimotor simulation as a basis for facial expression recognition - the idea, in short, that coming to know what another feels involves simulating in your brain the motor plans and associated sensory representations engaged by the other person's brain in producing the facial expression that you see. In this review article, we argue that simulation accounts of emotion recognition would benefit from three key extensions. First, that fuller consideration be given to simulation of bodily and vocal expressions, given that the body and voice are also important expressive channels for providing cues to another's emotional state. Second, that simulation of other aspects of the perceived emotional state, such as changes in the autonomic nervous system and viscera, might have a more prominent role in underpinning emotion recognition than is typically proposed. Sensorimotor simulation models tend to relegate such body-state simulation to a subsidiary role, despite the plausibility of body-state simulation being able to underpin emotion recognition in the absence of typical sensorimotor simulation. Third, that simulation models of emotion recognition be extended to address how embodied processes and emotion recognition abilities develop through the lifespan. It is not currently clear how this system of sensorimotor and body-state simulation develops and in particular how this affects the development of emotion recognition ability. We review recent findings from the emotional body recognition literature and integrate recent evidence regarding the development of mimicry and interoception to significantly expand simulation models of emotion recognition.
Affiliation(s)
- Paddy Ross
- Department of Psychology, Durham University, Durham, United Kingdom
32
Hajdúk M, Klein HS, Bass EL, Springfield CR, Pinkham AE. Implicit and explicit processing of bodily emotions in schizophrenia. Cogn Neuropsychiatry 2020; 25:139-153. [PMID: 31870213 DOI: 10.1080/13546805.2019.1706465] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 02/08/2023]
Abstract
INTRODUCTION Disturbed emotion processing is well documented in schizophrenia, but the majority of studies evaluate processing of emotion only from facial expressions. Social cues are also communicated via body posture, and they are similarly relevant for successful social interactions. The aim of the current study was to thoroughly examine body perception abilities in individuals with schizophrenia. METHODS Fifty-nine patients with schizophrenia and 37 healthy controls completed two tasks of body processing. The first, which was based on the Affect Misattribution Procedure, evaluated implicit processing of bodily emotions, and the second utilised a traditional emotion identification paradigm to assess explicit emotion recognition. RESULTS Results revealed aberrant implicit processing, but more normative explicit processing, in individuals with schizophrenia. Moderate associations were found between processing of bodies and symptoms of paranoia. Performance on the tasks was not related to cognitive functioning but was associated with clinician-rated social functioning. CONCLUSIONS Collectively, these results provide information about disturbed processing of bodily emotions in schizophrenia and suggest that these disturbances are associated with the severity of positive symptoms and predict difficulties in everyday social activities and interpersonal relationships.
Affiliation(s)
- Michal Hajdúk
- Department of Psychology, Faculty of Arts, Comenius University Bratislava, Slovak Republic; Clinic of Psychiatry, Faculty of Medicine, Comenius University, Bratislava, Slovak Republic; Center for Psychiatric Disorders Research - UK, Science Park, Comenius University in Bratislava, Slovak Republic
- Hans S Klein
- School of Behavioral and Brain Sciences, The University of Texas at Dallas, Richardson, TX, USA
- Emily L Bass
- School of Behavioral and Brain Sciences, The University of Texas at Dallas, Richardson, TX, USA
- Cassi R Springfield
- School of Behavioral and Brain Sciences, The University of Texas at Dallas, Richardson, TX, USA
- Amy E Pinkham
- School of Behavioral and Brain Sciences, The University of Texas at Dallas, Richardson, TX, USA; Department of Psychiatry, University of Texas Southwestern Medical School, Dallas, TX, USA
33
Kulke L, Feyerabend D, Schacht A. A Comparison of the Affectiva iMotions Facial Expression Analysis Software With EMG for Identifying Facial Expressions of Emotion. Front Psychol 2020; 11:329. [PMID: 32184749 PMCID: PMC7058682 DOI: 10.3389/fpsyg.2020.00329] [Citation(s) in RCA: 53] [Impact Index Per Article: 13.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/21/2019] [Accepted: 02/11/2020] [Indexed: 11/13/2022] Open
Abstract
Human faces express emotions, informing others about their affective states. In order to measure expressions of emotion, facial electromyography (EMG) has been widely used, requiring electrodes and technical equipment. More recently, emotion recognition software has been developed that detects emotions from video recordings of human faces. However, its validity and comparability to EMG measures are unclear. The aim of the current study was to compare the Affectiva Affdex emotion recognition software by iMotions with EMG measurements of the zygomaticus major and corrugator supercilii muscles, concerning its ability to identify happy, angry and neutral faces. Twenty participants imitated these facial expressions while videos and EMG were recorded. Happy and angry expressions were detected by both the software and EMG above chance, while neutral expressions were more often falsely identified as negative by EMG compared to the software. Overall, EMG and software values correlated highly. In conclusion, the Affectiva Affdex software can identify facial expressions, and its results are comparable to EMG findings.
Affiliation(s)
- Louisa Kulke
- Affective Neuroscience and Psychophysiology Laboratory, University of Göttingen, Göttingen, Germany
- Leibniz ScienceCampus Primate Cognition, Göttingen, Germany
- Dennis Feyerabend
- Affective Neuroscience and Psychophysiology Laboratory, University of Göttingen, Göttingen, Germany
- Annekathrin Schacht
- Affective Neuroscience and Psychophysiology Laboratory, University of Göttingen, Göttingen, Germany
- Leibniz ScienceCampus Primate Cognition, Göttingen, Germany
34
Blocking facial mimicry affects recognition of facial and body expressions. PLoS One 2020; 15:e0229364. [PMID: 32078668 PMCID: PMC7032686 DOI: 10.1371/journal.pone.0229364] [Citation(s) in RCA: 44] [Impact Index Per Article: 11.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/25/2019] [Accepted: 02/04/2020] [Indexed: 11/20/2022] Open
Abstract
Facial mimicry is commonly defined as the tendency to imitate, at a sub-threshold level, the facial expressions of other individuals. Numerous studies support a role of facial mimicry in recognizing others' emotions. However, the underlying functional mechanism is unclear. A prominent hypothesis considers facial mimicry as based on an action-perception loop, leading to the prediction that facial mimicry should be observed only when processing others' facial expressions. Nevertheless, previous studies have also detected facial mimicry during observation of emotional bodily expressions. An emergent alternative hypothesis is that facial mimicry overtly reflects the simulation of an "emotion", rather than the reproduction of a specific observed motor pattern. In the present study, we tested whether blocking mimicry ("Bite") on the lower face disrupted recognition of happy expressions conveyed by either facial or body expressions. In Experiment 1, we tested participants' ability to identify happy, fearful and neutral expressions in the Bite condition and in two control conditions. In Experiment 2, to ensure that such a manipulation selectively affects emotion recognition, we tested participants' ability to recognize emotional expressions, as well as the actors' gender, under the Bite condition and a control condition. Finally, we investigated the relationship between dispositional empathy and emotion recognition under the condition of blocked mimicry. Our findings demonstrated that blocking mimicry on the lower face hindered recognition of happy facial and body expressions, while the recognition of neutral and fearful expressions was not affected by the mimicry manipulation. The mimicry manipulation did not affect the gender discrimination task. Furthermore, the impairment of happy expression recognition correlated with empathic traits. These results support the role of facial mimicry in emotion recognition and suggest that facial mimicry reflects a global sensorimotor simulation of others' emotions rather than a muscle-specific reproduction of an observed motor expression.
35
Pazhoohi F, Arantes J, Kingstone A, Pinal D. Becoming sexy: Contrapposto pose increases attractiveness ratings and modulates observers' brain activity. Biol Psychol 2020; 151:107842. [PMID: 31958547 DOI: 10.1016/j.biopsycho.2020.107842] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/01/2019] [Revised: 11/10/2019] [Accepted: 01/10/2020] [Indexed: 10/25/2022]
Abstract
Previous neurophysiological studies have revealed the neural correlates of human body form perception, as well as those related to the perception of attractive body sizes. In the current study we aimed to extend the neurophysiological studies regarding body perception by investigating the perception of human body posture to provide insights into the cognitive mechanisms responsive to bodily form, and the processing of its attractiveness. To achieve these aims, we used the contrapposto posture, which creates an exaggeration of a low waist-to-hip ratio (WHR), an indicator of women's attractiveness. Electroencephalogram (EEG) signals were recorded while participants completed both (i) an oddball task presenting female body forms differing in pose (contrapposto vs. standing) and viewing angle (anterior vs. posterior), and (ii) a subsequent active attractiveness judgement task. Behavioral results showed that a contrapposto pose is considered more attractive than a neutral standing pose. Results at the neural level showed that body posture modulates visual information processing in early ERP components, indicating attentional variations depending on human body posture, as well as in late components, indicating further differences in attention and attractiveness judgement of stimuli varying in body pose. Furthermore, the LORETA results identified the middle temporal gyrus as well as the angular gyrus as the key brain regions activated in association with the perception and attractiveness judgment of female bodies with different body poses. Overall, the current paper suggests an evolutionarily adaptive preference for lower WHRs, as in the contrapposto pose, which activates brain regions associated with visual perception and attractiveness judgement.
Affiliation(s)
- Farid Pazhoohi
- Department of Psychology, University of British Columbia, 2136 West Mall, Vancouver, British Columbia, V6T 1Z4, Canada
- Joana Arantes
- Department of Basic Psychology, School of Psychology, University of Minho, Braga, Portugal
- Alan Kingstone
- Department of Psychology, University of British Columbia, 2136 West Mall, Vancouver, British Columbia, V6T 1Z4, Canada
- Diego Pinal
- Psychological Neuroscience Lab, CIPsi, School of Psychology, University of Minho, Braga, Portugal
36
Krejtz I, Krejtz K, Wisiecka K, Abramczyk M, Olszanowski M, Duchowski AT. Attention Dynamics During Emotion Recognition by Deaf and Hearing Individuals. JOURNAL OF DEAF STUDIES AND DEAF EDUCATION 2020; 25:10-21. [PMID: 31665493 DOI: 10.1093/deafed/enz036] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/10/2018] [Revised: 07/11/2019] [Accepted: 08/01/2019] [Indexed: 06/10/2023]
Abstract
The enhancement hypothesis suggests that deaf individuals are more vigilant to visual emotional cues than hearing individuals. The present eye-tracking study examined ambient-focal visual attention when encoding affect from dynamically changing emotional facial expressions. Deaf (n = 17) and hearing (n = 17) individuals watched emotional facial expressions that in 10-s animations morphed from a neutral expression to one of happiness, sadness, or anger. The task was to recognize emotion as quickly as possible. Deaf participants tended to be faster than hearing participants in affect recognition, but the groups did not differ in accuracy. In general, happy faces were more accurately and more quickly recognized than faces expressing anger or sadness. Both groups demonstrated longer average fixation duration when recognizing happiness in comparison to anger and sadness. Deaf individuals directed their first fixations less often to the mouth region than the hearing group. During the last stages of emotion recognition, deaf participants exhibited more focal viewing of happy faces than negative faces. This pattern was not observed among hearing individuals. The analysis of visual gaze dynamics, switching between ambient and focal attention, was useful in studying the depth of cognitive processing of emotional information among deaf and hearing individuals.
Affiliation(s)
- Izabela Krejtz
- SWPS University of Social Sciences and Humanities, Chodakowska 19/31, Warsaw, Poland
- Krzysztof Krejtz
- SWPS University of Social Sciences and Humanities, Chodakowska 19/31, Warsaw, Poland
- Michał Olszanowski
- SWPS University of Social Sciences and Humanities, Chodakowska 19/31, Warsaw, Poland
37
Abstract
Pupillometry has been one of the most widely used response systems in psychophysiology. Changes in pupil size can reflect diverse cognitive and emotional states, ranging from arousal, interest and effort to social decisions, but they are also widely used in clinical practice to assess patients’ brain functioning. As a result, research involving pupil size measurements has been reported in practically all psychology, psychiatry, and psychophysiological research journals, and now it has found its way into the primatology literature as well as into more practical applications, such as using pupil size as a measure of fatigue or a safety index during driving. The different systems used for recording pupil size are almost as variable as its applications, and all yield, as with many measurement techniques, a substantial amount of noise in addition to the real pupillometry data. Before analyzing pupil size, it is therefore of crucial importance first to detect this noise and deal with it appropriately, even prior to (if need be) resampling and baseline-correcting the data. In this article we first provide a short review of the literature on pupil size measurements, then we highlight the most important sources of noise and show how these can be detected. Finally, we provide step-by-step guidelines that will help those interested in pupil size to preprocess their data correctly. These guidelines are accompanied by an open source MATLAB script (available at https://github.com/ElioS-S/pupil-size). Given that pupil diameter is easily measured by standard eyetracking technologies and can provide fundamental insights into cognitive and emotional processes, it is hoped that this article will further motivate scholars from different disciplines to study pupil size.
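The preprocessing sequence this abstract describes (detect noise, repair it, then baseline-correct) can be sketched compactly. The function below is an illustrative sketch only, not the authors' MATLAB implementation from the linked repository; the validity threshold, baseline window, and function name are assumptions for demonstration.

```python
import numpy as np

def preprocess_pupil(trace, times, baseline_window=(-0.2, 0.0), min_valid=0.5):
    """Clean one trial of pupil-size data.

    `trace` and `times` are equal-length 1-D arrays (pupil size in mm or
    arbitrary units, time in seconds relative to stimulus onset).
    """
    trace = np.asarray(trace, dtype=float)
    times = np.asarray(times, dtype=float)

    # 1. Flag noise: blinks and tracking dropouts typically show up as
    #    zeros or implausibly small values (threshold is illustrative).
    bad = trace <= min_valid
    if bad.all():
        raise ValueError("no valid samples in trace")

    # 2. Repair flagged samples by linear interpolation over the gaps.
    clean = trace.copy()
    clean[bad] = np.interp(times[bad], times[~bad], trace[~bad])

    # 3. Subtractive baseline correction: subtract the mean of the
    #    pre-stimulus window so trials become comparable.
    in_base = (times >= baseline_window[0]) & (times < baseline_window[1])
    return clean - clean[in_base].mean()
```

In practice one would add steps the guidelines also discuss, such as resampling to a common rate and rejecting trials with too many invalid samples, before averaging across trials.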
38
Zhang M, Liu T, Jin Y, He W, Huang Y, Luo W. The asynchronous influence of facial expressions on bodily expressions. Acta Psychol (Amst) 2019; 200:102941. [PMID: 31677428 DOI: 10.1016/j.actpsy.2019.102941] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/29/2018] [Revised: 08/19/2019] [Accepted: 09/20/2019] [Indexed: 10/25/2022] Open
Abstract
The ability to extract correct emotional information from facial and bodily expressions is fundamental for the development of social skills. Previous studies have shown that bodily expressions affect the recognition of basic facial expressions dramatically. However, few studies have considered the view that facial expressions may influence the recognition of bodily expressions. Further, previous studies have failed to consider a comprehensive set of emotional categories. The present study sought to examine whether facial expressions would impact the recognition of bodily expressions asynchronously, using four basic emotions. Participants performed an affective priming task, in which the priming stimuli included four facial expressions (happy, sad, fearful, and angry), and the target stimuli were bodily expressions matching the same emotions. The results indicated that the perception of affective facial expressions significantly influenced the accuracy and reaction time for body-based emotion categorization, particularly for bodily expression of happiness. The recognition accuracy of congruent expressions was higher, relative to that of incongruent expressions. The findings show that facial expressions influence the recognition of bodily expressions, despite the asynchrony.
39
Brambilla M, Biella M, Kret ME. The power of pupils in predicting conforming behavior. SOCIAL INFLUENCE 2019. [DOI: 10.1080/15534510.2019.1637775] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/26/2022]
Affiliation(s)
- Marco Brambilla
- Department of Psychology, University of Milano Bicocca, Milano, Italy
- Marco Biella
- Department of Psychology, University of Milano Bicocca, Milano, Italy
- Mariska E. Kret
- Department of Psychology, Leiden University, Leiden, Netherlands
40
Kamiloglu RG, Smeets MAM, de Groot JHB, Semin GR. Fear Odor Facilitates the Detection of Fear Expressions Over Other Negative Expressions. Chem Senses 2019; 43:419-426. [PMID: 29796589 DOI: 10.1093/chemse/bjy029] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/24/2022] Open
Abstract
In a double-blind experiment, participants were exposed to facial images of anger, disgust, fear, and neutral expressions under 2 body odor conditions: fear and neutral sweat. They had to indicate the valence of the gradually emerging facial image. Two alternative hypotheses were tested, namely a "general negative evaluative state" hypothesis and a "discrete emotion" hypothesis. These hypotheses suggest 2 distinctive data patterns for muscle activation and classification speed of facial expressions. The pattern of results that would support a "discrete emotions perspective" would be expected to reveal significantly increased activity in the medial frontalis (eyebrow raiser) and corrugator supercilii (frown) muscles associated with fear, and significantly decreased reaction times (RTs) to "only" fear faces in the fear odor condition. Conversely, a pattern of results characterized by only a significantly increased corrugator supercilii activity together with decreased RTs for fear, disgust, and anger faces in the fear odor condition would support an interpretation in line with a general negative evaluative state perspective. The data support the discrete emotion account for facial affect perception primed with fear odor. This study provides a first demonstration of perception of discrete negative facial expressions using olfactory priming.
Affiliation(s)
- Roza G Kamiloglu
- Department of Social, Health and Organizational Psychology, Faculty of Social and Behavioral Sciences, Utrecht University, Utrecht, The Netherlands
- Monique A M Smeets
- Department of Social, Health and Organizational Psychology, Faculty of Social and Behavioral Sciences, Utrecht University, Utrecht, The Netherlands
- Jasper H B de Groot
- Department of Social, Health and Organizational Psychology, Faculty of Social and Behavioral Sciences, Utrecht University, Utrecht, The Netherlands; Department of Neurology, University of Pennsylvania, Philadelphia, PA, USA
- Gün R Semin
- Department of Social, Health and Organizational Psychology, Faculty of Social and Behavioral Sciences, Utrecht University, Utrecht, The Netherlands; William James Center for Research, ISPA Instituto Universitário, Lisbon, Portugal
41
White H, Jubran R, Heck A, Chroust A, Bhatt RS. Sex-specific scanning in infancy: Developmental changes in the use of face/head and body information. J Exp Child Psychol 2019; 182:126-143. [PMID: 30825728 PMCID: PMC6414250 DOI: 10.1016/j.jecp.2019.01.006] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/13/2018] [Revised: 12/07/2018] [Accepted: 01/07/2019] [Indexed: 11/29/2022]
Abstract
The current investigation sought to differentiate between contrasting perspectives of body knowledge development by determining whether infants' adult-like scanning of male and female bodies is dependent on relevant information from the face/head alone, the body alone, or a combination of both sources. Scanning patterns of 3.5-, 6.5-, and 9-month-olds (N = 80) in response to images that contained information relevant to sex classification in either the face/head or the body were examined. The results indicate that sex-specific scanning in the presence of only one source of relevant information (i.e., face/head or body) is present only at 9 months. Thus, although sex-specific scanning of bodies emerges as early as 3.5 months, information from both faces/heads and bodies is required until sometime between 6.5 and 9 months of age. These findings constrain theories of the development of social perception by documenting the complex interplay between body and face/head processing early in life.
Affiliation(s)
- Hannah White
- University of Kentucky, Lexington, KY 40506, USA
- Alison Heck
- University of Kentucky, Lexington, KY 40506, USA
42
Peinkhofer C, Knudsen GM, Moretti R, Kondziella D. Cortical modulation of pupillary function: systematic review. PeerJ 2019; 7:e6882. [PMID: 31119083 PMCID: PMC6510220 DOI: 10.7717/peerj.6882] [Citation(s) in RCA: 28] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/30/2019] [Accepted: 03/26/2019] [Indexed: 12/25/2022] Open
Abstract
BACKGROUND The pupillary light reflex is the main mechanism that regulates the pupillary diameter; it is controlled by the autonomic system and mediated by subcortical pathways. In addition, cognitive and emotional processes influence pupillary function due to input from cortical innervation, but the exact circuits remain poorly understood. We performed a systematic review to evaluate the mechanisms behind pupillary changes associated with cognitive efforts and processing of emotions and to investigate the cerebral areas involved in cortical modulation of the pupillary light reflex. METHODOLOGY We searched multiple databases until November 2018 for studies on cortical modulation of pupillary function in humans and non-human primates. Of 8,809 papers screened, 258 studies were included. RESULTS Most investigators focused on pupillary dilatation and/or constriction as an index of cognitive and emotional processing, evaluating how changes in pupillary diameter reflect levels of attention and arousal. Only few tried to correlate specific cerebral areas to pupillary changes, using either cortical activation models (employing micro-stimulation of cortical structures in non-human primates) or cortical lesion models (e.g., investigating patients with stroke and damage to salient cortical and/or subcortical areas). Results suggest the involvement of several cortical regions, including the insular cortex (Brodmann areas 13 and 16), the frontal eye field (Brodmann area 8) and the prefrontal cortex (Brodmann areas 11 and 25), and of subcortical structures such as the locus coeruleus and the superior colliculus. CONCLUSIONS Pupillary dilatation occurs with many kinds of mental or emotional processes, following sympathetic activation or parasympathetic inhibition. Conversely, pupillary constriction may occur with anticipation of a bright stimulus (even in its absence) and relies on a parasympathetic activation. All these reactions are controlled by subcortical and cortical structures that are directly or indirectly connected to the brainstem pupillary innervation system.
Affiliation(s)
- Costanza Peinkhofer
- Department of Neurology, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark
- Medical Faculty, University of Trieste, Trieste, Italy
| | - Gitte M. Knudsen
- Department of Neurology, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark
- Neurobiology Research Unit, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark
- Faculty of Health and Medical Science, University of Copenhagen, Copenhagen, Denmark
| | - Rita Moretti
- Medical Faculty, University of Trieste, Trieste, Italy
- Department of Medical, Surgical and Health Sciences, Neurological Unit, Trieste University Hospital, Cattinara, Trieste, Italy
| | - Daniel Kondziella
- Department of Neurology, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark
- Faculty of Health and Medical Science, University of Copenhagen, Copenhagen, Denmark
- Department of Neuroscience, Norwegian University of Technology and Science, Trondheim, Norway
|
43
|
Quesque F, Behrens F, Kret ME. Pupils say more than a thousand words: Pupil size reflects how observed actions are interpreted. Cognition 2019; 190:93-98. [PMID: 31034971 DOI: 10.1016/j.cognition.2019.04.016] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/18/2018] [Revised: 04/18/2019] [Accepted: 04/19/2019] [Indexed: 01/15/2023]
Abstract
Humans attend to others' facial expressions and body language to better understand their emotions and to predict their goals and intentions. The eyes and their pupils reveal important social information. Because pupil size is beyond voluntary control yet reflects a range of cognitive and affective processes, pupils in principle have the potential to convey whether others' actions are interpreted correctly or not. Here, we measured pupil size while participants observed video clips showing reach-to-grasp arm movements. Expressors in the video clips were playing a board game and moved a dowel to a new position. Participants' task was to decide whether the dowel was repositioned with the intention of being followed up by another move of the same expressor (personal intention) or whether the arm movement carried the implicit message that the expressor's turn was over (social intention). Replicating earlier findings, results showed that participants recognized expressors' intentions on the basis of their arm kinematics. Results further showed that participants' pupil size was larger when observing actions reflecting personal compared to social intentions. Most interestingly, before participants indicated how they interpreted the observed actions by pressing one of two keys (corresponding to the personal or social intention), their pupils had, within a split second, already given away how they interpreted the expressor's movement. In sum, this study underscores the importance of nonverbal behavior in helping social messages get across quickly. By revealing how actions are interpreted, pupils may provide additional feedback for effective social interactions.
Affiliation(s)
- François Quesque
- University of Lille, CNRS, UMR 9193 - SCALab - Sciences Cognitives et Sciences Affectives, F-59000 Lille, France
- Friederike Behrens
- Leiden University, Cognitive Psychology Unit, Leiden, the Netherlands; Leiden Institute for Brain and Cognition (LIBC), the Netherlands
- Mariska E Kret
- Leiden University, Cognitive Psychology Unit, Leiden, the Netherlands; Leiden Institute for Brain and Cognition (LIBC), the Netherlands.
|
44
|
Vetter P, Badde S, Phelps EA, Carrasco M. Emotional faces guide the eyes in the absence of awareness. eLife 2019; 8:43467. [PMID: 30735123 PMCID: PMC6382349 DOI: 10.7554/elife.43467] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/07/2018] [Accepted: 02/07/2019] [Indexed: 12/14/2022] Open
Abstract
The ability to act quickly in response to a threat is a key skill for survival. When perceived with awareness, threat-related emotional information, such as an angry or fearful face, not only confers perceptual advantages but also guides rapid actions such as eye movements. Emotional information that is suppressed from awareness still confers perceptual and attentional benefits. However, it is unknown whether suppressed emotional information can directly guide actions, or whether emotional information has to enter awareness to do so. We suppressed emotional faces from awareness using continuous flash suppression and tracked eye gaze position. Under successful suppression, as indicated by objective and subjective measures, gaze moved towards fearful faces but away from angry faces. Our findings reveal that (1) threat-related emotional stimuli can guide eye movements in the absence of visual awareness, and (2) threat-related emotional face information guides distinct oculomotor actions depending on the type of threat conveyed by the emotional expression.
Affiliation(s)
- Petra Vetter
- Department of Psychology, Center for Neural Science, New York University, New York, United States; Department of Psychology, Royal Holloway, University of London, Egham, United Kingdom
- Stephanie Badde
- Department of Psychology, Center for Neural Science, New York University, New York, United States
- Elizabeth A Phelps
- Department of Psychology, Center for Neural Science, New York University, New York, United States; Department of Psychology, Harvard University, Cambridge, United States
- Marisa Carrasco
- Department of Psychology, Center for Neural Science, New York University, New York, United States
|
45
|
Watson R, de Gelder B. The representation and plasticity of body emotion expression. Psychol Res 2019; 84:1400-1406. [PMID: 30603865 DOI: 10.1007/s00426-018-1133-1] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/22/2015] [Accepted: 12/10/2018] [Indexed: 11/29/2022]
Abstract
Emotions are expressed by the face, the voice, and the whole body. Research on the face and the voice has demonstrated not only that emotions are perceived categorically, but also that this perception can be manipulated. The purpose of this study was to investigate, in two separate experiments using adaptation and multisensory techniques, whether the perception of body emotion expressions also shows categorical effects and plasticity. We used an approach developed for studies of face and voice emotion perception and created novel morphed affective body stimuli, which varied in small incremental steps between emotions. Participants were instructed to categorise the emotion of these morphed bodies after adaptation to bodies conveying different expressions (Experiment 1), or while simultaneously hearing affective voices (Experiment 2). We show not only that body expression is perceived categorically, but also that both adaptation to affective body expressions and concurrent presentation of vocal affective information can shift the categorical boundary between body expressions, specifically for angry body expressions. Overall, our findings provide significant new insights into emotional body categorisation, which may prove important for a deeper understanding of body expression perception in everyday social situations.
Affiliation(s)
- Rebecca Watson
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, 6229 EV, Maastricht, The Netherlands
- Beatrice de Gelder
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, 6229 EV, Maastricht, The Netherlands.
|
46
|
Parental negative emotions are related to behavioral and pupillary correlates of infants’ attention to facial expressions of emotion. Infant Behav Dev 2018; 53:101-111. [DOI: 10.1016/j.infbeh.2018.07.004] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/04/2018] [Revised: 07/21/2018] [Accepted: 07/29/2018] [Indexed: 11/20/2022]
|
47
|
Philip L, Martin JC, Clavel C. Suppression of Facial Mimicry of Negative Facial Expressions in an Incongruent Context. J PSYCHOPHYSIOL 2018. [DOI: 10.1027/0269-8803/a000191] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/23/2022]
Abstract
People react with Rapid Facial Reactions (RFRs) when presented with human facial emotional expressions. Recent studies show that RFRs are not always congruent with emotional cues, and the processes underlying them are still being debated. In the study described here, we manipulated the context of perception and examined its influence on RFRs, using a subliminal affective priming task with emotional labels. Facial electromyography (EMG) (frontalis, corrugator, zygomaticus, and depressor) was recorded while participants observed static facial expressions (joy, fear, anger, sadness, and a neutral expression), preceded or not by a subliminal word (JOY, FEAR, ANGER, SADNESS, or NEUTRAL). For the negative facial expressions, when the priming word was congruent with the facial expression, participants displayed congruent RFRs (mimicry); when the priming word was incongruent, mimicry was suppressed. Happiness was not affected by the priming word. RFRs thus appear to be modulated by the context and by the type of emotion presented via facial expressions.
|
48
|
Keil V, Hepach R, Vierrath S, Caffier D, Tuschen-Caffier B, Klein C, Schmitz J. Children with social anxiety disorder show blunted pupillary reactivity and altered eye contact processing in response to emotional faces: Insights from pupillometry and eye movements. J Anxiety Disord 2018; 58:61-69. [PMID: 30053635 DOI: 10.1016/j.janxdis.2018.07.001] [Citation(s) in RCA: 25] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 12/06/2017] [Revised: 06/25/2018] [Accepted: 07/09/2018] [Indexed: 01/09/2023]
Abstract
Cognitive models and adult research associate social anxiety disorder (SAD) with hypervigilant-avoidant processing of social information, such as eye contact. However, processing biases in childhood SAD remain mostly unexplored. We examined 10- to 13-year-old children's eye contact processing and pupil dilation in response to happy, neutral, and angry faces in three groups: SAD (n = 31), mixed anxiety disorders (MAD; n = 30), and healthy controls (HC; n = 32). Compared to HC, SAD children showed faster first fixations on the eye region of neutral faces and shorter first fixation durations on the eye region of all faces. No differences between the two clinical groups emerged in eye movement results. SAD girls showed reduced pupil dilation in response to happy and angry faces compared to MAD and to happy faces compared to HC. SAD boys showed reduced pupil dilation in response to neutral faces compared to HC. Dimensionally, reduced pupil dilation was linked to social anxiety severity while eye movements were correlated with mixed anxiety and depressive severity. Results suggest that hypervigilant-avoidant eye contact processing and a blunted pupillary reactivity characterize children with SAD. Both transdiagnostic and disorder-specific processing biases are relevant for the understanding of childhood SAD.
Affiliation(s)
- Verena Keil
- Department of Clinical Psychology and Psychotherapy, University of Freiburg, Germany.
- Robert Hepach
- Leipzig Research Center for Early Child Development, Leipzig University, Germany; Department of Research Methods in Early Child Development, Leipzig University, Germany
- Severin Vierrath
- Laboratory for MEMS Applications, IMTEK - Department of Microsystems Engineering, University of Freiburg, Germany
- Detlef Caffier
- Department of Clinical Psychology and Psychotherapy, University of Freiburg, Germany
- Christoph Klein
- Department of Child and Adolescent Psychiatry, Psychotherapy, and Psychosomatics, Medical Faculty, University of Freiburg, Germany; Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, Medical Faculty, University of Cologne, Germany
- Julian Schmitz
- Department of Clinical Child and Adolescent Psychology, Leipzig University, Germany; Leipzig Research Center for Early Child Development, Leipzig University, Germany
|
49
|
The dot-probe task to measure emotional attention: A suitable measure in comparative studies? Psychon Bull Rev 2018; 24:1686-1717. [PMID: 28092078 DOI: 10.3758/s13423-016-1224-1] [Citation(s) in RCA: 69] [Impact Index Per Article: 11.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/02/2023]
Abstract
For social animals, attending to and recognizing the emotional expressions of other individuals is of crucial importance for their survival and likely has a deep evolutionary origin. Insight into how emotional expressions evolved as adaptations can be gained by making direct cross-species comparisons. To that end, experimental paradigms that are suitable for investigating emotional processing across species need to be developed and evaluated. The emotional dot-probe task, which measures attention allocation toward emotional stimuli, has this potential: the task is implicit, and subjects need minimal training to perform it successfully. Findings in nonhuman primates, although scarce, show that they, like humans, have an attentional bias toward emotional stimuli. However, the wide literature on human studies has shown that several factors can have important moderating effects on the results, and due to the large heterogeneity of this literature these moderating effects often remain unnoticed. Here, we review this literature and show that subject characteristics and differences in experimental design affect the results of the dot-probe task. We conclude with specific recommendations regarding these issues that are particularly relevant to take into consideration when applying this paradigm to study animals.
|
50
|
Coverage of Emotion Recognition for Common Wearable Biosensors. Biosensors (Basel) 2018; 8:bios8020030. [PMID: 29587375 PMCID: PMC6023004 DOI: 10.3390/bios8020030] [Citation(s) in RCA: 34] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/20/2018] [Revised: 03/16/2018] [Accepted: 03/22/2018] [Indexed: 11/21/2022]
Abstract
The present research proposes a novel emotion recognition framework for the computer prediction of human emotions using common wearable biosensors. Emotional perception promotes specific patterns of biological responses in the human body, which can be sensed and used to predict emotions from biomedical measurements alone. Based on theoretical and empirical psychophysiological research, the principle of autonomic specificity provides a strong foundation for recognising human emotions by applying machine learning to physiological patterning. However, a systematic way of choosing which physiological data cover the elicited emotional responses for recognising the target emotions is not obvious. The current study demonstrates through experimental measurements the coverage of emotion recognition achievable with common off-the-shelf wearable biosensors, based on the synchronisation between audiovisual stimuli and the corresponding physiological responses. The work forms the basis for validating this hypothesis of emotional state recognition in the literature, and presents the coverage of common wearable biosensors coupled with a novel preprocessing algorithm to demonstrate the practical prediction of the emotional states of wearers.
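The core idea in this abstract, mapping patterns in wearable biosensor signals to a discrete emotional state with a learned classifier, can be illustrated with a toy sketch. This is not the authors' framework: the feature choices (mean heart rate and mean electrodermal activity), the nearest-centroid classifier, and all numbers are illustrative assumptions.

```python
# Toy sketch of emotion recognition from wearable biosensor signals:
# summarise each trial into a small feature vector, then assign the
# emotion label whose training centroid is closest. Hypothetical
# features and data; not the published framework.
from statistics import mean

def extract_features(hr_samples, eda_samples):
    """Summarise raw samples into (mean heart rate, mean electrodermal activity)."""
    return (mean(hr_samples), mean(eda_samples))

def fit_centroids(labelled_trials):
    """labelled_trials: list of (feature_vector, emotion_label).
    Returns the per-emotion centroid of the feature vectors."""
    sums, counts = {}, {}
    for feats, label in labelled_trials:
        s = sums.setdefault(label, [0.0] * len(feats))
        for i, v in enumerate(feats):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: tuple(v / counts[lab] for v in s) for lab, s in sums.items()}

def predict(centroids, feats):
    """Assign the emotion whose centroid is nearest (squared Euclidean)."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lab: dist(centroids[lab], feats))

# Hypothetical training data: calm trials with low heart rate and EDA,
# aroused trials with high heart rate and EDA.
trials = [
    (extract_features([62, 64, 63], [0.1, 0.2, 0.1]), "calm"),
    (extract_features([61, 60, 62], [0.2, 0.1, 0.2]), "calm"),
    (extract_features([95, 98, 97], [1.2, 1.4, 1.3]), "aroused"),
    (extract_features([92, 94, 96], [1.1, 1.3, 1.2]), "aroused"),
]
centroids = fit_centroids(trials)
print(predict(centroids, extract_features([90, 93, 91], [1.0, 1.2, 1.1])))  # prints "aroused"
```

A real system along these lines would replace the toy features with richer descriptors (heart-rate variability, EDA peaks, skin temperature) and the centroid rule with a trained classifier, but the pipeline shape, signal windows to features to label, is the same.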
|