1
Kroczek LOH, Lingnau A, Schwind V, Wolff C, Mühlberger A. Observers predict actions from facial emotional expressions during real-time social interactions. Behav Brain Res 2024; 471:115126. PMID: 38950784; DOI: 10.1016/j.bbr.2024.115126.
Abstract
In face-to-face social interactions, emotional expressions provide insights into the mental state of an interactive partner. This information can be crucial for inferring action intentions and reacting to another person's actions. Here we investigate how facial emotional expressions impact subjective experience and physiological and behavioral responses to social actions during real-time interactions. Thirty-two participants interacted with virtual agents while fully immersed in Virtual Reality. Agents displayed an angry or happy facial expression before they directed an appetitive (fist bump) or aversive (punch) social action towards the participant. Participants responded to these actions, either by reciprocating the fist bump or by defending against the punch. For all interactions, subjective experience was measured using ratings. In addition, physiological responses (electrodermal activity, electrocardiogram) and participants' response times were recorded. Aversive actions were judged to be more arousing and less pleasant than appetitive actions. In addition, angry expressions increased heart rate relative to happy expressions. Crucially, interaction effects between facial emotional expression and action were observed. Angry expressions reduced pleasantness more strongly for appetitive than for aversive actions. Furthermore, skin conductance responses to aversive actions were larger following happy than angry expressions, and reaction times were faster to aversive than to appetitive actions when agents showed an angry expression. These results indicate that observers used facial emotional expressions to generate expectations about particular actions. Consequently, the present study demonstrates that observers integrate information from facial emotional expressions with actions during social interactions.
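The 2 x 2 design described above (expression: angry vs. happy; action: appetitive vs. aversive) lends itself to a mixed-model test of the reported interaction effects. The sketch below is illustrative only, not the authors' analysis pipeline: the data are synthetic and the variable names (scr, expression, action, subject) are assumptions.

```python
# Minimal sketch (not the authors' code): testing the expression x action interaction
# on trial-level skin conductance responses with a linear mixed model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
subjects = np.repeat(np.arange(32), 40)                          # 32 participants, 40 trials each
expression = np.tile(np.repeat(["angry", "happy"], 20), 32)      # 2 expressions
action = np.tile(np.repeat(["aversive", "appetitive"], 10), 64)  # 2 actions
scr = rng.normal(0.5, 0.2, size=subjects.size)                   # placeholder SCR amplitudes

trials = pd.DataFrame({"subject": subjects, "expression": expression,
                       "action": action, "scr": scr})

# Random intercept per participant; the 'expression:action' term indexes the interaction.
model = smf.mixedlm("scr ~ expression * action", data=trials, groups=trials["subject"])
print(model.fit().summary())
```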
Affiliation(s)
- Leon O H Kroczek
- Department of Psychology, Clinical Psychology and Psychotherapy, University of Regensburg, Regensburg, Germany.
- Angelika Lingnau
- Department of Psychology, Cognitive Neuroscience, University of Regensburg, Regensburg, Germany
- Valentin Schwind
- Human Computer Interaction, University of Applied Sciences in Frankfurt am Main, Frankfurt am Main, Germany; Department of Media Informatics, University of Regensburg, Regensburg, Germany
- Christian Wolff
- Department of Media Informatics, University of Regensburg, Regensburg, Germany
- Andreas Mühlberger
- Department of Psychology, Clinical Psychology and Psychotherapy, University of Regensburg, Regensburg, Germany
2
Duken SB, Keessen L, Hoijtink H, Kindt M, van Ast VA. Bayesian evaluation of diverging theories of episodic and affective memory distortions in dysphoria. Nat Commun 2024; 15:1320. PMID: 38351107; PMCID: PMC10864297; DOI: 10.1038/s41467-024-45203-4.
Abstract
People suffering from dysphoria retrieve autobiographical memories distorted in content and affect, which may contribute to the aetiology and maintenance of depression. However, key memory difficulties in dysphoria remain elusive because theories disagree on how memories of different valence are altered. Here, we assessed the psychophysiological expression of affect and retrieved episodic detail while participants with dysphoria (but without a diagnosed mental illness) and participants without dysphoria relived positive, negative, and neutral memories. We show that participants with dysphoria retrieve positive memories with diminished episodic detail and negative memories with enhanced detail, compared to participants without dysphoria. This is in line with negativity bias theories but not with overgeneral memory bias theories. According to confirmatory analyses, participants with dysphoria also express diminished positive affect and enhanced negative affect when retrieving happy memories, but exploratory analyses suggest that this increase in negative affect may not be robust. Further confirmatory analyses showed that affective responses to memories are not related to episodic detail and are already present during the experience of new emotional events. Our results indicate that affective memory distortions may not emerge from mnemonic processes but from general distortions in positive affect, which challenges assumptions of memory theories and therapeutics. Protocol registration: The Stage 1 protocol for this Registered Report was accepted in principle on the 18th of March 2021. The protocol, as accepted by the journal, can be found at https://doi.org/10.6084/m9.figshare.14605374.v1.
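The Bayesian evaluation referred to in the title rests on comparing informative (order-constrained) hypotheses. Below is a minimal sketch of the underlying encompassing-prior logic; the posterior draws, priors, and the specific constraint are invented for illustration and do not reproduce the authors' registered analyses.

```python
# Minimal sketch of Bayes-factor evaluation of an order-constrained hypothesis via the
# encompassing-prior approach; all numbers are placeholders, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
n_draws = 100_000

# Hypothetical posterior draws of mean episodic detail for positive memories,
# e.g. obtained from a fitted Bayesian model for each group.
post_dysphoric = rng.normal(loc=4.2, scale=0.15, size=n_draws)
post_control = rng.normal(loc=4.8, scale=0.15, size=n_draws)

# Vague encompassing prior on the same quantities.
prior_dysphoric = rng.normal(loc=0.0, scale=10.0, size=n_draws)
prior_control = rng.normal(loc=0.0, scale=10.0, size=n_draws)

# H1 (negativity-bias style prediction): dysphoric < control on positive detail.
fit = np.mean(post_dysphoric < post_control)           # posterior mass agreeing with H1
complexity = np.mean(prior_dysphoric < prior_control)  # prior mass agreeing with H1 (~0.5)

bf_1u = fit / complexity  # Bayes factor of H1 against the unconstrained hypothesis
print(f"BF_1u = {bf_1u:.2f}")
```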
Affiliation(s)
- Sascha B Duken
- Department of Clinical Psychology, University of Amsterdam, Amsterdam, the Netherlands.
- Liza Keessen
- Amsterdam School of Communication Research, University of Amsterdam, Amsterdam, the Netherlands
- Herbert Hoijtink
- Department of Methodology and Statistics, Utrecht University, Utrecht, the Netherlands
- Merel Kindt
- Department of Clinical Psychology, University of Amsterdam, Amsterdam, the Netherlands
- Vanessa A van Ast
- Department of Clinical Psychology, University of Amsterdam, Amsterdam, the Netherlands
3
Kabulska Z, Lingnau A. The cognitive structure underlying the organization of observed actions. Behav Res Methods 2023; 55:1890-1906. PMID: 35788973; PMCID: PMC10250259; DOI: 10.3758/s13428-022-01894-5.
Abstract
In daily life, we frequently encounter actions performed by other people. Here we aimed to examine the key categories and features underlying the organization of a wide range of actions in three behavioral experiments (N = 378 participants). In Experiment 1, we used a multi-arrangement task of 100 different actions. Inverse multidimensional scaling and hierarchical clustering revealed 11 action categories, including Locomotion, Communication, and Aggressive actions. In Experiment 2, we used a feature-listing paradigm to obtain a wide range of action features that were subsequently reduced to 59 key features and used in a rating study (Experiment 3). A direct comparison of the feature ratings obtained in Experiment 3 between actions belonging to the categories identified in Experiment 1 revealed a number of features that appear to be critical for the distinction between these categories, e.g., the features Harm and Noise for the category Aggressive actions, and the features Targeting a person and Contact with others for the category Interaction. Finally, we found that part of the category-based organization was explained by a combination of weighted features, whereas a significant proportion of the variability remained unexplained, suggesting that additional sources of information contribute to the categorization of observed actions. The characterization of action categories and their associated features serves as an important extension of previous studies examining the cognitive structure of actions. Moreover, our results may serve as the basis for future behavioral, neuroimaging, and computational modeling studies.
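The category structure in Experiment 1 was obtained by inverse multidimensional scaling followed by hierarchical clustering. The sketch below illustrates that clustering step on a placeholder dissimilarity matrix rather than the authors' behavioral data; cutting the tree into 11 clusters mirrors the number of categories reported.

```python
# Minimal sketch (hypothetical data, not the authors' pipeline): deriving action
# categories by hierarchical clustering of a pairwise dissimilarity matrix, as one
# would obtain from inverse MDS of a multi-arrangement task.
import numpy as np
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, fcluster

n_actions = 100
rng = np.random.default_rng(1)

# Placeholder 100 x 100 symmetric dissimilarity matrix with zero diagonal.
d = rng.random((n_actions, n_actions))
dissim = (d + d.T) / 2
np.fill_diagonal(dissim, 0.0)

# Condensed form -> average-linkage clustering -> cut the tree into 11 clusters.
condensed = squareform(dissim, checks=False)
tree = linkage(condensed, method="average")
categories = fcluster(tree, t=11, criterion="maxclust")
print(np.bincount(categories)[1:])  # number of actions assigned to each category
```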
Affiliation(s)
- Zuzanna Kabulska
- Department of Psychology, Faculty of Human Sciences, University of Regensburg, Universitätsstraße 31, 93053, Regensburg, Germany
- Angelika Lingnau
- Department of Psychology, Faculty of Human Sciences, University of Regensburg, Universitätsstraße 31, 93053, Regensburg, Germany.
4
Mennella R, Bavard S, Mentec I, Grèzes J. Spontaneous instrumental avoidance learning in social contexts. Sci Rep 2022; 12:17528. PMID: 36266316; PMCID: PMC9585085; DOI: 10.1038/s41598-022-22334-6.
Abstract
Adaptation to our social environment requires learning how to avoid potentially harmful situations, such as encounters with aggressive individuals. Threatening facial expressions can evoke automatic stimulus-driven reactions, but whether their aversive motivational value suffices to drive instrumental active avoidance remains unclear. When asked to freely choose between different action alternatives, participants spontaneously, without instruction or monetary reward, developed a preference for choices that maximized the probability of avoiding angry individuals (sitting away from them in a waiting room). Most participants showed clear behavioral signs of instrumental learning, even in the absence of an explicit avoidance strategy. Inter-individual variability in learning depended on participants' subjective evaluations and sensitivity to threat approach feedback. Counterfactual learning best accounted for avoidance behaviors, especially in participants who developed an explicit avoidance strategy. Our results demonstrate that implicit defensive behaviors in social contexts are likely the product of several learning processes, including instrumental learning.
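The counterfactual learning account mentioned above can be illustrated with a simple value-update rule in which the unchosen option is also updated from the forgone outcome. The sketch below is a generic toy model with invented parameters and feedback coding, not the authors' fitted model.

```python
# Minimal sketch of a counterfactual value-update rule for a two-option avoidance choice;
# outcome coding (1 = avoided the angry individual, 0 = approached) is illustrative.
import numpy as np

def softmax(q, beta):
    """Choice probabilities from option values with inverse temperature beta."""
    z = beta * (q - q.max())
    p = np.exp(z)
    return p / p.sum()

def counterfactual_update(q, choice, outcome, counterfactual_outcome, alpha=0.3):
    """Update the chosen option from its outcome and the unchosen option from the
    forgone (counterfactual) outcome."""
    q = q.copy()
    unchosen = 1 - choice
    q[choice] += alpha * (outcome - q[choice])
    q[unchosen] += alpha * (counterfactual_outcome - q[unchosen])
    return q

q = np.zeros(2)  # values of the two seating choices
for trial in range(100):
    p = softmax(q, beta=5.0)
    choice = np.random.choice(2, p=p)
    outcome = 1.0 if choice == 1 else 0.0           # toy deterministic feedback
    q = counterfactual_update(q, choice, outcome, 1.0 - outcome)
print(q, p)  # option 1 should come to dominate choice probabilities
```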
Affiliation(s)
- Rocco Mennella
- Laboratoire des Interactions Cognition, Action, Émotion (LICAÉ), Université Paris Nanterre, 200 Avenue de La République, 92001 Nanterre Cedex, France; Cognitive and Computational Neuroscience Laboratory (LNC2), Inserm U960, Department of Cognitive Studies, École Normale Supérieure, PSL University, 29 Rue d'Ulm, 75005 Paris, France
- Sophie Bavard
- Cognitive and Computational Neuroscience Laboratory (LNC2), Inserm U960, Department of Cognitive Studies, École Normale Supérieure, PSL University, 29 Rue d'Ulm, 75005 Paris, France; Department of Psychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
- Inès Mentec
- Cognitive and Computational Neuroscience Laboratory (LNC2), Inserm U960, Department of Cognitive Studies, École Normale Supérieure, PSL University, 29 Rue d'Ulm, 75005 Paris, France
- Julie Grèzes
- Cognitive and Computational Neuroscience Laboratory (LNC2), Inserm U960, Department of Cognitive Studies, École Normale Supérieure, PSL University, 29 Rue d'Ulm, 75005 Paris, France
5
Zhuang T, Lingnau A. The characterization of actions at the superordinate, basic and subordinate level. Psychol Res 2021; 86:1871-1891. PMID: 34907466; PMCID: PMC9363348; DOI: 10.1007/s00426-021-01624-0.
Abstract
Objects can be categorized at different levels of abstraction, ranging from the superordinate (e.g., fruit) and the basic (e.g., apple) to the subordinate level (e.g., golden delicious). The basic level is assumed to play a key role in categorization, e.g., in terms of the number of features used to describe these categories and the speed of processing. To what degree do these principles also apply to the categorization of observed actions? To address this question, we first selected a range of actions at the superordinate (e.g., locomotion), basic (e.g., to swim) and subordinate level (e.g., to swim breaststroke), using verbal material (Experiments 1-3). Experiments 4-6 aimed to determine the characteristics of these actions across the three taxonomic levels. Using a feature-listing paradigm (Experiment 4), we determined the number of features that were provided by at least six out of twenty participants (common features), separately for the three levels. In addition, we examined the number of shared (i.e., provided for more than one category) and distinct (i.e., provided for one category only) features. Participants produced the highest number of common features for actions at the basic level. Actions at the subordinate level shared more features with other actions at the same level than actions at the superordinate level did. Actions at the superordinate and basic level were described with more distinct features than those at the subordinate level. Using an auditory priming paradigm (Experiment 5), we observed that participants responded faster to action images preceded by a matching auditory cue at the basic or subordinate level, but not at the superordinate level, suggesting that the basic level is the most abstract level at which verbal cues facilitate the processing of an upcoming action. Using a category verification task (Experiment 6), we found that participants were faster and more accurate at verifying action categories (depicted as images) at the basic and subordinate level than at the superordinate level. Together, in line with the object categorization literature, our results suggest that information about action categories is maximized at the basic level.
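The feature-listing measures of Experiment 4 (common, shared, and distinct features) amount to simple counts over participants and categories. The sketch below uses invented toy listings, not the authors' data; only the threshold of six out of twenty participants follows the abstract.

```python
# Minimal sketch of the feature-listing counts: "common" features named by at least
# 6 of 20 participants, and "shared" vs. "distinct" features across categories.
from collections import Counter

# Hypothetical listings: category -> one feature list per participant (20 participants).
listings = {
    "to swim": [["uses arms", "in water", "tiring"], ["in water", "uses legs"]] * 10,
    "to run":  [["uses legs", "fast", "tiring"], ["uses legs", "outdoors"]] * 10,
}

MIN_PARTICIPANTS = 6

def common_features(per_participant_lists, threshold=MIN_PARTICIPANTS):
    """Features named by at least `threshold` participants (each counted once per person)."""
    counts = Counter(f for feats in per_participant_lists for f in set(feats))
    return {f for f, n in counts.items() if n >= threshold}

common = {cat: common_features(lists) for cat, lists in listings.items()}

for cat, feats in common.items():
    others = set().union(*(f for c, f in common.items() if c != cat))
    shared = feats & others    # also produced for at least one other category
    distinct = feats - others  # produced for this category only
    print(cat, "| common:", len(feats), "| shared:", len(shared), "| distinct:", len(distinct))
```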
Affiliation(s)
- Tonghe Zhuang
- Chair of Cognitive Neuroscience, Faculty of Human Sciences, Institute of Psychology, University of Regensburg, Universitätsstrasse 31, 93053, Regensburg, Germany
- Angelika Lingnau
- Chair of Cognitive Neuroscience, Faculty of Human Sciences, Institute of Psychology, University of Regensburg, Universitätsstrasse 31, 93053, Regensburg, Germany.