1. Cowen AS, Brooks JA, Prasad G, Tanaka M, Kamitani Y, Kirilyuk V, Somandepalli K, Jou B, Schroff F, Adam H, Sauter D, Fang X, Manokara K, Tzirakis P, Oh M, Keltner D. How emotion is experienced and expressed in multiple cultures: a large-scale experiment across North America, Europe, and Japan. Front Psychol 2024; 15:1350631. PMID: 38966733; PMCID: PMC11223574; DOI: 10.3389/fpsyg.2024.1350631.
Abstract
Core to understanding emotion are subjective experiences and their expression in facial behavior. Past studies have largely focused on six emotions and prototypical facial poses, reflecting limitations in scale and narrow assumptions about the variety of emotions and their patterns of expression. We examine 45,231 facial reactions to 2,185 evocative videos, largely in North America, Europe, and Japan, collecting participants' self-reported experiences in English or Japanese and manual and automated annotations of facial movement. Guided by Semantic Space Theory, we uncover 21 dimensions of emotion in the self-reported experiences of participants in Japan, the United States, and Western Europe, and considerable cross-cultural similarities in experience. Facial expressions predict at least 12 dimensions of experience, despite massive individual differences in experience. We find considerable cross-cultural convergence in the facial actions involved in the expression of emotion, and culture-specific display tendencies: many facial movements differ in intensity in Japan compared to the U.S./Canada and Europe but represent similar experiences. These results quantitatively detail that people in dramatically different cultures experience and express emotion in a high-dimensional, categorical, and similar but complex fashion.
Affiliation(s)
- Alan S. Cowen
  - Hume AI, New York, NY, United States
  - Department of Psychology, University of California, Berkeley, Berkeley, CA, United States
- Jeffrey A. Brooks
  - Hume AI, New York, NY, United States
  - Department of Psychology, University of California, Berkeley, Berkeley, CA, United States
- Misato Tanaka
  - Advanced Telecommunications Research Institute, Kyoto, Japan
  - Graduate School of Informatics, Kyoto University, Kyoto, Japan
- Yukiyasu Kamitani
  - Advanced Telecommunications Research Institute, Kyoto, Japan
  - Graduate School of Informatics, Kyoto University, Kyoto, Japan
- Krishna Somandepalli
  - Google Research, Mountain View, CA, United States
  - Department of Electrical Engineering, University of Southern California, Los Angeles, CA, United States
- Brendan Jou
  - Google Research, Mountain View, CA, United States
- Hartwig Adam
  - Google Research, Mountain View, CA, United States
- Disa Sauter
  - Faculty of Social and Behavioural Sciences, University of Amsterdam, Amsterdam, Netherlands
- Xia Fang
  - Zhejiang University, Zhejiang, China
- Kunalan Manokara
  - Faculty of Social and Behavioural Sciences, University of Amsterdam, Amsterdam, Netherlands
- Moses Oh
  - Hume AI, New York, NY, United States
- Dacher Keltner
  - Hume AI, New York, NY, United States
  - Department of Psychology, University of California, Berkeley, Berkeley, CA, United States
2. Tessier MH, Mazet JP, Gagner E, Marcoux A, Jackson PL. Facial representations of complex affective states combining pain and a negative emotion. Sci Rep 2024; 14:11686. PMID: 38777852; PMCID: PMC11111784; DOI: 10.1038/s41598-024-62423-2.
Abstract
Pain is rarely communicated alone, as it is often accompanied by emotions such as anger or sadness. Communicating these affective states involves shared representations. However, how an individual conceptually represents these combined states must first be tested. The objective of this study was to measure the interaction between pain and negative emotions on two types of facial representations of these states, namely visual (i.e., interactive virtual agents; VAs) and sensorimotor (i.e., one's production of facial configurations). Twenty-eight participants (15 women) read short written scenarios involving only pain or a combined experience of pain and a negative emotion (anger, disgust, fear, or sadness). They produced facial configurations representing these experiences on the faces of the VAs and on their own faces (own production or imitation of the VAs). The results suggest that affective states related to a direct threat to the body (i.e., anger, disgust, and pain) share a similar facial representation, while those that present no immediate danger (i.e., fear and sadness) differ. Although visual and sensorimotor representations of these states provide congruent affective information, they are differently influenced by factors associated with the communication cycle. These findings contribute to our understanding of pain communication in different affective contexts.
Affiliation(s)
- Marie-Hélène Tessier
  - School of Psychology, Université Laval, Québec City, Canada
  - Centre for Interdisciplinary Research in Rehabilitation and Social Integration (Cirris), Québec City, Canada
  - CERVO Brain Research Centre, Québec City, Canada
- Jean-Philippe Mazet
  - Department of Computer Science and Software Engineering, Université Laval, Québec City, Canada
- Elliot Gagner
  - School of Psychology, Université Laval, Québec City, Canada
  - Centre for Interdisciplinary Research in Rehabilitation and Social Integration (Cirris), Québec City, Canada
  - CERVO Brain Research Centre, Québec City, Canada
- Audrey Marcoux
  - School of Psychology, Université Laval, Québec City, Canada
  - Centre for Interdisciplinary Research in Rehabilitation and Social Integration (Cirris), Québec City, Canada
  - CERVO Brain Research Centre, Québec City, Canada
- Philip L Jackson
  - School of Psychology, Université Laval, Québec City, Canada
  - Centre for Interdisciplinary Research in Rehabilitation and Social Integration (Cirris), Québec City, Canada
  - CERVO Brain Research Centre, Québec City, Canada
3. Goel S, Jara-Ettinger J, Ong DC, Gendron M. Face and context integration in emotion inference is limited and variable across categories and individuals. Nat Commun 2024; 15:2443. PMID: 38499519; PMCID: PMC10948792; DOI: 10.1038/s41467-024-46670-5.
Abstract
The ability to make nuanced inferences about other people's emotional states is central to social functioning. While emotion inferences can be sensitive to both facial movements and the situational context that they occur in, relatively little is understood about when these two sources of information are integrated across emotion categories and individuals. In a series of studies, we use one archival and five empirical datasets to demonstrate that people could be integrating, but that emotion inferences are just as well (and sometimes better) captured by knowledge of the situation alone, while isolated facial cues are insufficient. Further, people integrate facial cues more for categories for which they most frequently encounter facial expressions in everyday life (e.g., happiness). People are also moderately stable over time in their reliance on situational cues and integration of cues, and those who reliably utilize situational cues more also have better situated emotion knowledge. These findings underscore the importance of studying variability in reliance on and integration of cues.
Affiliation(s)
- Srishti Goel
  - Department of Psychology, Yale University, 100 College St, New Haven, CT, USA
- Julian Jara-Ettinger
  - Department of Psychology, Yale University, 100 College St, New Haven, CT, USA
  - Wu Tsai Institute, Yale University, 100 College St, New Haven, CT, USA
- Desmond C Ong
  - Department of Psychology, The University of Texas at Austin, 108 E Dean Keeton St, Austin, TX, USA
- Maria Gendron
  - Department of Psychology, Yale University, 100 College St, New Haven, CT, USA
4. Fischer H, Nilsson ME, Ebner NC. Why the Single-N Design Should Be the Default in Affective Neuroscience. Affect Sci 2024; 5:62-66. PMID: 38495781; PMCID: PMC10942943; DOI: 10.1007/s42761-023-00182-5.
Abstract
Many studies in affective neuroscience rely on statistical procedures designed to estimate population averages and base their main conclusions on group averages. However, the obvious unit of analysis in affective neuroscience is the individual, not the group, because emotions are individual phenomena that typically vary across individuals. Conclusions based on group averages may therefore be misleading or wrong, if interpreted as statements about emotions of an individual, or meaningless, if interpreted as statements about the group, which has no emotions. We therefore advocate the Single-N design as the default strategy in research on emotions, testing one or several individuals extensively with the primary purpose of obtaining results at the individual level. In neuroscience, the equivalent to the Single-N design is deep imaging, the emerging trend of extensive measurements of activity in single brains. Apart from the fact that individuals react differently to emotional stimuli, they also vary in shape and size of their brains. Group-based analysis of brain imaging data therefore refers to an "average brain" that was activated in a way that may not be representative of the physiology of any of the tested individual brains, nor of how these brains responded to the experimental stimuli. Deep imaging avoids such group-averaging artifacts by simply focusing on the individual brain. This methodological shift toward individual analysis has already opened new research areas in fields like vision science. Inspired by this, we call for a corresponding shift in affective neuroscience, away from group averages, and toward experimental designs targeting the individual.
Affiliation(s)
- Håkan Fischer
  - Department of Psychology, Stockholm University, 106 91 Stockholm, Sweden
  - Stockholm University Brain Imaging Center (SUBIC), 106 91 Stockholm, Sweden
  - Department of Psychology, University of Florida, Gainesville, FL 32611, USA
- Mats E. Nilsson
  - Department of Psychology, Stockholm University, 106 91 Stockholm, Sweden
- Natalie C. Ebner
  - Department of Psychology, University of Florida, Gainesville, FL 32611, USA
  - Institute of Aging, University of Florida, Gainesville, FL 32611, USA
  - Center for Cognitive Aging and Memory, Department of Clinical and Health Psychology, University of Florida, Gainesville, FL 32611, USA
  - Florida Institute for Cybersecurity Research, University of Florida, Gainesville, FL 32610-0165, USA
5. Hsu CT, Sato W, Yoshikawa S. An investigation of the modulatory effects of empathic and autistic traits on emotional and facial motor responses during live social interactions. PLoS One 2024; 19:e0290765. PMID: 38194416; PMCID: PMC10775989; DOI: 10.1371/journal.pone.0290765.
Abstract
A close relationship between emotional contagion and spontaneous facial mimicry has been theoretically proposed and is supported by empirical data. Facial expressions are essential in terms of both emotional and motor synchrony. Previous studies have demonstrated that trait emotional empathy enhanced spontaneous facial mimicry, but the relationship between autistic traits and spontaneous mimicry remained controversial. Moreover, previous studies presented faces that were static or videotaped, which may lack the "liveliness" of real-life social interactions. We addressed this limitation by using an image relay system to present live performances and pre-recorded videos of smiling or frowning dynamic facial expressions to 94 healthy female participants. We assessed their subjective experiential valence and arousal ratings to infer the amplitude of emotional contagion. We measured the electromyographic activities of the zygomaticus major and corrugator supercilii muscles to estimate spontaneous facial mimicry. Individual difference measures included trait emotional empathy (empathic concern) and the autism-spectrum quotient. We did not find that live performances enhanced the modulatory effect of trait differences on emotional contagion or spontaneous facial mimicry. However, we found that a high trait empathic concern was associated with stronger emotional contagion and corrugator mimicry. We found no two-way interaction between the autism-spectrum quotient and emotional condition, suggesting that autistic traits did not modulate emotional contagion or spontaneous facial mimicry. Our findings imply that previous findings regarding the relationship between emotional empathy and emotional contagion/spontaneous facial mimicry using videos and photos could be generalized to real-life interactions.
Affiliation(s)
- Chun-Ting Hsu
  - Psychological Process Research Team, Guardian Robot Project, RIKEN, Soraku-gun, Kyoto, Japan
- Wataru Sato
  - Psychological Process Research Team, Guardian Robot Project, RIKEN, Soraku-gun, Kyoto, Japan
- Sakiko Yoshikawa
  - Institute of Philosophy and Human Values, Kyoto University of the Arts, Kyoto, Japan
6. Li Z, Lu H, Liu D, Yu ANC, Gendron M. Emotional event perception is related to lexical complexity and emotion knowledge. Commun Psychol 2023; 1:45. PMID: 39242918; PMCID: PMC11332234; DOI: 10.1038/s44271-023-00039-4.
Abstract
Inferring emotion is a critical skill that supports social functioning. Emotion inferences are typically studied in simplistic paradigms by asking people to categorize isolated and static cues like frowning faces. Yet emotions are complex events that unfold over time. Here, across three samples (Study 1 N = 222; Study 2 N = 261; Study 3 N = 101), we present the Emotion Segmentation Paradigm to examine inferences about complex emotional events by extending cognitive paradigms examining event perception. Participants were asked to indicate when there were changes in the emotions of target individuals within continuous streams of activity in narrative film (Study 1) and documentary clips (Study 2, preregistered, and Study 3 test-retest sample). This Emotion Segmentation Paradigm revealed robust and reliable individual differences across multiple metrics. We also tested the constructionist prediction that emotion labels constrain emotion inference, which is traditionally studied by introducing emotion labels. We demonstrate that individual differences in active emotion vocabulary (i.e., readily accessible emotion words) correlate with emotion segmentation performance.
Affiliation(s)
- Zhimeng Li
  - Department of Psychology, Yale University, New Haven, CT, USA
- Hanxiao Lu
  - Department of Psychology, New York University, New York, NY, USA
- Di Liu
  - Department of Psychology, Johns Hopkins University, Baltimore, MD, USA
- Alessandra N C Yu
  - Nash Family Department of Neuroscience, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Maria Gendron
  - Department of Psychology, Yale University, New Haven, CT, USA
7. Namba S, Sato W, Namba S, Nomiya H, Shimokawa K, Osumi M. Development of the RIKEN database for dynamic facial expressions with multiple angles. Sci Rep 2023; 13:21785. PMID: 38066065; PMCID: PMC10709572; DOI: 10.1038/s41598-023-49209-8.
Abstract
The development of facial expression sensing is progressing in multidisciplinary fields, such as psychology, affective computing, and cognitive science. Previous facial datasets have not simultaneously dealt with multiple theoretical views of emotion, individualized context, or multi-angle/depth information. We developed a new facial database (RIKEN facial expression database) that includes multiple theoretical views of emotions and expressers' individualized events with multi-angle and depth information. The RIKEN facial expression database contains recordings of 48 Japanese participants captured using ten Kinect cameras at 25 events. This study identified several valence-related facial patterns and found them consistent with previous research investigating the coherence between facial movements and internal states. This database represents an advancement in developing a new sensing system, conducting psychological experiments, and understanding the complexity of emotional events.
Affiliation(s)
- Shushi Namba
  - RIKEN, Psychological Process Research Team, Guardian Robot Project, Kyoto, 6190288, Japan
  - Department of Psychology, Hiroshima University, Hiroshima, 7398524, Japan
- Wataru Sato
  - RIKEN, Psychological Process Research Team, Guardian Robot Project, Kyoto, 6190288, Japan
- Saori Namba
  - Department of Psychology, Hiroshima University, Hiroshima, 7398524, Japan
- Hiroki Nomiya
  - Faculty of Information and Human Sciences, Kyoto Institute of Technology, Kyoto, 6068585, Japan
- Koh Shimokawa
  - KOHINATA Limited Liability Company, Osaka, 5560020, Japan
- Masaki Osumi
  - KOHINATA Limited Liability Company, Osaka, 5560020, Japan
8. Straulino E, Scarpazza C, Spoto A, Betti S, Chozas Barrientos B, Sartori L. The Spatiotemporal Dynamics of Facial Movements Reveals the Left Side of a Posed Smile. Biology (Basel) 2023; 12:1160. PMID: 37759560; PMCID: PMC10525663; DOI: 10.3390/biology12091160.
Abstract
Humans can recombine thousands of different facial expressions. This variability is due to the ability to voluntarily or involuntarily modulate emotional expressions, which, in turn, depends on the existence of two anatomically separate pathways. The Voluntary (VP) and Involuntary (IP) pathways mediate the production of posed and spontaneous facial expressions, respectively, and might also affect the left and right sides of the face differently. This is a neglected aspect in the literature on emotion, where posed expressions instead of genuine expressions are often used as stimuli. Two experiments with different induction methods were specifically designed to investigate the unfolding of spontaneous and posed facial expressions of happiness along the facial vertical axis (left, right) with a high-definition 3-D optoelectronic system. The results showed that spontaneous expressions were distinguished from posed facial movements as revealed by reliable spatial and speed key kinematic patterns in both experiments. Moreover, VP activation produced a lateralization effect: compared with the felt smile, the posed smile involved an initial acceleration of the left corner of the mouth, while an early deceleration of the right corner occurred in the second phase of the movement, after the velocity peak.
Affiliation(s)
- Elisa Straulino
  - Department of General Psychology, University of Padova, Via Venezia 8, 35131 Padova, Italy
- Cristina Scarpazza
  - Department of General Psychology, University of Padova, Via Venezia 8, 35131 Padova, Italy
  - Translational Neuroimaging and Cognitive Lab, IRCCS San Camillo Hospital, Via Alberoni 70, 30126 Venice, Italy
- Andrea Spoto
  - Department of General Psychology, University of Padova, Via Venezia 8, 35131 Padova, Italy
- Sonia Betti
  - Department of Psychology, Centre for Studies and Research in Cognitive Neuroscience, University of Bologna, Viale Rasi e Spinelli 176, 47521 Cesena, Italy
- Beatriz Chozas Barrientos
  - Department of Chiropractic Medicine, University of Zurich, Balgrist University Hospital, Forchstrasse 340, 8008 Zürich, Switzerland
- Luisa Sartori
  - Department of General Psychology, University of Padova, Via Venezia 8, 35131 Padova, Italy
  - Padova Neuroscience Center, University of Padova, Via Giuseppe Orus 2, 35131 Padova, Italy
9. Long H, Peluso N, Baker CI, Japee S, Taubert J. A database of heterogeneous faces for studying naturalistic expressions. Sci Rep 2023; 13:5383. PMID: 37012369; PMCID: PMC10070342; DOI: 10.1038/s41598-023-32659-5.
Abstract
Facial expressions are thought to be complex visual signals, critical for communication between social agents. Most prior work aimed at understanding how facial expressions are recognized has relied on stimulus databases featuring posed facial expressions, designed to represent putative emotional categories (such as 'happy' and 'angry'). Here we use an alternative selection strategy to develop the Wild Faces Database (WFD): a set of one thousand images capturing a diverse range of ambient facial behaviors from outside of the laboratory. We characterized the perceived emotional content in these images using a standard categorization task in which participants were asked to classify the apparent facial expression in each image. In addition, participants were asked to indicate the intensity and genuineness of each expression. While modal scores indicate that the WFD captures a range of different emotional expressions, in comparing the WFD to images taken from other, more conventional databases, we found that participants responded more variably and less specifically to the wild-type faces, perhaps indicating that natural expressions are more multiplexed than a categorical model would predict. We argue that this variability can be employed to explore latent dimensions in our mental representation of facial expressions. Further, images in the WFD were rated as less intense and more genuine than images taken from other databases, suggesting a greater degree of authenticity among WFD images. The strong positive correlation between intensity and genuineness scores demonstrates that even the high arousal states captured in the WFD were perceived as authentic. Collectively, these findings highlight the potential utility of the WFD as a new resource for bridging the gap between the laboratory and real world in studies of expression recognition.
Affiliation(s)
- Houqiu Long
  - The School of Psychology, The University of Queensland, St Lucia, QLD, Australia
- Natalie Peluso
  - The School of Psychology, The University of Queensland, St Lucia, QLD, Australia
- Chris I Baker
  - Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
- Shruti Japee
  - Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
- Jessica Taubert
  - The School of Psychology, The University of Queensland, St Lucia, QLD, Australia
  - Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
10. Ventura-Bort C, Panza D, Weymar M. Words matter when inferring emotions: a conceptual replication and extension. Cogn Emot 2023:1-15. PMID: 36856025; DOI: 10.1080/02699931.2023.2183491.
Abstract
It has long been known that facial configurations play a critical role when inferring mental and emotional states from others. Nevertheless, there is still a scientific debate on how we infer emotions from facial configurations. The theory of constructed emotion (TCE) suggests that we may infer different emotions from the same facial configuration, depending on the context (e.g. provided by visual and lexical cues) in which they are perceived. For instance, a recent study found that participants were more accurate in inferring mental and emotional states across three different datasets (i.e. RMET, static and dynamic emojis) when words were provided (i.e. forced-choice task), compared to when they were not (i.e. free-labelling task), suggesting that words serve as contexts that modulate the inference from facial configurations. The goal of the current within-subject study was to replicate and extend these findings by adding a fourth dataset (KDEF-dyn), consisting of morphed human faces (to increase ecological validity). Replicating previous findings, we observed that words increased accuracy across the three (previously used) datasets, an effect that was also observed for the morphed facial stimuli. Our findings are in line with the TCE, providing support for the importance of contextual verbal cues in emotion perception.
Affiliation(s)
- C Ventura-Bort
  - Department of Biological Psychology and Affective Science, Faculty of Human Sciences, University of Potsdam, Potsdam, Germany
- D Panza
  - Department of Biological Psychology and Affective Science, Faculty of Human Sciences, University of Potsdam, Potsdam, Germany
- M Weymar
  - Department of Biological Psychology and Affective Science, Faculty of Human Sciences, University of Potsdam, Potsdam, Germany
  - Research Focus Cognitive Sciences, University of Potsdam, Potsdam, Germany
  - Faculty of Health Sciences Brandenburg, University of Potsdam, Potsdam, Germany
11. Höfling TTA, Alpers GW. Automatic facial coding predicts self-report of emotion, advertisement and brand effects elicited by video commercials. Front Neurosci 2023; 17:1125983. PMID: 37205049; PMCID: PMC10185761; DOI: 10.3389/fnins.2023.1125983.
Abstract
Introduction: Consumers' emotional responses are the prime target for marketing commercials. Facial expressions provide information about a person's emotional state, and technological advances have enabled machines to decode them automatically.
Method: With automatic facial coding, we investigated the relationships between facial movements (i.e., action unit activity) and self-reported emotion, advertisement, and brand effects. To this end, we recorded and analyzed the facial responses of 219 participants while they watched a broad array of video commercials.
Results: Facial expressions significantly predicted self-reported emotion as well as advertisement and brand effects. Interestingly, facial expressions had incremental value beyond self-reported emotion in the prediction of advertisement and brand effects. Hence, automatic facial coding appears useful as a non-verbal quantification of advertisement effects beyond self-report.
Discussion: This is the first study to measure a broad spectrum of automatically scored facial responses to video commercials. Automatic facial coding is a promising non-invasive and non-verbal method for measuring emotional responses in marketing.
12. Barrett LF. Context reconsidered: Complex signal ensembles, relational meaning, and population thinking in psychological science. Am Psychol 2022; 77:894-920. PMID: 36409120; PMCID: PMC9683522; DOI: 10.1037/amp0001054.
Abstract
This article considers the status and study of "context" in psychological science through the lens of research on emotional expressions. The article begins by updating three well-trod methodological debates on the role of context in emotional expressions to reconsider several fundamental assumptions lurking within the field's dominant methodological tradition: namely, that certain expressive movements have biologically prepared, inherent emotional meanings that issue from singular, universal processes which are independent of but interact with contextual influences. The second part of this article considers the scientific opportunities that await if we set aside this traditional understanding of "context" as a moderator of signals with inherent psychological meaning and instead consider the possibility that psychological events emerge in ecosystems of signal ensembles, such that the psychological meaning of any individual signal is entirely relational. Such a fundamental shift has radical implications not only for the science of emotion but for psychological science more generally. It offers opportunities to improve the validity and trustworthiness of psychological science beyond what can be achieved with improvements to methodological rigor alone.
13. Rychlowska M, McKeown GJ, Sneddon I, Curran W. The Role of Contextual Information in Classifying Spontaneous Social Laughter. J Nonverbal Behav 2022. DOI: 10.1007/s10919-022-00412-7.
Abstract
Laughter is a ubiquitous and important social signal, but its nature is yet to be fully explored. One of the open empirical questions is about the role of context in the interpretation of laughter. Can laughs presented on their own convey specific feelings and social motives? How influential is social context when a person tries to understand the meaning of a laugh? Here we test the extent to which the classification of laughs produced in different situations is guided by knowing the context within which these laughs were produced. In the current study, stimuli were spontaneous laughs recorded in social situations engineered to elicit amusement, embarrassment, and schadenfreude. In a between-subjects design, participants classified these laughs being assigned to one of the four experimental conditions: audio only, audio-visual, side-by-side videos of two interactants, and side-by-side videos accompanied by a brief vignette. Participants' task was to label each laugh as an instance of amusement, embarrassment, or schadenfreude laugh, or "other." Laughs produced in situations inducing embarrassment were classified more accurately than laughs produced in other situations. Most importantly, eliminating information about the social settings in which laughs were produced decreased participants' classification accuracy such that accuracy was no better than chance in the experimental conditions providing minimal contextual information. Our findings demonstrate the importance of context in the interpretation of laughter and highlight the complexity of experimental investigations of schadenfreude displays.
14
Jungilligens J, Paredes-Echeverri S, Popkirov S, Barrett LF, Perez DL. A new science of emotion: implications for functional neurological disorder. Brain 2022; 145:2648-2663. PMID: 35653495; PMCID: PMC9905015; DOI: 10.1093/brain/awac204.
Abstract
Functional neurological disorder reflects impairments in brain networks leading to distressing motor, sensory and/or cognitive symptoms that demonstrate positive clinical signs on examination incongruent with other conditions. A central issue in historical and contemporary formulations of functional neurological disorder has been the mechanistic and aetiological role of emotions. However, the debate has mostly omitted fundamental questions about the nature of emotions in the first place. In this perspective article, we first outline a set of relevant working principles of the brain (e.g. allostasis, predictive processing, interoception and affect), followed by a focused review of the theory of constructed emotion to introduce a new understanding of what emotions are. Building on this theoretical framework, we formulate how altered emotion category construction can be an integral component of the pathophysiology of functional neurological disorder and related functional somatic symptoms. In doing so, we address several themes for the functional neurological disorder field, including: (i) how energy regulation and the process of emotion category construction relate to symptom generation, including revisiting alexithymia, 'panic attack without panic', dissociation, insecure attachment and the influential role of life experiences; (ii) how select neurobiological research findings in functional neurological disorder cohorts can be re-interpreted through the lens of the theory of constructed emotion to illustrate its potential mechanistic relevance; and (iii) the therapeutic implications of this framework. While we continue to maintain that functional neurological disorder is mechanistically and aetiologically heterogeneous, consideration of how the theory of constructed emotion relates to the generation and maintenance of functional neurological and functional somatic symptoms offers an integrated viewpoint that cuts across neurology, psychiatry, psychology and cognitive-affective neuroscience.
Affiliation(s)
- Johannes Jungilligens
- Department of Neurology, University Hospital Knappschaftskrankenhaus Bochum, Ruhr University Bochum, Bochum, Germany
- Functional Neurological Disorder Unit, Division of Cognitive Behavioral Neurology, Department of Neurology, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
- Sara Paredes-Echeverri
- Functional Neurological Disorder Unit, Division of Cognitive Behavioral Neurology, Department of Neurology, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
- Stoyan Popkirov
- Department of Neurology, University Hospital Knappschaftskrankenhaus Bochum, Ruhr University Bochum, Bochum, Germany
- Lisa Feldman Barrett
- Department of Psychology, Northeastern University, Boston, MA, USA
- Psychiatric Neuroimaging Division, Department of Psychiatry, Massachusetts General Hospital and Harvard Medical School, Boston, MA, USA
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
- David L Perez
- Functional Neurological Disorder Unit, Division of Cognitive Behavioral Neurology, Department of Neurology, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
- Division of Neuropsychiatry, Department of Psychiatry, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
15
Ruba AL, Pollak SD, Saffran JR. Acquiring Complex Communicative Systems: Statistical Learning of Language and Emotion. Top Cogn Sci 2022; 14:432-450. PMID: 35398974; PMCID: PMC9465951; DOI: 10.1111/tops.12612.
Abstract
During the early postnatal years, most infants rapidly learn to understand two naturally evolved communication systems: language and emotion. While these two domains include different types of content knowledge, it is possible that similar learning processes subserve their acquisition. In this review, we compare the learnable statistical regularities in language and emotion input. We then consider how domain-general learning abilities may underlie the acquisition of language and emotion, and how this process may be constrained in each domain. This comparative developmental approach can advance our understanding of how humans learn to communicate with others.
Affiliation(s)
- Ashley L. Ruba
- Department of Psychology, University of Wisconsin – Madison
- Seth D. Pollak
- Department of Psychology, University of Wisconsin – Madison
16
Shaffer C, Westlin C, Quigley KS, Whitfield-Gabrieli S, Barrett LF. Allostasis, Action, and Affect in Depression: Insights from the Theory of Constructed Emotion. Annu Rev Clin Psychol 2022; 18:553-580. PMID: 35534123; PMCID: PMC9247744; DOI: 10.1146/annurev-clinpsy-081219-115627.
Abstract
The theory of constructed emotion is a systems neuroscience approach to understanding the nature of emotion. It is also a general theoretical framework to guide hypothesis generation for how actions and experiences are constructed as the brain continually anticipates metabolic needs and attempts to meet those needs before they arise (termed allostasis). In this review, we introduce this framework and hypothesize that allostatic dysregulation is a trans-disorder vulnerability for mental and physical illness. We then review published findings consistent with the hypothesis that several symptoms in major depressive disorder (MDD), such as fatigue, distress, context insensitivity, reward insensitivity, and motor retardation, are associated with persistent problems in energy regulation. Our approach transforms the current understanding of MDD as resulting from enhanced emotional reactivity combined with reduced cognitive control and, in doing so, offers novel hypotheses regarding the development, progression, treatment, and prevention of MDD.
Affiliation(s)
- Clare Shaffer
- Department of Psychology, Northeastern University, Boston, Massachusetts, USA
- Christiana Westlin
- Department of Psychology, Northeastern University, Boston, Massachusetts, USA
- Karen S Quigley
- Department of Psychology, Northeastern University, Boston, Massachusetts, USA
- VA Bedford Healthcare System, Bedford, Massachusetts, USA
- Susan Whitfield-Gabrieli
- Department of Psychology, Northeastern University, Boston, Massachusetts, USA
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts, USA
- Lisa Feldman Barrett
- Department of Psychology, Northeastern University, Boston, Massachusetts, USA
- Department of Psychiatry and the Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital and Harvard Medical School, Charlestown, Massachusetts, USA
17
Camacho MC, Williams EM, Balser D, Kamojjala R, Sekar N, Steinberger D, Yarlagadda S, Perlman SB, Barch DM. EmoCodes: a Standardized Coding System for Socio-emotional Content in Complex Video Stimuli. Affect Sci 2022; 3:168-181. PMID: 36046099; PMCID: PMC9383008; DOI: 10.1007/s42761-021-00100-7.
Abstract
Social information processing is vital for inferring emotional states in others, yet affective neuroscience has only begun to scratch the surface of how we represent emotional information in the brain. Most previous affective neuroscience work has used isolated stimuli, such as static images of affective faces or scenes, to probe affective processing. While this work has provided rich insight into the initial stages of emotion processing (encoding cues), activation to isolated stimuli provides limited insight into later phases of emotion processing, such as interpretation of cues or interactions between cues and established cognitive schemas. Recent work has highlighted the potential value of using complex video stimuli to probe socio-emotional processing, underscoring the need to develop standardized video coding schemas as this exciting field expands. Toward that end, we present a standardized and open-source coding system for complex videos, two fully coded videos, and a video- and code-processing Python library. The EmoCodes manual coding system provides an externally validated and replicable system for coding complex cartoon stimuli, with future plans to validate the system for other video types. The emocodes Python library provides automated tools for extracting low-level features from video files, as well as tools for summarizing and analyzing the manual codes for suitability of use in neuroimaging analysis. Materials can be freely accessed at https://emocodes.org/. These tools represent an important step toward replicable and standardized study of socio-emotional processing using complex video stimuli. Supplementary information: the online version contains supplementary material available at 10.1007/s42761-021-00100-7.
Affiliation(s)
- M. Catalina Camacho
- Department of Psychological and Brain Sciences, Washington University in St. Louis, One Brookings Drive, St. Louis, MO 63130, USA
- Elizabeth M. Williams
- Department of Psychological and Brain Sciences, Washington University in St. Louis, One Brookings Drive, St. Louis, MO 63130, USA
- Dori Balser
- Department of Psychological and Brain Sciences, Washington University in St. Louis, One Brookings Drive, St. Louis, MO 63130, USA
- Ruchika Kamojjala
- Department of Psychological and Brain Sciences, Washington University in St. Louis, One Brookings Drive, St. Louis, MO 63130, USA
- Nikhil Sekar
- Department of Psychological and Brain Sciences, Washington University in St. Louis, One Brookings Drive, St. Louis, MO 63130, USA
- David Steinberger
- Department of Psychological and Brain Sciences, Washington University in St. Louis, One Brookings Drive, St. Louis, MO 63130, USA
- Sishir Yarlagadda
- Department of Psychological and Brain Sciences, Washington University in St. Louis, One Brookings Drive, St. Louis, MO 63130, USA
- Susan B. Perlman
- Department of Psychiatry, Washington University in St. Louis, 4444 Forest Park Drive, St. Louis, MO 63110, USA
- Deanna M. Barch
- Department of Psychological and Brain Sciences, Washington University in St. Louis, One Brookings Drive, St. Louis, MO 63130, USA