1. Fang X, van Kleef GA, Kawakami K, Sauter DA. Registered report: "Categorical perception of facial expressions of anger and disgust across cultures". Cogn Emot 2024:1-17. PMID: 38973174. DOI: 10.1080/02699931.2024.2370667.
Abstract
Previous research has demonstrated that individuals from Western cultures exhibit categorical perception (CP) in their judgments of emotional faces. However, the extent to which this phenomenon characterises the judgments of facial expressions among East Asians remains relatively unexplored. Building upon recent findings showing that East Asians are more likely than Westerners to see a mixture of emotions in facial expressions of anger and disgust, the present research aimed to investigate whether East Asians also display CP for angry and disgusted faces. To address this question, participants from Canada and China were recruited to discriminate pairs of faces along the anger-disgust continuum. The results revealed the presence of CP in both cultural groups, as participants consistently exhibited higher accuracy and faster response latencies when discriminating between-category pairs of expressions compared to within-category pairs. Moreover, the magnitude of CP did not vary significantly across cultures. These findings provide novel evidence supporting the existence of CP for facial expressions in both East Asian and Western cultures, suggesting that CP is a perceptual phenomenon that transcends cultural boundaries. This research contributes to the growing literature on cross-cultural perceptions of facial expressions by deepening our understanding of how facial expressions are perceived categorically across cultures.
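The categorical-perception advantage reported here is, at its core, a comparison of discrimination accuracy and response latency for between- versus within-category morph pairs. A minimal sketch of that comparison for a single participant is shown below; it is illustrative only, and the function, trial values, and variable names are assumptions rather than the authors' analysis code.

```python
# Minimal sketch (illustrative): quantify a categorical-perception (CP) advantage
# as the gain in discrimination accuracy and speed for between-category pairs
# relative to within-category pairs along an anger-disgust morph continuum.
import numpy as np

def cp_advantage(accuracy, rt, pair_type):
    """accuracy: 0/1 per trial; rt: seconds per trial; pair_type: 'between'/'within'."""
    accuracy, rt, pair_type = map(np.asarray, (accuracy, rt, pair_type))
    between = pair_type == "between"
    within = pair_type == "within"
    acc_gain = accuracy[between].mean() - accuracy[within].mean()
    rt_gain = rt[within].mean() - rt[between].mean()   # positive = faster between-category
    return acc_gain, rt_gain

# Hypothetical trials for one participant
acc = [1, 1, 0, 1, 1, 0, 1, 0]
rt = [0.62, 0.58, 0.91, 0.60, 0.75, 0.88, 0.66, 0.95]
pairs = ["between"] * 4 + ["within"] * 4
print(cp_advantage(acc, rt, pairs))   # approximately (0.25, 0.13)
```

Computing these two gains per participant and per cultural group, and then comparing them across groups, is one way to ask whether the magnitude of CP differs between cultures.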
Affiliations
- Xia Fang: Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, People's Republic of China
- Gerben A van Kleef: Department of Social Psychology, University of Amsterdam, Amsterdam, the Netherlands
- Kerry Kawakami: Department of Social Psychology, York University, Toronto, Canada
- Disa A Sauter: Department of Social Psychology, University of Amsterdam, Amsterdam, the Netherlands
2. Yang Z, Wu Y, Liu S, Zhao L, Fan C, He W. Ensemble Coding of Crowd with Cross-Category Facial Expressions. Behav Sci (Basel) 2024; 14:508. PMID: 38920840. PMCID: PMC11201231. DOI: 10.3390/bs14060508.
Abstract
Ensemble coding allows observers to form an average to represent a set of elements. However, it is unclear whether observers can extract an average from a cross-category set. Previous investigations of this issue using low-level stimuli yielded contradictory results. The current study addressed the question by presenting high-level stimuli (i.e., a crowd of facial expressions) simultaneously (Experiment 1) or sequentially (Experiment 2) and asking participants to complete a member judgment task. The results showed that participants could extract average information from a group of cross-category facial expressions with a short perceptual distance. These findings demonstrate cross-category ensemble coding of high-level stimuli, contributing to the understanding of ensemble coding and providing inspiration for future research.
Affiliations
- Zhi Yang, Yifan Wu, Shuaicheng Liu, Lili Zhao, Cong Fan, Weiqi He: Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian 116029, China
3. Ikeda S. Interoceptive sensitivity and perception of others' emotions: an investigation based on a two-stage model. Cogn Process 2024; 25:229-239. PMID: 38383909. DOI: 10.1007/s10339-024-01176-2.
Abstract
Recent research shows that interoceptive sensitivity is associated with a more granular experience of emotions. These studies suggest that individuals who are sensitive to their interoceptive signals can better perceive somatic physiological changes than less sensitive individuals, and can therefore discriminate among a wider and subtler range of emotions. Further, the perception of others' emotions could be based on our own emotional experiences. However, whether interoceptive sensitivity is related to the perception of others' emotions remains unclear. Therefore, this study examined the relationship between interoceptive sensitivity and emotion perception. Based on a two-stage model in which emotion perception comprises categorization of facial expressions and approach-avoidance responses, this study examined both processes. The results showed no relationship between interoceptive sensitivity and the perception of emotion, suggesting that interoceptive sensitivity is related to the experience of emotion but does not affect the granularity of emotion perception. Future studies should diversely and empirically examine the role of the body in emotion perception from the perspective of interoceptive sensitivity.
Affiliations
- Shinnosuke Ikeda: Human and Social Administration Department, Kanazawa University, Kakuma-machi, Kanazawa, Ishikawa Prefecture, 920-1192, Japan
4. Pounder Z, Eardley AF, Loveday C, Evans S. No clear evidence of a difference between individuals who self-report an absence of auditory imagery and typical imagers on auditory imagery tasks. PLoS One 2024; 19:e0300219. PMID: 38568916. PMCID: PMC10990234. DOI: 10.1371/journal.pone.0300219.
Abstract
Aphantasia is characterised by the inability to create mental images in one's mind. Studies investigating impairments in imagery typically focus on the visual domain. However, it is possible to generate many different forms of imagery, including imagined auditory, kinesthetic, tactile, motor, taste and other experiences. Recent studies show that individuals with aphantasia report a lack of imagery in modalities other than vision, including audition. However, to date, no research has examined whether these reductions in self-reported auditory imagery are associated with decrements on tasks that require auditory imagery. Understanding the extent to which visual and auditory imagery deficits co-occur can help to better characterise the core deficits of aphantasia and provide an alternative perspective on theoretical debates about the extent to which imagery draws on modality-specific or modality-general processes. In the current study, individuals who self-identified as aphantasic and matched control participants with typical imagery performed two tasks: a musical pitch-based imagery task and a voice-based categorisation task. The majority of participants with aphantasia self-reported significant deficits in both auditory and visual imagery. However, we did not find a concomitant decrease in performance on tasks requiring auditory imagery, either in the full sample or when considering only those participants who reported significant deficits in both domains. These findings are discussed in relation to the mechanisms that might obscure observation of imagery deficits in auditory imagery tasks in people who report reduced auditory imagery.
Affiliations
- Zoë Pounder: Department of Psychology, School of Social Sciences, University of Westminster, London, United Kingdom; Department of Experimental Psychology, University of Oxford, Oxford, United Kingdom
- Alison F. Eardley: Department of Psychology, School of Social Sciences, University of Westminster, London, United Kingdom
- Catherine Loveday: Department of Psychology, School of Social Sciences, University of Westminster, London, United Kingdom
- Samuel Evans: Department of Psychology, School of Social Sciences, University of Westminster, London, United Kingdom; Neuroimaging, King's College London, London, United Kingdom
5. Xu Y, Cao C. Interpersonal Stress and Late Adolescent Depressive Symptoms: Moderation by Perceptual Sensitivity to Facial Expression of Anger. J Youth Adolesc 2023; 52:2592-2605. PMID: 37642781. DOI: 10.1007/s10964-023-01849-9.
Abstract
The interpersonal theories of depression highlight the role of interpersonal stress and individuals' sensitivity to social rejection in the development of depression. However, previous research has tested their respective effects separately, and whether these two factors interact to affect depression, and if so whether the pattern reflects differential susceptibility or diathesis-stress, remains unknown. Using a morphed facial expression recognition paradigm, the current study investigated the potential moderating role of perceptual sensitivity to facial expressions, especially to angry expressions, which signal social rejection, in the association between interpersonal stress and adolescent depressive symptoms. A total of 186 Chinese late adolescents (Mage = 21.16 ± 1.81 years; 73.7% female) participated in this study. The results demonstrated that perceptual sensitivity to angry faces, but not to sad or happy faces, functioned as a plasticity factor that significantly moderated the effect of interpersonal stress on depressive symptoms, consistent with the differential-susceptibility hypothesis rather than the diathesis-stress hypothesis. No interactions were observed for non-interpersonal dimensions of stress. These results were robust and survived a series of sensitivity analyses, including a k-fold cross-validation test. The current findings highlight the crucial role of perceptual sensitivity to angry expressions in explaining individual differences underlying the link between interpersonal stress and adolescent depressive symptoms.
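Statistically, the moderation effect reported here is an interaction term: depressive symptoms regressed on stress, sensitivity, and their product. A minimal sketch of that test is given below; it is illustrative only, and the simulated data, coefficients, and use of statsmodels are assumptions, not the authors' analysis.

```python
# Minimal sketch of a moderation test (illustrative; not the authors' analysis):
# depressive symptoms regressed on interpersonal stress, perceptual sensitivity
# to angry faces, and their interaction. A significant interaction term is the
# usual statistical signature of moderation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 186  # sample size reported in the abstract
df = pd.DataFrame({
    "stress": rng.normal(size=n),
    "sensitivity": rng.normal(size=n),
})
# Simulated outcome with a built-in stress x sensitivity interaction
df["depression"] = (0.3 * df.stress + 0.1 * df.sensitivity
                    + 0.25 * df.stress * df.sensitivity
                    + rng.normal(scale=1.0, size=n))

model = smf.ols("depression ~ stress * sensitivity", data=df).fit()
print(model.summary().tables[1])  # inspect the stress:sensitivity coefficient
```

Probing the interaction at high and low levels of sensitivity (simple slopes) is a typical next step when distinguishing differential susceptibility from diathesis-stress patterns.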
Affiliations
- Yajing Xu: School of Nursing and Rehabilitation, Shandong University, Jinan, China
- Cong Cao: School of Nursing and Rehabilitation, Shandong University, Jinan, China
6. Zhang M, Zhang H, Tang E, Ding H, Zhang Y. Evaluating the Relative Perceptual Salience of Linguistic and Emotional Prosody in Quiet and Noisy Contexts. Behav Sci (Basel) 2023; 13:800. PMID: 37887450. PMCID: PMC10603920. DOI: 10.3390/bs13100800.
Abstract
How people recognize linguistic and emotional prosody in different listening conditions is essential for understanding the complex interplay between social context, cognition, and communication. The perception of both lexical tones and emotional prosody depends on prosodic features including pitch, intensity, duration, and voice quality. However, it is unclear which aspect of prosody is perceptually more salient and resistant to noise. This study aimed to investigate the relative perceptual salience of emotional prosody and lexical tone recognition in quiet and in the presence of multi-talker babble noise. Forty young adults, randomly sampled from a pool of native Mandarin Chinese speakers with normal hearing, listened to monosyllables either with or without background babble noise and completed two identification tasks, one for emotion recognition and the other for lexical tone recognition. Accuracy and speed were recorded and analyzed using generalized linear mixed-effects models. Compared with emotional prosody, lexical tones were more perceptually salient in multi-talker babble noise: native Mandarin Chinese participants identified lexical tones more accurately and quickly than vocal emotions at the same signal-to-noise ratio. Acoustic and cognitive dissimilarities between linguistic and emotional prosody may underlie this phenomenon, which calls for further exploration of the psychobiological and neurophysiological mechanisms involved.
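The noise manipulation described here hinges on presenting speech and babble at a controlled signal-to-noise ratio (SNR). A minimal sketch of how a token can be mixed with noise at a target SNR is shown below; it is illustrative only, the signals, sampling rate, and 0 dB target are invented, and this is not the authors' stimulus code.

```python
# Minimal sketch (illustrative): mix a clean speech signal with babble noise
# at a target signal-to-noise ratio, SNR_dB = 10*log10(P_signal / P_noise).
import numpy as np

def mix_at_snr(speech, babble, snr_db):
    speech, babble = np.asarray(speech, float), np.asarray(babble, float)
    babble = babble[: len(speech)]                  # trim noise to signal length
    p_speech = np.mean(speech ** 2)
    p_babble = np.mean(babble ** 2)
    # Scale the noise so that 10*log10(p_speech / p_noise_scaled) == snr_db
    scale = np.sqrt(p_speech / (p_babble * 10 ** (snr_db / 10)))
    return speech + scale * babble

rng = np.random.default_rng(1)
speech = np.sin(2 * np.pi * 220 * np.arange(16000) / 16000)  # stand-in "syllable"
babble = rng.normal(size=32000)                              # stand-in babble noise
mixed = mix_at_snr(speech, babble, snr_db=0.0)               # 0 dB SNR mixture
```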
Affiliations
- Minyue Zhang, Hui Zhang, Enze Tang, Hongwei Ding: Speech-Language-Hearing Center, School of Foreign Languages, Shanghai Jiao Tong University, Shanghai 200240, China
- Yang Zhang: Department of Speech-Language-Hearing Sciences and Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis, MN 55455, USA
7. Ventura-Bort C, Panza D, Weymar M. Words matter when inferring emotions: a conceptual replication and extension. Cogn Emot 2023:1-15. PMID: 36856025. DOI: 10.1080/02699931.2023.2183491.
Abstract
It has long been known that facial configurations play a critical role when inferring mental and emotional states from others. Nevertheless, there is still scientific debate on how we infer emotions from facial configurations. The theory of constructed emotion (TCE) suggests that we may infer different emotions from the same facial configuration depending on the context (e.g. provided by visual and lexical cues) in which it is perceived. For instance, a recent study found that participants were more accurate in inferring mental and emotional states across three different datasets (i.e. RMET, static and dynamic emojis) when words were provided (i.e. a forced-choice task) than when they were not (i.e. a free-labelling task), suggesting that words serve as contexts that modulate inference from facial configurations. The goal of the current within-subject study was to replicate and extend these findings by adding a fourth dataset (KDEF-dyn) consisting of morphed human faces, to increase ecological validity. Replicating previous findings, we observed that words increased accuracy across the three previously used datasets, an effect that was also observed for the morphed facial stimuli. Our findings are in line with the TCE, providing support for the importance of contextual verbal cues in emotion perception.
Affiliations
- C Ventura-Bort: Department of Biological Psychology and Affective Science, Faculty of Human Sciences, University of Potsdam, Potsdam, Germany
- D Panza: Department of Biological Psychology and Affective Science, Faculty of Human Sciences, University of Potsdam, Potsdam, Germany
- M Weymar: Department of Biological Psychology and Affective Science, Faculty of Human Sciences, University of Potsdam, Potsdam, Germany; Research Focus Cognitive Sciences, University of Potsdam, Potsdam, Germany; Faculty of Health Sciences Brandenburg, University of Potsdam, Potsdam, Germany
8. Jacques C, Caharel S. The time course of categorical perception of facial expressions. Neuropsychologia 2022; 177:108424. PMID: 36400243. DOI: 10.1016/j.neuropsychologia.2022.108424.
Abstract
Decoding emotions on others' faces is one of the most important functions of the human brain and has been widely studied in cognitive neuroscience. However, the precise time course of facial expression categorization in the human brain is still a matter of debate. Here we used an original paradigm to measure categorical perception of facial expression changes during event-related potential (ERP) recording, in which a face stimulus dynamically switched either to a different expression (between-category condition) or to the same expression (within-category condition), the physical distance between the two successive faces being equal across conditions. The switch between faces generated a negative differential potential peaking at around 160 ms over occipito-temporal regions, similar in terms of latency and topography to the well-known face-selective N170 component. This response was larger when the switch occurred between faces perceived as having different facial expressions than between faces perceived as having the same expression. In addition, happy expressions were categorized around 20 ms faster than fearful expressions (135 and 156 ms, respectively). These findings provide evidence that changes of facial expressions are categorically perceived as early as 160 ms following stimulus onset over the occipito-temporal cortex.
Affiliations
- Corentin Jacques: Université Catholique de Louvain, Psychological Science Research Institute (IPSY), Louvain-La-Neuve, Belgium
9. Zupan B, Eskritt M. Validation of Affective Sentences: Extending Beyond Basic Emotion Categories. J Psycholinguist Res 2022; 51:1409-1429. PMID: 35953648. PMCID: PMC9646620. DOI: 10.1007/s10936-022-09906-3.
Abstract
We use nonverbal and verbal emotion cues to determine how others are feeling. Most studies in vocal emotion perception do not consider the influence of verbal content, using sentences with nonsense words or words that carry no emotional meaning. These online studies aimed to validate 95 sentences with verbal content intended to convey 10 emotions. Participants were asked to select the emotion that best described the emotional meaning of the sentence. Study 1 included 436 participants and Study 2 included 193. The Simpson diversity index was applied as a measure of dispersion of responses. Across the two studies, 38 sentences were labelled as representing 10 emotion categories with a low degree of diversity in participant responses. Expanding current databases beyond basic emotion categories is important for researchers exploring the interaction between tone of voice and verbal content, and/or people's capacity to make subtle distinctions between their own and others' emotions.
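The dispersion measure named here, the Simpson diversity index, has a compact closed form. A minimal sketch of one common formulation, applied to invented label counts for a single sentence, is given below; it is illustrative only and not the authors' scoring code.

```python
# Minimal sketch (illustrative): one common form of Simpson's diversity index,
# D = 1 - sum(p_i^2), over the distribution of emotion labels chosen for one
# sentence. D = 0 when every rater chose the same label (no diversity).
from collections import Counter

def simpson_diversity(labels):
    counts = Counter(labels)
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

responses = ["anger"] * 40 + ["disgust"] * 6 + ["fear"] * 4   # hypothetical ratings
print(round(simpson_diversity(responses), 3))                 # -> 0.339
```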
Affiliations
- Barbra Zupan: Speech Pathology, College of Health Science, School of Health, Medical and Applied Sciences, Central Queensland University, Bruce Highway, North Rockhampton, QLD 4702, Australia
10. Zhang Y, Zhou W, Huang J, Hong B, Wang X. Neural correlates of perceived emotions in human insula and amygdala for auditory emotion recognition. Neuroimage 2022; 260:119502. PMID: 35878727. DOI: 10.1016/j.neuroimage.2022.119502.
Abstract
The emotional status of a speaker is an important non-linguistic cue carried by the human voice and can be perceived by a listener in vocal communication. Understanding the neural circuits involved in processing emotions carried by the human voice is crucial for understanding the neural basis of social interaction. Previous studies have shown that the human insula and amygdala respond more selectively to emotional sounds than to non-emotional sounds. However, it is not clear whether the neural selectivity to emotional sounds in these brain structures is determined by the emotion presented by a speaker, which is tied to the acoustic properties of the sounds, or by the emotion perceived by a listener. In this study, we recorded intracranial electroencephalography (iEEG) responses to emotional human voices while subjects performed emotion recognition tasks. We found that the iEEG responses of Heschl's gyrus (HG) and the posterior insula were determined by the presented emotion, whereas the iEEG responses of the anterior insula and amygdala were driven by the perceived emotion. These results suggest that the anterior insula and amygdala play a crucial role in the conscious perception of emotions carried by the human voice.
Affiliations
- Yang Zhang: Tsinghua Laboratory of Brain and Intelligence (THBI) and Department of Biomedical Engineering, Tsinghua University, Beijing 100084, PR China; Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD 21205, United States
- Wenjing Zhou: Department of Epilepsy Center, Tsinghua University Yuquan Hospital, Beijing 100040, PR China
- Juan Huang: Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD 21205, United States
- Bo Hong: Tsinghua Laboratory of Brain and Intelligence (THBI) and Department of Biomedical Engineering, Tsinghua University, Beijing 100084, PR China
- Xiaoqin Wang: Tsinghua Laboratory of Brain and Intelligence (THBI) and Department of Biomedical Engineering, Tsinghua University, Beijing 100084, PR China; Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD 21205, United States
11. Ikeda S. Approach-avoidance responses and categorical perception of ambiguous facial expressions. Int J Psychol 2021; 57:227-239. PMID: 34405403. DOI: 10.1002/ijop.12803.
Abstract
Emotion perception of facial expressions involves two processes: quick approach-avoidance responses and subsequent sorting into emotional categories (e.g., happiness, anger), taking the context into account. Sorting of morphed ambiguous facial expressions is known to occur categorically, but the occurrence of approach-avoidance responses to morphed facial expressions has yet to be investigated. The present study used morphed angry and fearful facial expressions and measured approach-avoidance responses among Japanese university students (Experiment 1, n = 29). Similar experiments with linguistic load (Experiment 2, n = 28) and visual load (Experiment 3, n = 29) were conducted. The results indicated categorical perception in the sorting of facial expressions but no approach-avoidance response to morphed expressions. Furthermore, linguistic load affected the categorisation of facial expressions, but neither linguistic load nor visual load affected the approach-avoidance response. These results support the idea that the non-linguistic approach-avoidance response and the linguistic categorisation of facial expressions are two different processes. The nature of the emotion perception process is also discussed.
Affiliations
- Shinnosuke Ikeda: Faculty of Humanities, Kyoto University of Advanced Science, Kyoto, Japan
12. Baus C, Ruiz-Tada E, Escera C, Costa A. Early detection of language categories in face perception. Sci Rep 2021; 11:9715. PMID: 33958663. PMCID: PMC8102523. DOI: 10.1038/s41598-021-89007-8.
Abstract
Does language categorization influence face identification? The present study addressed this question by means of two experiments. First, to establish language categorization of faces, the memory confusion paradigm was used to create two language categories of faces, Spanish and English. Subsequently, participants underwent an oddball paradigm in which faces that had previously been paired with one of the two languages (Spanish or English) were presented. We measured EEG perceptual differences (vMMN) between standard faces and two types of deviant faces: within-language-category deviants (faces sharing a language with the standards) and between-language-category deviants (faces paired with the other language). Participants were more likely to confuse faces within a language category than between categories, indicating that faces were categorized by language. At the neural level, an early vMMN was obtained for between-language-category faces but not for within-language-category faces. At a later stage, however, larger vMMNs were obtained for faces from the same language category. Our results show that language is a relevant social cue that individuals use to categorize others, and that this categorization subsequently affects face perception.
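As a concrete illustration of the dependent measure used here, a visual mismatch negativity is commonly computed as a deviant-minus-standard difference of trial-averaged ERPs, summarised over an a-priori time window. The sketch below is illustrative only; the random arrays, sampling rate, and 140-200 ms window stand in for real EEG epochs and are not the authors' pipeline.

```python
# Minimal sketch (illustrative): estimate a visual mismatch negativity (vMMN)
# as the deviant-minus-standard difference of trial-averaged ERPs at one
# electrode, then summarise it over an a-priori time window.
import numpy as np

fs = 500                                   # sampling rate (Hz)
times = np.arange(-0.1, 0.5, 1 / fs)       # epoch from -100 to 500 ms

rng = np.random.default_rng(2)
standard_epochs = rng.normal(size=(200, times.size))   # trials x samples (stand-in data)
deviant_epochs = rng.normal(size=(40, times.size))

erp_standard = standard_epochs.mean(axis=0)
erp_deviant = deviant_epochs.mean(axis=0)
vmmn = erp_deviant - erp_standard          # difference wave

# Mean amplitude in an illustrative window, e.g. 140-200 ms after stimulus onset
win = (times >= 0.140) & (times <= 0.200)
print(vmmn[win].mean())
```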
Affiliations
- Cristina Baus: Department of Cognition, Development and Educational Psychology, University of Barcelona, 08035, Barcelona, Spain; Center for Brain and Cognition (CBC), Pompeu Fabra University, Barcelona, Spain
- Carles Escera: Brainlab-Cognitive Neuroscience Research Group, Department of Clinical Psychology and Psychobiology, University of Barcelona, Barcelona, Spain; Institute of Neurosciences, University of Barcelona, Barcelona, Spain; Institut de Recerca Sant Joan de Déu, Esplugues de Llobregat, Barcelona, Spain
- Albert Costa: Center for Brain and Cognition (CBC), Pompeu Fabra University, Barcelona, Spain
13. Dricu M, Frühholz S. A neurocognitive model of perceptual decision-making on emotional signals. Hum Brain Mapp 2020; 41:1532-1556. PMID: 31868310. PMCID: PMC7267943. DOI: 10.1002/hbm.24893.
Abstract
Humans make various kinds of decisions about which emotions they perceive from others. Although it might seem like a split-second phenomenon, deliberating over which emotions we perceive unfolds across several stages of decisional processing. Neurocognitive models of general perception postulate that our brain first extracts sensory information about the world, then integrates these data into a percept, and lastly interprets it. The aim of the present study was to build an evidence-based neurocognitive model of perceptual decision-making on others' emotions. We conducted a series of meta-analyses of neuroimaging data spanning 30 years on the explicit evaluation of others' emotional expressions. We find that emotion perception is rather an umbrella term for various perception paradigms, each with distinct neural structures that underlie task-related cognitive demands. Furthermore, the left amygdala was responsive across all classes of decisional paradigms, regardless of task-related demands. Based on these observations, we propose a neurocognitive model that outlines the information flow in the brain needed for a successful evaluation of, and decisions on, other individuals' emotions.
Highlights:
- Emotion classification involves heterogeneous perception and decision-making tasks.
- Decision-making processes on emotions are rarely covered by existing emotion theories.
- We propose an evidence-based neurocognitive model of decision-making on emotions.
- Bilateral brain processes for nonverbal decisions, left-hemisphere processes for verbal decisions.
- The left amygdala is involved in any kind of decision on emotions.
Affiliations
- Mihai Dricu: Department of Psychology, University of Bern, Bern, Switzerland
- Sascha Frühholz: Cognitive and Affective Neuroscience Unit, Department of Psychology, University of Zurich, Zurich, Switzerland; Neuroscience Center Zurich (ZNZ), University of Zurich and ETH Zurich, Zurich, Switzerland; Center for Integrative Human Physiology (ZIHP), University of Zurich, Zurich, Switzerland
14. Ruba AL, Meltzoff AN, Repacholi BM. Superordinate categorization of negative facial expressions in infancy: The influence of labels. Dev Psychol 2020; 56:671-685. PMID: 31999185. PMCID: PMC7060120. DOI: 10.1037/dev0000892.
Abstract
Accurate perception of emotional (facial) expressions is an essential social skill. It is currently debated whether emotion categorization in infancy emerges in a "broad-to-narrow" pattern and to what degree language influences this process. We used a habituation paradigm to explore (a) whether 14- and 18-month-old infants perceive different facial expressions (anger, sadness, disgust) as belonging to a superordinate category of negative valence and (b) how verbal labels influence emotion category formation. Results indicated that infants did not spontaneously form a superordinate category of negative valence (Experiments 1 and 3). However, when a novel label ("toma") was added to each event during habituation trials (Experiments 2 and 4), infants formed this superordinate valence category when habituated to disgust and sad expressions (but not when habituated to anger and sadness). These labeling effects were obtained with two stimulus sets (Radboud Face Database and NimStim), even when controlling for the presence of teeth in the expressions. The results indicate that infants, at 14 and 18 months of age, show limited superordinate categorization based on the valence of different negative facial expressions. Specifically, infants only formed this abstract emotion category when labels were provided, and the labeling effect depended on which emotions were presented during habituation. These findings have important implications for developmental theories of emotion.
15. Fugate JMB, MacDonald C, O'Hare AJ. Emotion Words' Effect on Visual Awareness and Attention of Emotional Faces. Front Psychol 2020; 10:2896. PMID: 32010012. PMCID: PMC6974626. DOI: 10.3389/fpsyg.2019.02896.
Abstract
To explore whether the meaning of a word changes visual processing of emotional faces (i.e., visual awareness and visual attention), we performed two complementary studies. In Experiment 1, we presented participants with emotion and control words and then tracked their visual awareness for two competing emotional faces using a binocular rivalry paradigm. Participants experienced the emotional face congruent with the emotion word for longer than a word-incongruent emotional face, as would be expected if the word was biasing awareness toward the (unseen) face. In Experiment 2, we similarly presented participants with emotion and control words prior to presenting emotional faces using a divided visual field paradigm. Emotion words were congruent with either the emotional face in the right or left visual field. After the presentation of faces, participants saw a dot in either the left or right visual field. Participants were slower to identify the location of the dot when it appeared in the same visual field as the emotional face congruent with the emotion word. The effect was limited to the left hemisphere (RVF), as would be expected for linguistic integration of the word with the face. Since the task was not linguistic, but rather a simple dot-probe task, participants were slower in their responses under these conditions because they likely had to disengage from the additional linguistic processing caused by the word-face integration. These findings indicate that emotion words bias visual awareness for congruent emotional faces, as well as shift attention toward congruent emotional faces.
Affiliations
- Jennifer M B Fugate: Department of Psychology, University of Massachusetts Dartmouth, Dartmouth, MA, United States
- Cameron MacDonald: Department of Psychology, University of Massachusetts Dartmouth, Dartmouth, MA, United States
- Aminda J O'Hare: Department of Psychology, Weber State University, Ogden, UT, United States
16. Satpute AB, Lindquist KA. The Default Mode Network's Role in Discrete Emotion. Trends Cogn Sci 2019; 23:851-864. PMID: 31427147. PMCID: PMC7281778. DOI: 10.1016/j.tics.2019.07.003.
Abstract
Emotions are often assumed to manifest in subcortical limbic and brainstem structures. While these areas are clearly important for representing affect (e.g., valence and arousal), we propose that the default mode network (DMN) is additionally important for constructing discrete emotional experiences (of anger, fear, disgust, etc.). Findings from neuroimaging studies, invasive electrical stimulation studies, and lesion studies support this proposal. Importantly, our framework builds on a constructionist theory of emotion to explain how instances involving diverse physiological and behavioral patterns can be conceptualized as belonging to the same emotion category. We argue that this ability requires abstraction (from concrete features to broad mental categories), which the DMN is well positioned to support, and we make novel predictions from our proposed framework.
Affiliations
- Ajay B Satpute: Department of Psychology, Northeastern University, Boston, MA, USA
- Kristen A Lindquist: Department of Psychology and Neuroscience, University of North Carolina, Chapel Hill, NC, USA
17. Ruba AL, Repacholi BM. Do Preverbal Infants Understand Discrete Facial Expressions of Emotion? Emot Rev 2019. DOI: 10.1177/1754073919871098.
Abstract
An ongoing debate in affective science concerns whether certain discrete, “basic” emotions have evolutionarily based signals (facial expressions) that are easily, universally, and (perhaps) innately identified. Studies with preverbal infants (younger than 24 months) have the potential to shed light on this debate. This review summarizes what is known about preverbal infants’ understanding of discrete emotional facial expressions. Overall, while many studies suggest that preverbal infants differentiate positive and negative facial expressions, few studies have tested whether infants understand discrete emotions (e.g., anger vs. disgust). Moreover, results vary greatly based on methodological factors. This review also (a) discusses how language may influence the development of emotion understanding, and (b) proposes a new developmental hypothesis for infants’ discrete emotion understanding.
18. Fugate JMB, Franco CL. What Color Is Your Anger? Assessing Color-Emotion Pairings in English Speakers. Front Psychol 2019; 10:206. PMID: 30863330. PMCID: PMC6399154. DOI: 10.3389/fpsyg.2019.00206.
Abstract
Do English speakers think about anger as "red" and sadness as "blue"? Some theories of emotion suggest that colors, like other biologically derived signals, should be reliably paired with an emotion, and that colors should differentiate across emotions. We assessed consistency and specificity of color-emotion pairings among English-speaking adults. In Study 1, participants (n = 73) completed an online survey in which they could select up to three colors from 23 colored swatches (varying in hue, saturation, and lightness) for each of ten emotion words. In Study 2, different participants (n = 52) completed a similar online survey, except that we added additional emotions and colors (which better sampled color space). Participants in both studies indicated the strength of the relationship between a selected color and the emotion. In Study 1, four of the ten emotions showed consistency, and about one-third of the colors showed specificity, yet agreement among raters was low to moderate even in these cases. When we resampled our data, however, none of these effects were likely to replicate with statistical confidence. In Study 2, only two of 20 emotions showed consistency, and three colors showed specificity. As in the first study, no color-emotion pairings were both specific and consistent. In addition, in Study 2 we found that saturation and lightness, and to a lesser extent hue, predicted color-emotion agreement rather than perceived color. The results suggest that the emotion-color pairings reported in previous studies are likely best thought of as experiment-specific. The results are discussed with respect to constructionist theories of emotion.
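The resampling check mentioned here can be pictured as a bootstrap over raters' color choices for a single emotion word. The sketch below is illustrative only: the choice counts, the 0.50 agreement criterion, and the variable names are invented and are not the authors' procedure.

```python
# Minimal sketch (illustrative): bootstrap the modal-color agreement for one
# emotion word to see how often the observed consistency would replicate.
# The data, the agreement threshold, and the color labels are invented.
import numpy as np
from collections import Counter

def modal_agreement(choices):
    counts = Counter(choices)
    return counts.most_common(1)[0][1] / len(choices)

rng = np.random.default_rng(3)
anger_choices = ["red"] * 40 + ["black"] * 20 + ["orange"] * 13   # 73 hypothetical raters
observed = modal_agreement(anger_choices)

boot = [
    modal_agreement(rng.choice(anger_choices, size=len(anger_choices), replace=True))
    for _ in range(2000)
]
threshold = 0.50   # arbitrary "consistency" criterion for illustration
print(observed, np.mean(np.array(boot) >= threshold))
```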
Affiliations
- Courtny L Franco: Psychology, University of Massachusetts Dartmouth, Dartmouth, MA, United States
20. Concurrent emotional response and semantic unification: An event-related potential study. Cogn Affect Behav Neurosci 2018; 19:154-164. PMID: 30357658. DOI: 10.3758/s13415-018-00652-5.
Abstract
Using event-related potentials, in this study we examined how implied emotion is derived from sentences. In the same sentential context, different emotionally neutral words rendered the whole sentence emotionally neutral and semantically congruent, emotionally negative and semantically congruent, or emotionally neutral and semantically incongruent. Relative to the words in the neutral-congruent condition, the words in the neutral-incongruent condition elicited a larger N400, indicating increased semantic processing, whereas the words in the negative-congruent condition elicited a long-lasting positivity between 300 and 1,000 ms, indicating an emotional response. The overlapping time windows of semantic processing and the emotional response suggest that the construction of emotional meaning operates concurrently with semantic unification. The results indicate that the implied emotional processing of sentences may be a result of unification operations but does not necessarily involve causal appraisal of a sentence's mental representation.
21. Wang Y, Zhu Z, Chen B, Fang F. Perceptual learning and recognition confusion reveal the underlying relationships among the six basic emotions. Cogn Emot 2018; 33:754-767. PMID: 29962270. DOI: 10.1080/02699931.2018.1491831.
Abstract
The six basic emotions (disgust, anger, fear, happiness, sadness, and surprise) have long been considered discrete categories that serve as the primary units of the emotion system. Yet recent evidence indicated underlying connections among them. Here we tested the underlying relationships among the six basic emotions using a perceptual learning procedure. This technique has the potential of causally changing participants' emotion detection ability. We found that training on detecting a facial expression improved the performance not only on the trained expression but also on other expressions. Such a transfer effect was consistently demonstrated between disgust and anger detection as well as between fear and surprise detection in two experiments (Experiment 1A, n = 70; Experiment 1B, n = 42). Notably, training on any of the six emotions could improve happiness detection, while sadness detection could only be improved by training on sadness itself, suggesting the uniqueness of happiness and sadness. In an emotion recognition test using a large sample of Chinese participants (n = 1748), the confusion between disgust and anger as well as between fear and surprise was further confirmed. Taken together, our study demonstrates that the "basic" emotions share some common psychological components, which might be the more basic units of the emotion system.
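The recognition-confusion result described here is usually read off a confusion matrix tallied from presented versus responded emotion labels. A minimal sketch is shown below; it is illustrative only, and the trial list is invented and far smaller than the study's sample.

```python
# Minimal sketch (illustrative): tally an emotion-recognition confusion matrix
# from (presented, responded) label pairs. Off-diagonal cells such as
# disgust->anger index how confusable two emotions are.
import numpy as np

emotions = ["disgust", "anger", "fear", "happiness", "sadness", "surprise"]
index = {e: i for i, e in enumerate(emotions)}

trials = [
    ("disgust", "disgust"), ("disgust", "anger"), ("anger", "anger"),
    ("fear", "surprise"), ("surprise", "surprise"), ("happiness", "happiness"),
    ("sadness", "sadness"), ("anger", "disgust"), ("fear", "fear"),
]

confusion = np.zeros((len(emotions), len(emotions)), dtype=int)
for presented, responded in trials:
    confusion[index[presented], index[responded]] += 1

# Row-normalised response rates per presented emotion
rates = confusion / confusion.sum(axis=1, keepdims=True)
print(rates.round(2))
```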
Affiliations
- Yingying Wang, Zijian Zhu, Biqing Chen: Academy for Advanced Interdisciplinary Studies, Peking-Tsinghua Center for Life Sciences, Peking University, Beijing, People's Republic of China; IDG/McGovern Institute for Brain Research, Peking University, Beijing, People's Republic of China
- Fang Fang: Academy for Advanced Interdisciplinary Studies, Peking-Tsinghua Center for Life Sciences, Peking University, Beijing, People's Republic of China; IDG/McGovern Institute for Brain Research, Peking University, Beijing, People's Republic of China; School of Psychological and Cognitive Sciences, Peking University, Beijing, People's Republic of China; Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, People's Republic of China; Key Laboratory of Machine Perception (Ministry of Education), Peking University, Beijing, People's Republic of China
22. Qiu F, Han M, Zhai Y, Jia S. Categorical perception of facial expressions in individuals with non-clinical social anxiety. J Behav Ther Exp Psychiatry 2018; 58:78-85. PMID: 28910609. DOI: 10.1016/j.jbtep.2017.09.001.
Abstract
Background and objectives: According to the well-established categorical perception (CP) of facial expressions, we decode complicated expression signals into simplified categories to facilitate expression processing. Expression processing deficits have been widely described in social anxiety (SA), but it remains to be investigated whether CP of expressions is affected by SA. The present study examined whether individuals with SA show an interpretation bias when processing ambiguous expressions and whether the sensitivity of their CP is affected by their SA.
Methods: Sixty-four participants (high SA, 30; low SA, 34) were selected from 658 undergraduates using the Interaction Anxiousness Scale (IAS). With the CP paradigm, specifically with the analysis method of the logistic function model, we derived the categorical boundaries (reflecting interpretation bias) and slopes (reflecting sensitivity of CP) of both the high- and low-SA groups while they categorized angry-fearful, happy-angry, and happy-fearful expression continua.
Results: The categorical boundaries of the two groups did not differ for any of the three continua, meaning that SA did not affect interpretation bias. The slopes for the high-SA group were flatter than those for the low-SA group for both the angry-fearful and happy-angry continua, indicating that the high-SA group is insensitive to the subtle changes that occur from angry to fearful faces and from happy to angry faces.
Limitations: Since participants were selected from a sample of undergraduates based on their IAS scores, the results cannot be directly generalized to individuals with clinical SA disorder.
Conclusions: The study indicates that SA does not affect interpretation biases in the processing of anger, fear, and happiness, but does modulate the sensitivity of individuals' CP when anger is involved. High-SA individuals perceive angry expressions in a less categorical manner than low-SA individuals, but no such difference was found in the perception of happy or fearful expressions.
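The boundary and slope parameters referred to above come from fitting a logistic (sigmoid) identification function to responses along each morph continuum. A minimal sketch of such a fit is given below; it is illustrative only, the morph levels, response proportions, and starting values are invented, and this is not the authors' analysis script.

```python
# Minimal sketch (illustrative): fit a logistic function to the proportion of
# "fear" responses along an anger-to-fear morph continuum; the fitted midpoint
# is the categorical boundary and the fitted slope indexes CP sensitivity.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, boundary, slope):
    return 1.0 / (1.0 + np.exp(-slope * (x - boundary)))

morph_level = np.linspace(0, 1, 9)                       # 0 = anger, 1 = fear
p_fear = np.array([0.02, 0.05, 0.08, 0.20, 0.55, 0.85, 0.93, 0.97, 0.99])

(boundary, slope), _ = curve_fit(logistic, morph_level, p_fear, p0=[0.5, 10.0])
print(f"boundary = {boundary:.2f}, slope = {slope:.1f}")  # flatter slope = less categorical
```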
Affiliations
- Fanghui Qiu: School of Psychology, Shandong Normal University, Jinan, PR China; School of Kinesiology, Shanghai University of Sport, Shanghai, PR China
- Mingxiu Han: School of Psychology, Shandong Normal University, Jinan, PR China
- Yu Zhai: School of Psychology, Shandong Normal University, Jinan, PR China
- Shiwei Jia: School of Psychology, Shandong Normal University, Jinan, PR China
23. Newly learned categories induce pre-attentive categorical perception of faces. Sci Rep 2017; 7:14006. PMID: 29070897. PMCID: PMC5656585. DOI: 10.1038/s41598-017-14104-6.
Abstract
Face perception is modulated by categorical information in faces, a phenomenon known as categorical perception (CP) of faces. However, it remains unknown whether CP of faces is an inborn human capability or the result of acquired categories. Here, we examined whether and when newly learned categories affect face perception. A short-term training method was employed in which participants learned new categories of face stimuli. Behaviorally, using an AB-X discrimination task, we found that discrimination accuracy for face pairs from different learned categories was significantly higher than for face pairs from the same category. Neurally, using a visual oddball task, we found that deviant stimuli whose category differed from that of the standard stimuli evoked a larger N170. Importantly, the visual mismatch negativity (vMMN), starting from 140 ms after stimulus onset, was stronger for between-category deviants than for within-category deviants under the unattended condition. Altogether, our study provides empirical evidence that CP of faces can be induced by newly learned categories and that this effect occurs automatically during an early stage of processing.
24. Wegrzyn M, Westphal S, Kissler J. In your face: the biased judgement of fear-anger expressions in violent offenders. BMC Psychol 2017; 5:16. PMID: 28499409. PMCID: PMC5429544. DOI: 10.1186/s40359-017-0186-z.
Abstract
Background: Why is it that certain violent criminals repeatedly find themselves engaged in brawls? Many inmates report having felt provoked or threatened by their victims, which might be due to a tendency to ascribe malicious intentions when faced with ambiguous social signals, termed the hostile attribution bias.
Methods: The present study presented morphed fear-anger faces to prison inmates with a history of violent crimes, inmates with a history of child sexual abuse, and matched controls from the general population. Participants performed a fear-anger decision task. Analyses compared both response frequencies and measures derived from psychometric functions fitted to the data. In addition, a test of the ability to distinguish basic facial expressions and questionnaires for aggression, psychopathy and personality disorders were administered.
Results: Violent offenders present with a reliable hostile attribution bias, in that they rate ambiguous fear-anger expressions as more angry compared with both the control population and perpetrators of child sexual abuse. Psychometric functions show a lowered threshold for detecting anger in violent offenders compared with the general population. This effect is especially pronounced for male faces, correlates with self-reported aggression, and occurs in the absence of a general emotion recognition impairment.
Conclusions: The results indicate that a hostile attribution bias, related to individual levels of aggression and pronounced for male faces, might be one mechanism mediating physical violence.
Affiliations
- Martin Wegrzyn: Department of Psychology, Bielefeld University, Postfach 10 01 31, 33501 Bielefeld, Germany; Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Sina Westphal: Department of Psychology, Bielefeld University, Postfach 10 01 31, 33501 Bielefeld, Germany
- Johanna Kissler: Department of Psychology, Bielefeld University, Postfach 10 01 31, 33501 Bielefeld, Germany; Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
25. Pollux PMJ, Hermens F, Willmott AP. Age-congruency and contact effects in body expression recognition from point-light displays (PLD). PeerJ 2016; 4:e2796. PMID: 27994986. PMCID: PMC5157186. DOI: 10.7717/peerj.2796.
Abstract
Recognition of older people's body expressions is a crucial social skill. We here investigate how age, not just of the observer but also of the observed individual, affects this skill. Age may influence the ability to recognize other people's body expressions through changes in one's own ability to perform certain actions over the life-span (i.e., an own-age bias may occur, with best recognition for one's own age group). Whole-body point-light displays of children, young adults and older adults (>70 years) expressing six different emotions were presented to observers of the same three age groups. Across two variations of the paradigm, no evidence for the predicted own-age bias (a cross-over interaction between the observer's age and the observed person's age) was found. Instead, experience effects were found, with children better recognizing older actors' expressions of 'active' emotions, such as anger and happiness, with greater exposure in daily life. Together, the findings suggest that age-related changes in one's own mobility only influence body expression categorization in young children who interact frequently with older adults.
Affiliations
- Petra M J Pollux: School of Psychology, University of Lincoln, Lincoln, United Kingdom
- Frouke Hermens: School of Psychology, University of Lincoln, Lincoln, United Kingdom
- Alexander P Willmott: School of Sport and Exercise Science, University of Lincoln, Lincoln, United Kingdom
26. Satpute AB, Nook EC, Narayanan S, Shu J, Weber J, Ochsner KN. Emotions in "Black and White" or Shades of Gray? How We Think About Emotion Shapes Our Perception and Neural Representation of Emotion. Psychol Sci 2016; 27:1428-1442. PMID: 27670663. DOI: 10.1177/0956797616661555.
Abstract
The demands of social life often require categorically judging whether someone's continuously varying facial movements express "calm" or "fear," or whether one's fluctuating internal states mean one feels "good" or "bad." In two studies, we asked whether this kind of categorical, "black and white," thinking can shape the perception and neural representation of emotion. Using psychometric and neuroimaging methods, we found that (a) across participants, judging emotions using a categorical, "black and white" scale relative to judging emotions using a continuous, "shades of gray," scale shifted subjective emotion perception thresholds; (b) these shifts corresponded with activity in brain regions previously associated with affective responding (i.e., the amygdala and ventral anterior insula); and (c) connectivity of these regions with the medial prefrontal cortex correlated with the magnitude of categorization-related shifts. These findings suggest that categorical thinking about emotions may actively shape the perception and neural representation of the emotions in question.
Affiliations
- Ajay B Satpute: Department of Psychology, Pomona College; Department of Neuroscience, Pomona College
- Erik C Nook: Department of Psychology, Harvard University
- Jocelyn Shu: Department of Psychology, Columbia University
27. Takehara T, Ochiai F, Suzuki N. A small-world network model of facial emotion recognition. Q J Exp Psychol (Hove) 2015; 69:1508-29. PMID: 26315136. DOI: 10.1080/17470218.2015.1086393.
Abstract
Various models have been proposed to increase understanding of the cognitive basis of facial emotions. Despite those efforts, interactions between facial emotions have received minimal attention. If collective behaviours relating to each facial emotion in the comprehensive cognitive system could be assumed, specific facial emotion relationship patterns might emerge. In this study, we demonstrate that the frameworks of complex networks can effectively capture those patterns. We generate 81 facial emotion images (6 prototypes and 75 morphs) and then ask participants to rate degrees of similarity in 3240 facial emotion pairs in a paired comparison task. A facial emotion network constructed on the basis of similarity clearly forms a small-world network, which features an extremely short average network distance and close connectivity. Further, even if two facial emotions have opposing valences, they are connected within only two steps. In addition, we show that intermediary morphs are crucial for maintaining full network integration, whereas prototypes are not at all important. These results suggest the existence of collective behaviours in the cognitive systems of facial emotions and also describe why people can efficiently recognize facial emotions in terms of information transmission and propagation. For comparison, we construct three simulated networks--one based on the categorical model, one based on the dimensional model, and one random network. The results reveal that small-world connectivity in facial emotion networks is apparently different from those networks, suggesting that a small-world network is the most suitable model for capturing the cognitive basis of facial emotions.
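As a rough illustration of the network logic summarized above (81 images give 81 × 80 / 2 = 3240 unordered pairs), the Python sketch below builds a graph from thresholded pairwise similarity ratings and compares its average path length and clustering with a size-matched random graph, the usual small-world diagnostics. The ratings and threshold used here are random stand-ins, and the code is not the authors' analysis pipeline.

import numpy as np
import networkx as nx

n_images = 81  # 6 prototypes + 75 morphs; 81 * 80 / 2 = 3240 unordered pairs
rng = np.random.default_rng(1)

# Build the similarity network: nodes are images, edges connect pairs rated
# as sufficiently similar (ratings here are random stand-ins for participant data).
G = nx.Graph()
G.add_nodes_from(range(n_images))
for i in range(n_images):
    for j in range(i + 1, n_images):
        similarity = rng.random()
        if similarity > 0.9:
            G.add_edge(i, j)

# Small-world diagnostics: short average distance and high clustering,
# relative to a random graph with the same number of nodes and edges.
R = nx.gnm_random_graph(n_images, G.number_of_edges(), seed=1)
for name, net in (("similarity network", G), ("random network", R)):
    if nx.is_connected(net):
        dist = nx.average_shortest_path_length(net)
        clust = nx.average_clustering(net)
        print(f"{name}: mean distance {dist:.2f}, clustering {clust:.2f}")
    else:
        print(f"{name}: not connected, path length undefined")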
Collapse
Affiliation(s)
- Takuma Takehara
- Department of Psychology, Doshisha University, Kyoto, Japan; Department of Psychology, University of Cincinnati, Cincinnati, OH, USA
| | - Fumio Ochiai
- Department of Business Administration, Tezukayama University, Nara, Japan
| | - Naoto Suzuki
- Department of Psychology, Doshisha University, Kyoto, Japan
| |
Collapse
|
28
|
Wegrzyn M, Bruckhaus I, Kissler J. Categorical Perception of Fear and Anger Expressions in Whole, Masked and Composite Faces. PLoS One 2015; 10:e0134790. [PMID: 26263000 PMCID: PMC4532458 DOI: 10.1371/journal.pone.0134790] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/04/2015] [Accepted: 07/15/2015] [Indexed: 12/05/2022] Open
Abstract
Human observers are remarkably proficient at recognizing expressions of emotions and at readily grouping them into distinct categories. When morphing one facial expression into another, the linear changes in low-level features are insufficient to describe the changes in perception, which instead follow an s-shaped function. Important questions are whether there are single diagnostic regions in the face that drive categorical perception for certain pairings of emotion expressions, and how information in those regions interacts when presented together. We report results from two experiments with morphed fear-anger expressions, where (a) half of the face was masked or (b) composite faces made up of different expressions were presented. When isolated upper and lower halves of faces were shown, the eyes were found to be almost as diagnostic as the whole face, with the response function showing a steep category boundary. In contrast, the mouth allowed for substantially lower accuracy, and responses followed a much flatter psychometric function. When a composite face consisting of mismatched upper and lower halves was used and observers were instructed to exclusively judge the expression of either the mouth or the eyes, the to-be-ignored part always influenced perception of the target region. In line with Experiment 1, the eye region exerted a much stronger influence on mouth judgements than vice versa. Again, categorical perception was significantly more pronounced for upper halves of faces. The present study shows that identification of fear and anger in morphed faces relies heavily on information from the upper half of the face, most likely the eye region. Categorical perception is possible when only the upper face half is present, but compromised when only the lower part is shown. Moreover, observers tend to integrate all available features of a face, even when trying to focus on only one part.
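The contrast between a steep category boundary (eyes) and a much flatter response function (mouth) can be made concrete by comparing the slope parameter of fitted psychometric functions. The Python sketch below does this with hypothetical response proportions along a fear-anger morph continuum; the data, morph levels, and slope values are assumptions for illustration only, not the published analysis.

import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, x0, k):
    # Proportion of "anger" responses at morph level x; k indexes how steep the category boundary is.
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

morph = np.linspace(0.0, 100.0, 9)  # fear (0) to anger (100) morph continuum (hypothetical levels)
rng = np.random.default_rng(2)

# Hypothetical data: a steep boundary for eyes-only faces, a much flatter one for mouth-only faces.
p_eyes = psychometric(morph, 50.0, 0.30) + rng.normal(0.0, 0.02, morph.size)
p_mouth = psychometric(morph, 50.0, 0.05) + rng.normal(0.0, 0.02, morph.size)

(_, k_eyes), _ = curve_fit(psychometric, morph, p_eyes, p0=[50.0, 0.1])
(_, k_mouth), _ = curve_fit(psychometric, morph, p_mouth, p0=[50.0, 0.1])

print(f"boundary steepness, eyes only:  {k_eyes:.2f}")
print(f"boundary steepness, mouth only: {k_mouth:.2f}")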
Collapse
Affiliation(s)
- Martin Wegrzyn
- Department of Psychology, Bielefeld University, Bielefeld, Germany
- Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
| | | | - Johanna Kissler
- Department of Psychology, Bielefeld University, Bielefeld, Germany
- Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
| |
Collapse
|
29
|
Calvo MG, Nummenmaa L. Perceptual and affective mechanisms in facial expression recognition: An integrative review. Cogn Emot 2015. [PMID: 26212348 DOI: 10.1080/02699931.2015.1049124] [Citation(s) in RCA: 124] [Impact Index Per Article: 13.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/23/2022]
Abstract
Facial expressions of emotion involve a physical component of morphological changes in a face and an affective component conveying information about the expresser's internal feelings. It remains unresolved how much recognition and discrimination of expressions rely on the perception of morphological patterns or the processing of affective content. This review of research on the role of visual and emotional factors in expression recognition reached three major conclusions. First, behavioral, neurophysiological, and computational measures indicate that basic expressions are reliably recognized and discriminated from one another, albeit the effect may be inflated by the use of prototypical expression stimuli and forced-choice responses. Second, affective content along the dimensions of valence and arousal is extracted early from facial expressions, although this coarse affective representation contributes minimally to categorical recognition of specific expressions. Third, the physical configuration and visual saliency of facial features contribute significantly to expression recognition, with "emotionless" computational models being able to reproduce some of the basic phenomena demonstrated in human observers. We conclude that facial expression recognition, as it has been investigated in conventional laboratory tasks, depends to a greater extent on perceptual than affective information and mechanisms.
Collapse
Affiliation(s)
- Manuel G Calvo
- Department of Cognitive Psychology, University of La Laguna, Tenerife, Spain
| | - Lauri Nummenmaa
- School of Science, Aalto University, Espoo, Finland; Department of Psychology and Turku PET Centre, University of Turku, Turku, Finland
| |
Collapse
|
30
|
Gerhardsson A, Högman L, Fischer H. Viewing distance matter to perceived intensity of facial expressions. Front Psychol 2015; 6:944. [PMID: 26191035 PMCID: PMC4488603 DOI: 10.3389/fpsyg.2015.00944] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/11/2015] [Accepted: 06/22/2015] [Indexed: 11/17/2022] Open
Abstract
In our daily perception of facial expressions, we depend on an ability to generalize across the varied distances at which they may appear. This is important to how we interpret the quality and the intensity of the expression. Previous research has not investigated whether this so-called perceptual constancy also applies to the experienced intensity of facial expressions. Using a psychophysical measure (the Borg CR100 scale), the present study aimed to further investigate perceptual constancy of happy and angry facial expressions at varied sizes, which serve as a proxy for varying viewing distances. Seventy-one participants (42 female) rated the intensity and valence of facial expressions varying in distance and intensity. The results demonstrated that the perceived intensity (PI) of the emotional facial expression depended both on the distance of the face and on the person perceiving it. An interaction effect was noted, indicating that close-up faces are perceived as more intense than faces at a distance, and that this effect is stronger the more intense the facial expression truly is. The present study raises considerations regarding constancy of the PI of happy and angry facial expressions at varied distances.
Collapse
Affiliation(s)
- Andreas Gerhardsson
- Stress Research Institute, Stockholm University, Stockholm, Sweden; Department of Psychology, Stockholm University, Stockholm, Sweden
| | - Lennart Högman
- Department of Psychology, Stockholm University, Stockholm, Sweden
| | - Håkan Fischer
- Department of Psychology, Stockholm University, Stockholm, Sweden
| |
Collapse
|
31
|
Guo K, Shaw H. Face in profile view reduces perceived facial expression intensity: an eye-tracking study. Acta Psychol (Amst) 2015; 155:19-28. [PMID: 25531122 DOI: 10.1016/j.actpsy.2014.12.001] [Citation(s) in RCA: 32] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/02/2014] [Revised: 11/28/2014] [Accepted: 12/03/2014] [Indexed: 10/24/2022] Open
Abstract
Recent studies measuring the facial expressions of emotion have focused primarily on the perception of frontal face images. As we frequently encounter expressive faces from different viewing angles, having a mechanism that allows invariant expression perception would be advantageous to our social interactions. Although a couple of studies have indicated comparable expression categorization accuracy across viewpoints, it is unknown how perceived expression intensity and associated gaze behaviour change across viewing angles. Differences could arise because diagnostic cues from local facial features for decoding expressions could vary with viewpoint. Here we manipulated the orientation of faces (frontal, mid-profile, and profile view) displaying six common facial expressions of emotion, and measured participants' expression categorization accuracy, perceived expression intensity, and associated gaze patterns. In comparison with frontal faces, profile faces slightly reduced identification rates for disgusted and sad expressions, but significantly decreased perceived intensity for all tested expressions. Although, quantitatively, viewpoint had an expression-specific influence on the proportion of fixations directed at local facial features, the qualitative gaze distribution within facial features (e.g., the eyes tended to attract the highest proportion of fixations, followed by the nose and then the mouth region) was independent of viewpoint and expression type. Our results suggest that viewpoint-invariant facial expression processing reflects categorical perception, which could be linked to a viewpoint-invariant holistic gaze strategy for extracting expressive facial cues.
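The fixation-proportion measure mentioned above amounts to a simple tally: for each viewpoint, count fixations per facial region and divide by the total. The Python sketch below illustrates that computation with a handful of hypothetical fixation records; the viewpoint and region labels are assumptions, not the study's actual coding scheme.

from collections import Counter

# Hypothetical (viewpoint, fixated_region) records, as an eye-tracker might yield after AOI coding.
fixations = [
    ("frontal", "eyes"), ("frontal", "eyes"), ("frontal", "nose"), ("frontal", "mouth"),
    ("profile", "eyes"), ("profile", "eyes"), ("profile", "nose"), ("profile", "mouth"),
]

for viewpoint in ("frontal", "profile"):
    counts = Counter(region for vp, region in fixations if vp == viewpoint)
    total = sum(counts.values())
    proportions = {region: count / total for region, count in counts.items()}
    print(viewpoint, proportions)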
Collapse
|
32
|
Bennett CC, Šabanović S. Deriving Minimal Features for Human-Like Facial Expressions in Robotic Faces. Int J Soc Robot 2014. [DOI: 10.1007/s12369-014-0237-z] [Citation(s) in RCA: 40] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
|
33
|
Abstract
Drawing on research reviewed in this special section, the present article discusses how various contextual factors impact on production and decoding of emotion-related facial activity. Although emotion-related variables often contribute to activation of prototypical “emotion expressions” and perceivers can often infer emotional meanings from these facial configurations, neither process is invariant or direct. Many facial movements are directed towards or away from events in the shared environment, and their effects depend on these relational orientations. Facial activity is not only a medium for descriptive representation of internal affective states, but also a means of adjusting to, and operating on, external objects, and of influencing other people’s appraisals of those objects.
Collapse
Affiliation(s)
- Brian Parkinson
- Department of Experimental Psychology, University of Oxford, UK
| |
Collapse
|
34
|
Abstract
In this review, we highlight evidence suggesting that concepts represented in language are used to create a perception of emotion from the constant ebb and flow of other people's facial muscle movements. In this "construction hypothesis" (cf. Gendron, Lindquist, Barsalou, & Barrett, 2012; see also Barrett, 2006b; Barrett, Lindquist, & Gendron, 2007; Barrett, Mesquita, & Gendron, 2011), language plays a constitutive role in emotion perception because words ground the otherwise highly variable instances of an emotion category. We demonstrate that language plays a constitutive role in emotion perception by discussing findings from behavior, neuropsychology, development, and neuroimaging. We close by discussing implications of a constructionist view for the science of emotion.
Collapse
Affiliation(s)
| | - Maria Gendron
- Department of Psychology, Boston College, USA; Department of Psychology, Northeastern University, USA
| |
Collapse
|
35
|
Abstract
Do basic emotions produce their predicted facial expressions in nonlaboratory settings? Available studies in naturalistic settings rarely test causation, but they do show a surprisingly weak correlation between emotions and their predicted facial expressions. This evidence from field studies is more consistent with facial behavior having many causes, functions, and meanings than with facial expressions being fixed signals of basic emotions.
Collapse
Affiliation(s)
| | - Carlos Crivelli
- Facultad de Psicología, Universidad Autónoma de Madrid, Spain
| |
Collapse
|
36
|
Fernández-Dols JM. Advances in the Study of Facial Expression: An Introduction to the Special Section. EMOTION REVIEW 2013. [DOI: 10.1177/1754073912457209] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
For more than a century, expressions have been approached as bidimensional, static, instantaneous, self-contained, well-defined, and universal signals. These assumptions are starting to be empirically reconsidered: this special section of Emotion Review includes reviews on the physical, social, and cultural dynamics of expressions, and on the complex ways in which, throughout the lifespan, facial behavior and emotion are perceived and categorized by primates' and humans' brains. All these advances are paving the way for new, exciting approaches to facial behavior that are more likely to strike an appropriate balance between description and explanation.
Collapse
|