1. Fang X, van Kleef GA, Kawakami K, Sauter DA. Registered report "Categorical perception of facial expressions of anger and disgust across cultures". Cogn Emot 2024:1-17. [PMID: 38973174] [DOI: 10.1080/02699931.2024.2370667] [Received: 10/12/2021] [Accepted: 06/13/2024]
Abstract
Previous research has demonstrated that individuals from Western cultures exhibit categorical perception (CP) in their judgments of emotional faces. However, the extent to which this phenomenon characterises the judgments of facial expressions among East Asians remains relatively unexplored. Building upon recent findings showing that East Asians are more likely than Westerners to see a mixture of emotions in facial expressions of anger and disgust, the present research aimed to investigate whether East Asians also display CP for angry and disgusted faces. To address this question, participants from Canada and China were recruited to discriminate pairs of faces along the anger-disgust continuum. The results revealed the presence of CP in both cultural groups, as participants consistently exhibited higher accuracy and faster response latencies when discriminating between-category pairs of expressions compared to within-category pairs. Moreover, the magnitude of CP did not vary significantly across cultures. These findings provide novel evidence supporting the existence of CP for facial expressions in both East Asian and Western cultures, suggesting that CP is a perceptual phenomenon that transcends cultural boundaries. This research contributes to the growing literature on cross-cultural perceptions of facial expressions by deepening our understanding of how facial expressions are perceived categorically across cultures.
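The categorical perception effect described above is conventionally quantified as the advantage for discriminating face pairs that straddle the anger-disgust boundary over pairs drawn from within one category. A minimal sketch of that comparison, with entirely hypothetical accuracy values (not the study's data):

```python
# Illustrative sketch only: CP is indexed by the between-category minus
# within-category discrimination advantage. All numbers are invented.

def cp_index(between_acc: float, within_acc: float) -> float:
    """Difference in discrimination accuracy; positive values indicate CP."""
    return between_acc - within_acc

# Hypothetical per-group accuracies on an anger-disgust morph continuum.
groups = {
    "Canada": {"between": 0.86, "within": 0.71},
    "China":  {"between": 0.84, "within": 0.70},
}

for name, acc in groups.items():
    print(name, round(cp_index(acc["between"], acc["within"]), 2))
```

The same difference score can be computed on response latencies (with the sign reversed, since CP predicts faster between-category responses); comparing the index across groups corresponds to the cross-cultural test reported here.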
Affiliation(s)
- Xia Fang
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, People's Republic of China
- Gerben A van Kleef
- Department of Social Psychology, University of Amsterdam, Amsterdam, the Netherlands
- Kerry Kawakami
- Department of Social Psychology, York University, Toronto, Canada
- Disa A Sauter
- Department of Social Psychology, University of Amsterdam, Amsterdam, the Netherlands
2. The Relationship between Crawling and Emotion Discrimination in 9- to 10-Month-Old Infants. Brain Sci 2022;12:479. [PMID: 35448010] [PMCID: PMC9029591] [DOI: 10.3390/brainsci12040479] [Received: 02/21/2022] [Revised: 03/30/2022] [Accepted: 03/31/2022]
Abstract
The present study examined whether infants' crawling experience is related to their sensitivity to fearful emotional expressions. Twenty-nine 9- to 10-month-old infants were tested in a preferential looking task in which they were presented with pairs of animated faces on a screen: a 100% happy facial expression paired with morphed facial expressions containing varying degrees of fear and happiness. Regardless of their crawling experience, all infants looked longer at more fearful faces. Additionally, infants with at least 6 weeks of crawling experience needed lower levels of fearfulness in the morphs to detect a change from a happy to a fearful face, compared with those with less crawling experience. Thus, crawling experience seems to increase infants' sensitivity to fearfulness in faces.
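Morph continua like the happy-fear one used above are standardly built by pixel-wise linear blending of two aligned face images. A sketch of that construction, with random arrays standing in for real face images (the study's actual stimuli were animated faces):

```python
import numpy as np

# Sketch of the standard pixel-wise linear blend used to build a morph
# continuum. Random arrays stand in for aligned grayscale face images.

def morph(face_a: np.ndarray, face_b: np.ndarray, alpha: float) -> np.ndarray:
    """Blend two aligned images; alpha is the proportion of face_b."""
    return (1.0 - alpha) * face_a + alpha * face_b

rng = np.random.default_rng(0)
happy = rng.random((64, 64))  # stand-in for a 100% happy face
fear = rng.random((64, 64))   # stand-in for a 100% fearful face

# Continuum from 0% to 100% fear in 20% steps.
continuum = [morph(happy, fear, a) for a in (0.0, 0.2, 0.4, 0.6, 0.8, 1.0)]
assert np.allclose(continuum[0], happy) and np.allclose(continuum[-1], fear)
```

Real stimulus morphing also warps facial landmarks before blending so that features stay aligned; the linear blend above only conveys the "varying degrees of fear and happiness" idea.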
3. Zhang K, Yuan Y, Chen J, Wang G, Chen Q, Luo M. Eye Tracking Research on the Influence of Spatial Frequency and Inversion Effect on Facial Expression Processing in Children with Autism Spectrum Disorder. Brain Sci 2022;12:283. [PMID: 35204046] [PMCID: PMC8870542] [DOI: 10.3390/brainsci12020283] [Received: 01/07/2022] [Revised: 02/11/2022] [Accepted: 02/16/2022]
Abstract
Facial expression processing depends mainly on whether the facial features related to expressions can be fully acquired, and on whether appropriate processing strategies can be adopted under different conditions. Children with autism spectrum disorder (ASD) have difficulty accurately recognizing facial expressions and responding appropriately, which is regarded as an important cause of their social difficulties. This study used eye tracking technology to explore the internal processing mechanisms of facial expressions in children with ASD under the influence of spatial frequency and inversion effects, with a view to improving their social functioning. The facial expression recognition rates and eye tracking characteristics of children with ASD and typically developing (TD) children across facial areas of interest were recorded and analyzed. The results of the multi-factor mixed experiment showed that the facial expression recognition rate of children with ASD was significantly lower than that of TD children under all conditions. TD children paid more visual attention to the eyes area, whereas children with ASD preferred features of the mouth area and lacked visual attention to, and processing of, the eyes area. When the face was inverted, TD children showed the inversion effect under all three spatial frequency conditions, manifested as a significant decrease in expression recognition rate. Children with ASD, however, showed the inversion effect only under the low spatial frequency (LSF) condition, indicating that they mainly used a featural processing method but retained a capacity for configural processing under the LSF condition. The eye tracking results showed that when the face was inverted or facial feature information was weakened, both children with ASD and TD children adjusted their facial expression processing strategies accordingly, increasing visual attention to and information processing of their preferred areas. The fixation counts and fixation duration of TD children on the eyes area increased significantly, while the fixation duration of children with ASD on the mouth area increased significantly. These results provide theoretical and practical support for facial expression interventions in children with ASD.
Affiliation(s)
- Kun Zhang
- National Engineering Research Center for E-Learning, Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
- National Engineering Laboratory for Educational Big Data, Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
- Yishuang Yuan
- National Engineering Research Center for E-Learning, Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
- National Engineering Laboratory for Educational Big Data, Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
- Jingying Chen
- National Engineering Research Center for E-Learning, Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
- National Engineering Laboratory for Educational Big Data, Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
- Correspondence:
- Guangshuai Wang
- School of Computer Science, Wuhan University, Wuhan 430072, China
- Qian Chen
- National Engineering Research Center for E-Learning, Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
- National Engineering Laboratory for Educational Big Data, Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
- Meijuan Luo
- National Engineering Research Center for E-Learning, Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
- National Engineering Laboratory for Educational Big Data, Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
4. Keemink JR, Jenner L, Prunty JE, Wood N, Kelly DJ. Eye Movements and Behavioural Responses to Gaze-Contingent Expressive Faces in Typically Developing Infants and Infant Siblings. Autism Res 2020;14:973-983. [PMID: 33170549] [DOI: 10.1002/aur.2432] [Received: 04/23/2020] [Revised: 09/30/2020] [Accepted: 10/21/2020]
Abstract
Studies with infant siblings of children with Autism Spectrum Disorder have attempted to identify early markers for the disorder and suggest that autistic symptoms emerge between 12 and 24 months of age. Yet a reliable first-year marker remains elusive. We propose that in order to establish first-year manifestations of this inherently social disorder, we need research methods that are sufficiently socially demanding and realistically interactive. Building on Keemink et al. [2019, Developmental Psychology, 55, 1362-1371], we employed a gaze-contingent eye-tracking paradigm in which infants could interact with face stimuli: by engaging in eye contact, infants could elicit emotional expressions (happiness, sadness, surprise, fear, disgust, anger) from on-screen faces. We collected eye-tracking data and video-recorded behavioural response data from 122 (64 male, 58 female) typically developing infants and 31 infant siblings (17 male, 14 female) aged 6, 9 and 12 months. All infants demonstrated a significant Expression by AOI interaction (F(10, 1470) = 10.003, P < 0.001, ηp² = 0.064): infants' eye movements were "expression-specific", with fixations distributed to AOIs differently per expression. Whereas the eye movements provide no evidence of deviancies, the behavioural response data show significant aberrancies in reciprocity for infant siblings, who showed reduced social responsiveness at both the group level (F(1, 147) = 4.10, P = 0.042, ηp² = 0.028) and the individual level (Fisher's exact test, P = 0.032). We conclude that the gaze-contingency paradigm provides a realistically interactive experience capable of detecting deviancies in social responsiveness early, and we discuss our results in relation to subsequent infant sibling development. LAY SUMMARY: We investigated how infant siblings of children with autism spectrum disorder respond to interactive faces presented on a computer screen. Our study demonstrates that infant siblings are less responsive when interacting with faces on a computer screen (e.g., they smile and imitate less) compared with infants without an older sibling with autism. Reduced responsiveness within social interaction could have implications for how parents and carers interact with these infants. Autism Res 2021, 14: 973-983. © 2020 International Society for Autism Research and Wiley Periodicals LLC.
Affiliation(s)
- Jolie R Keemink
- University of Kent, School of Psychology, Keynes College, Canterbury, Kent, UK
- Lauren Jenner
- University of Kent, School of Psychology, Keynes College, Canterbury, Kent, UK
- Jonathan E Prunty
- University of Kent, School of Psychology, Keynes College, Canterbury, Kent, UK
- Nicky Wood
- East Kent Hospitals University NHS Foundation Trust, Canterbury, Kent, UK
- David J Kelly
- University of Kent, School of Psychology, Keynes College, Canterbury, Kent, UK
5. Ruba AL, Meltzoff AN, Repacholi BM. Superordinate categorization of negative facial expressions in infancy: The influence of labels. Dev Psychol 2020;56:671-685. [PMID: 31999185] [PMCID: PMC7060120] [DOI: 10.1037/dev0000892]
Abstract
Accurate perception of emotional (facial) expressions is an essential social skill. It is currently debated whether emotion categorization in infancy emerges in a "broad-to-narrow" pattern and to what degree language influences this process. We used a habituation paradigm to explore (a) whether 14- and 18-month-old infants perceive different facial expressions (anger, sadness, disgust) as belonging to a superordinate category of negative valence and (b) how verbal labels influence emotion category formation. Results indicated that infants did not spontaneously form a superordinate category of negative valence (Experiments 1 and 3). However, when a novel label ("toma") was added to each event during habituation trials (Experiments 2 and 4), infants formed this superordinate valence category when habituated to disgust and sad expressions (but not when habituated to anger and sad expressions). These labeling effects were obtained with two stimulus sets (Radboud Face Database and NimStim), even when controlling for the presence of teeth in the expressions. The results indicate that infants at 14 and 18 months of age show limited superordinate categorization based on the valence of different negative facial expressions: they formed this abstract emotion category only when labels were provided, and the labeling effect depended on which emotions were presented during habituation. These findings have important implications for developmental theories of emotion. (PsycINFO Database Record (c) 2020 APA, all rights reserved).
6. Ruba AL, Repacholi BM. Do Preverbal Infants Understand Discrete Facial Expressions of Emotion? Emotion Review 2019. [DOI: 10.1177/1754073919871098]
Abstract
An ongoing debate in affective science concerns whether certain discrete, “basic” emotions have evolutionarily based signals (facial expressions) that are easily, universally, and (perhaps) innately identified. Studies with preverbal infants (younger than 24 months) have the potential to shed light on this debate. This review summarizes what is known about preverbal infants’ understanding of discrete emotional facial expressions. Overall, while many studies suggest that preverbal infants differentiate positive and negative facial expressions, few studies have tested whether infants understand discrete emotions (e.g., anger vs. disgust). Moreover, results vary greatly based on methodological factors. This review also (a) discusses how language may influence the development of emotion understanding, and (b) proposes a new developmental hypothesis for infants’ discrete emotion understanding.
7. He H, Li J, Xiao Q, Jiang S, Yang Y, Zhi S. Language and Color Perception: Evidence From Mongolian and Chinese Speakers. Front Psychol 2019;10:551. [PMID: 30923508] [PMCID: PMC6426779] [DOI: 10.3389/fpsyg.2019.00551] [Received: 08/14/2018] [Accepted: 02/26/2019]
Abstract
The present research contributes to the debate in cognitive science on the relationship between language and perception by comparing Mongolian and Chinese speakers' color perception. Chinese (Mandarin) and Mongolian color terms divide the blue spectrum differently but the green spectrum similarly. In Mongolian, light blue ("qinker") and dark blue ("huhe") are strictly distinct, while both light green and dark green are described by one word, nogvgan. In Chinese, both light blue and dark blue are described by one word, lan, and both light green and dark green by a single word, lv. The current study used a free-sorting task and a visual search task to investigate whether this linguistic difference leads to a difference in color discrimination. In the free-sorting task, Mongolian speakers, compared with Chinese speakers, sorted differently in the blue region (distinguishing light and dark blue) but identically in the green region. In the visual search task, Mongolian speakers discriminated displays whose colors fall into different Mongolian linguistic categories (e.g., qinker vs. huhe) more quickly than displays whose colors belong to the same linguistic category (e.g., both qinker). Moreover, this effect was disrupted in Mongolian participants performing a secondary task engaging verbal working memory (but not one engaging spatial working memory), suggesting linguistic interference. Chinese (Mandarin) speakers performing the visual search task showed no such category advantage under any condition. These findings provide support for the Whorf hypothesis with evidence from an Altaic language. Meanwhile, both Chinese and Mongolian speakers reacted faster to green than to blue in the visual search task, suggesting that variation in human color perception is also constrained by certain universal forces. The difference in categorical effects between Chinese and Mongolian speakers in the blue region thus reflects a relativistic aspect of the language-perception relationship, while the speed of visual search for blue and green reflects a universal aspect: our perception appears to be shaped by both relativistic and universal forces.
Affiliation(s)
- Hu He
- College of Educational Science, Inner Mongolia Normal University, Hohhot, China
- Jie Li
- College of Educational Science, Inner Mongolia Normal University, Hohhot, China
- Inner Mongolia Autonomous Region Key Laboratory of Psychology, Hohhot, China
- Qianguo Xiao
- Laboratory of Cognition and Mental Health, Chongqing University of Arts and Sciences, Chongqing, China
- Songxiu Jiang
- School of Education, Liaocheng University, Liaocheng, China
- Yisheng Yang
- College of Educational Science, Inner Mongolia Normal University, Hohhot, China
- Sheng Zhi
- College of Educational Science, Inner Mongolia Normal University, Hohhot, China
8. Plate RC, Wood A, Woodard K, Pollak SD. Probabilistic learning of emotion categories. J Exp Psychol Gen 2018;148:1814-1827. [PMID: 30570327] [DOI: 10.1037/xge0000529]
Abstract
Although the configurations of facial muscles that humans perceive vary continuously, we often represent emotions as categories. This suggests that, as in other domains of categorical perception such as speech and color perception, humans become attuned to features of emotion cues that map onto meaningful thresholds for these signals, given their environments. However, little is known about the learning processes underlying the representation of these salient social signals. In Experiment 1 we tested the role of statistical distributions of facial cues in the maintenance of an emotion category in both children (6-8 years old) and adults (18-22 years old). Children and adults learned the boundary between neutral and angry when provided with explicit feedback (supervised learning). However, after we exposed participants to different statistical distributions of facial cues, they rapidly shifted their category boundaries for each emotion during a testing phase. In Experiments 2 and 3, we replicated this finding and also tested the extent to which learners are able to track statistical distributions for multiple actors. Not only did participants form actor-specific categories, but the distributions of facial cues also influenced participants' trait judgments about the actors. Taken together, these data are consistent with the view that the way humans construe emotion (in this case, anger) is not only flexible but reflects complex learning about the distributions of the myriad cues individuals experience in their social environments. (PsycINFO Database Record (c) 2019 APA, all rights reserved).
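The boundary shifts described above can be made concrete with a toy estimate: treat the category boundary as the morph level at which "angry" responses first reach 50%, and compare that estimate after exposure to different cue distributions. All response rates below are invented for illustration, not the study's data:

```python
# Toy sketch of a category-boundary estimate along a neutral-angry morph
# continuum. The response rates are hypothetical, for illustration only.

def boundary(morph_levels, p_angry, threshold=0.5):
    """First morph level whose 'angry' response rate reaches the threshold."""
    for level, p in zip(morph_levels, p_angry):
        if p >= threshold:
            return level
    return None

levels = [0, 20, 40, 60, 80, 100]  # percent anger in the morph

# Hypothetical response rates after exposure to two different cue distributions:
p_after_dist_a = [0.02, 0.05, 0.30, 0.65, 0.90, 0.98]
p_after_dist_b = [0.05, 0.15, 0.55, 0.85, 0.95, 0.99]

print(boundary(levels, p_after_dist_a))  # boundary near 60% anger
print(boundary(levels, p_after_dist_b))  # boundary shifted toward 40% anger
```

In practice such boundaries are estimated by fitting a psychometric function (e.g., a logistic curve) rather than taking the first threshold crossing; the sketch only conveys how exposure-dependent response curves translate into shifted category boundaries.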