1
Wang Y, Luo Q, Zhang Y, Zhao K. Synchrony or asynchrony: development of facial expression recognition from childhood to adolescence based on large-scale evidence. Front Psychol 2024; 15:1379652. PMID: 38725946. PMCID: PMC11079229. DOI: 10.3389/fpsyg.2024.1379652.
Abstract
The development of facial expression recognition in children is crucial for their emotional cognition and social interactions. In this study, 510 children aged 6 to 15 completed a two-alternative forced-choice facial expression recognition task. The findings indicated that recognition of the six basic facial expressions reaches a relatively stable, mature level around 8-9 years of age. Model fitting further showed that children improved most in recognizing expressions of disgust, closely followed by fear, whereas recognition of happiness and sadness improved more slowly across age groups. Regarding gender differences, girls showed an overall recognition advantage. Further model fitting revealed that boys improved more markedly in recognizing expressions of disgust, fear, and anger, while girls improved more markedly in recognizing expressions of surprise, sadness, and happiness. These findings suggest a largely synchronous developmental trajectory of facial expression recognition from childhood to adolescence, likely shaped by socialization processes and interactions related to brain maturation.
Affiliation(s)
- Yihan Wang
  - State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Qian Luo
  - State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Yuanmeng Zhang
  - College of Letters and Science, University of California, Berkeley, Berkeley, CA, United States
- Ke Zhao
  - State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
2
Hermans MM, Schappin R, de Laat PCJ, Mendels EJ, Breur JMPJ, Langeveld HR, Raphael MF, de Graaf M, Breugem CC, de Wildt SN, Okkerse JME, Pasmans SGMA, Rietman AB. Mental Health of School-Aged Children Treated with Propranolol or Atenolol for Infantile Hemangioma and Their Parents. Dermatology 2024; 240:216-225. PMID: 38228125. PMCID: PMC10997238. DOI: 10.1159/000536144.
Abstract
BACKGROUND Infants with infantile hemangioma (IH) have been effectively treated with propranolol or atenolol. Concerns have been raised about the mental health of these children at school age, due to the central nervous system effects of propranolol and the visible nature of IH. OBJECTIVE This study aimed to compare the school-age mental health of children treated with propranolol for IH with that of children treated with atenolol, as well as the mental health of their parents. METHODS This two-center cross-sectional study included children aged ≥6 years who were treated with either propranolol or atenolol for IH during infancy. Children's outcomes were performance-based affect recognition (Dutch version of the Developmental Neuropsychological Assessment-II [NEPSY-II-NL]), parent-reported emotional and behavioral functioning (Child Behavior Checklist [CBCL]), and health-related quality of life (KIDSCREEN-27). Parents' outcome was parenting stress (Parenting Stress Questionnaire [OBVL]). RESULTS Data from 105 children (36 propranolol, 69 atenolol; 6.0-11.8 years) were analyzed. Mental health outcomes did not differ between the two β-blocker groups. Although overall functioning was in line with norms, children presented specific problems concerning affect recognition, parent-reported attention, and social quality of life. Parents reported increased physical symptoms, depressive symptoms, and parent-child relationship problems. CONCLUSION No difference in mental health at school age was found between children treated with propranolol and those treated with atenolol for IH. Although few overall mental health problems were found, specific problems require follow-up, which should be directed toward affect recognition, attention, and social functioning in daily life. Problems reported by parents could be ameliorated by mental health support during and after their infant's β-blocker treatment.
Affiliation(s)
- Mireille M Hermans
  - Department of Dermatology - Center of Pediatric Dermatology, Center of Rare Skin Diseases, Vascular Anomaly Center Erasmus MC Rotterdam, Member of the ERN-SKIN-Mosaic Group and ERN-VASCERN-VASCA Group, Erasmus MC Sophia Children's Hospital, University Medical Center Rotterdam, Rotterdam, The Netherlands
- Renske Schappin
  - Department of Dermatology - Center of Pediatric Dermatology, Center of Rare Skin Diseases, Vascular Anomaly Center Erasmus MC Rotterdam, Member of the ERN-SKIN-Mosaic Group and ERN-VASCERN-VASCA Group, Erasmus MC Sophia Children's Hospital, University Medical Center Rotterdam, Rotterdam, The Netherlands
  - Department of Surgery, Wilhelmina Children's Hospital, University Medical Center Utrecht, Utrecht, The Netherlands
- Peter C J de Laat
  - Department of Pediatrics (Hemato-oncology), Center of Rare Skin Diseases, Vascular Anomaly Center Erasmus MC Rotterdam, Member of the ERN-SKIN-Mosaic Group and ERN-VASCERN-VASCA Group, Erasmus MC Sophia Children's Hospital, University Medical Center Rotterdam, Rotterdam, The Netherlands
- Elodie J Mendels
  - Department of Dermatology - Center of Pediatric Dermatology, Center of Rare Skin Diseases, Vascular Anomaly Center Erasmus MC Rotterdam, Member of the ERN-SKIN-Mosaic Group and ERN-VASCERN-VASCA Group, Erasmus MC Sophia Children's Hospital, University Medical Center Rotterdam, Rotterdam, The Netherlands
- Johannes M P J Breur
  - Department of Pediatric Cardiology, Wilhelmina Children's Hospital, University Medical Center Utrecht, Utrecht, The Netherlands
- Hester R Langeveld
  - Department of Intensive Care and Pediatric Surgery, Center of Rare Skin Diseases, Vascular Anomaly Center Erasmus MC Rotterdam, Member of the ERN-SKIN-Mosaic Group and ERN-VASCERN-VASCA Group, Erasmus MC Sophia Children's Hospital, University Medical Center Rotterdam, Rotterdam, The Netherlands
- Martine F Raphael
  - Department of Dermatology, UMC Utrecht Center for Vascular Anomalies, Wilhelmina Children's Hospital, University Medical Center Utrecht, Utrecht, The Netherlands
  - Emma Children's Hospital, Amsterdam UMC Location University of Amsterdam, Amsterdam, The Netherlands
- Marlies de Graaf
  - Department of Dermatology, UMC Utrecht Center for Vascular Anomalies, Wilhelmina Children's Hospital, University Medical Center Utrecht, Utrecht, The Netherlands
- Corstiaan C Breugem
  - Department of Plastic Surgery, UMC Utrecht Center for Vascular Anomalies, Wilhelmina Children's Hospital, University Medical Center Utrecht, Utrecht, The Netherlands
  - Department of Plastic, Reconstructive and Hand Surgery, Amsterdam UMC Location University of Amsterdam, Amsterdam, The Netherlands
- Saskia N de Wildt
  - Department of Pharmacology and Toxicology, Radboud Institute for Health Sciences, Radboud University Medical Center, Nijmegen, The Netherlands
- Jolanda M E Okkerse
  - Department of Child and Adolescent Psychology/Psychiatry, Erasmus MC Sophia Children's Hospital, University Medical Center Rotterdam, Rotterdam, The Netherlands
- Suzanne G M A Pasmans
  - Department of Dermatology - Center of Pediatric Dermatology, Center of Rare Skin Diseases, Vascular Anomaly Center Erasmus MC Rotterdam, Member of the ERN-SKIN-Mosaic Group and ERN-VASCERN-VASCA Group, Erasmus MC Sophia Children's Hospital, University Medical Center Rotterdam, Rotterdam, The Netherlands
- André B Rietman
  - Department of Child and Adolescent Psychology/Psychiatry, Erasmus MC Sophia Children's Hospital, University Medical Center Rotterdam, Rotterdam, The Netherlands
3
Richoz AR, Stacchi L, Schaller P, Lao J, Papinutto M, Ticcinelli V, Caldara R. Recognizing facial expressions of emotion amid noise: A dynamic advantage. J Vis 2024; 24:7. PMID: 38197738. PMCID: PMC10790674. DOI: 10.1167/jov.24.1.7.
Abstract
Humans communicate internal states through complex facial movements shaped by biological and evolutionary constraints. Although real-life social interactions are flooded with dynamic signals, current knowledge of facial expression recognition mainly arises from studies using static face images. This experimental bias might stem from previous studies consistently reporting that young adults benefit only minimally from the richer dynamic over static information, whereas children, the elderly, and clinical populations benefit very strongly (Richoz, Jack, Garrod, Schyns, & Caldara, 2015, 2018b). These observations point to a near-optimal facial expression decoding system in young adults, almost insensitive to the advantage of dynamic over static cues. Surprisingly, no study has yet tested the idea that such evidence might be rooted in a ceiling effect. To this end, we asked 70 healthy young adults to perform static and dynamic facial expression recognition of the six basic expressions while parametrically and randomly varying the low-level normalized phase and contrast signal (0%-100%) of the faces. As predicted, when 100% face signal was presented, static and dynamic expressions were recognized with equal efficiency, with the exception of those with the most informative dynamics (i.e., happiness and surprise). However, when less signal was available, all dynamic expressions were better recognized than their static counterparts (the dynamic advantage peaking at ∼20% signal). Our data show that facial movements increase our ability to efficiently identify the emotional states of others under the suboptimal visual conditions that can occur in everyday life. Dynamic signals are more effective and sensitive than static ones for decoding all facial expressions of emotion for all human observers.
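The parametric phase-signal manipulation described in this abstract can be sketched in a few lines. The snippet below is an illustrative analogue only: the function name and the linear phase-blending scheme are ours, not the authors' exact procedure (published methods typically use a weighted circular combination of original and random phase).

```python
import numpy as np

def degrade_phase_signal(image, signal=0.2, rng=None):
    """Retain `signal` proportion of an image's Fourier phase, mixing in
    random phase for the rest, while keeping the amplitude spectrum fixed.

    Simplified sketch: a linear phase blend rather than a proper circular
    (angular) average; taking the real part of the inverse FFT handles the
    resulting loss of Hermitian symmetry approximately.
    """
    rng = np.random.default_rng(rng)
    f = np.fft.fft2(np.asarray(image, dtype=float))
    amplitude, phase = np.abs(f), np.angle(f)
    noise = rng.uniform(-np.pi, np.pi, size=phase.shape)
    mixed_phase = signal * phase + (1.0 - signal) * noise
    degraded = np.fft.ifft2(amplitude * np.exp(1j * mixed_phase))
    return np.real(degraded)
```

With `signal=1.0` the original image is returned (up to numerical error); lower values progressively randomize the phase while leaving the amplitude spectrum untouched.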
Affiliation(s)
- Anne-Raphaëlle Richoz
  - Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Lisa Stacchi
  - Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Pauline Schaller
  - Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Junpeng Lao
  - Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Michael Papinutto
  - Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Valentina Ticcinelli
  - Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Roberto Caldara
  - Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Fribourg, Switzerland
4
Leung TS, Zeng G, Maylott SE, Martinez SN, Jakobsen KV, Simpson EA. Infection detection in faces: Children's development of pathogen avoidance. Child Dev 2024; 95:e35-e46. PMID: 37589080. DOI: 10.1111/cdev.13983.
Abstract
This study examined the development of children's avoidance and recognition of sickness using face photos from people with natural, acute, contagious illness. In a U.S. sample of fifty-seven 4- to 5-year-olds (46% male, 70% White), fifty-two 8- to 9-year-olds (26% male, 62% White), and 51 adults (59% male, 61% White), children and adults avoided and recognized sick faces (ds ranged from 0.38 to 2.26). Both avoidance and recognition improved with age. Interestingly, 4- to 5-year-olds' avoidance of sick faces positively correlated with their recognition, suggesting stable individual differences in these emerging skills. Together, these findings are consistent with a hypothesized immature but functioning and flexible behavioral immune system emerging early in development. Characterizing children's sickness perception may help design interventions to improve health.
Affiliation(s)
- Tiffany S Leung
  - Department of Psychology, University of Miami, Coral Gables, Florida, USA
- Guangyu Zeng
  - Department of Psychology, University of Miami, Coral Gables, Florida, USA
  - Division of Applied Psychology, The Chinese University of Hong Kong, Shenzhen, China
- Sarah E Maylott
  - Department of Psychiatry & Behavioral Sciences, Duke University, Durham, North Carolina, USA
5
Dal Ben R. SHINE_color: Controlling low-level properties of colorful images. MethodsX 2023; 11:102377. PMID: 37771500. PMCID: PMC10522894. DOI: 10.1016/j.mex.2023.102377.
Abstract
Visual perception combines top-down processes arising from participants' individual histories, such as expectations and goals, with bottom-up processes arising from the properties of visual stimuli, such as luminance and contrast. Precise control of low-level stimulus properties is essential when investigating visual perception: without it, investigations of bottom-up processes are virtually impossible, and investigations of top-down processes risk major confounds when testing and formulating hypotheses. The SHINE (spectrum, histogram, and intensity normalization and equalization) toolbox for MATLAB [1] allows precise control of images' Fourier amplitude spectra, normalization and scaling of luminance and contrast, and exact histogram specification optimized for perceptual visual quality. Following the recommendations of Willenbockel and colleagues (2010), we present an adaptation of the SHINE toolbox, named SHINE_color, which extends SHINE by allowing parametric manipulation of the low-level properties of both static and animated colorful images.
- Parametric manipulation of low-level properties of colorful images.
- Spectrum, histogram, and intensity normalization and equalization.
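SHINE_color itself is a MATLAB toolbox; as a minimal Python analogue of the kind of control it provides, the sketch below matches mean luminance and RMS contrast across a set of grayscale images. The function name and exact scheme are ours, not the toolbox's API, and the sketch assumes non-constant images (nonzero standard deviation).

```python
import numpy as np

def match_luminance(images, target_mean=None, target_std=None):
    """Rescale each image so all share the same mean luminance and contrast.

    Each image is z-scored (zero mean, unit variance) and then rescaled to
    the target mean and standard deviation; by default, the targets are the
    averages across the input set.
    """
    images = [np.asarray(im, dtype=float) for im in images]
    if target_mean is None:
        target_mean = np.mean([im.mean() for im in images])
    if target_std is None:
        target_std = np.mean([im.std() for im in images])
    matched = []
    for im in images:
        z = (im - im.mean()) / im.std()  # assumes im.std() > 0
        matched.append(z * target_std + target_mean)
    return matched
```

After matching, every image in the set has identical mean luminance and RMS contrast, so group differences in these low-level properties cannot drive behavioral effects.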
Affiliation(s)
- Rodrigo Dal Ben
  - Ambrose University, 150 Ambrose Circle SW, Calgary, AB T3H 0L5, Canada
6
Su L, Lin Z, Li Y, Wang X, Lin Z, Dong L, Wei L. Own-Age Effects in a Face-Emotion Recognition Intervention for Children With ASD: Evidence From Eye Movements. Psychol Res Behav Manag 2023; 16:4479-4490. PMID: 37942440. PMCID: PMC10629357. DOI: 10.2147/prbm.s427006.
Abstract
Background The own-age effect is the phenomenon whereby individuals perceive and recognize faces of their own age better than faces of other ages. Previous eye-movement studies have reported that children with autism spectrum disorder (ASD) show an attentional bias toward own-age faces and own-age scenes. Methods The present study used own-age faces as the intervention material and examined the application of the own-age effect to facial emotion recognition in ASD. The intervention lasted 12 weeks, with 2 sessions per week. Results After the intervention, the own-age-face group gazed at children's faces significantly more often, gazed at children's angry faces significantly longer, and gazed at adults' happy faces significantly longer and more often than before the intervention; the other-age-face group did not differ significantly from preintervention levels in gazing at children's and adults' faces. Conclusion The results suggest that own-age faces as teaching materials promote the emotion recognition ability of children with ASD better than other-age faces.
Affiliation(s)
- Linfei Su
  - Department of Psychology, School of Health, Fujian Medical University, Fuzhou, People’s Republic of China
  - Institute of Psychology, School of Education Science, Huazhong University of Science and Technology, Wuhan, People’s Republic of China
- Zehui Lin
  - Department of Psychology, School of Health, Fujian Medical University, Fuzhou, People’s Republic of China
- Youyuan Li
  - Department of Psychology, School of Health, Fujian Medical University, Fuzhou, People’s Republic of China
- Xiaoyan Wang
  - Fuzhou Xingyu School, Fuzhou, People’s Republic of China
- Zengping Lin
  - Fuzhou Xingyu School, Fuzhou, People’s Republic of China
- Lanjuan Dong
  - Fuzhou Xingyu School, Fuzhou, People’s Republic of China
- Ling Wei
  - Department of Psychology, School of Health, Fujian Medical University, Fuzhou, People’s Republic of China
7
Van der Donck S, Hendriks M, Vos S, Op de Beeck H, Boets B. Neural sensitivity to facial identity and facial expression discrimination in adults with autism. Autism Res 2023; 16:2110-2124. PMID: 37823568. DOI: 10.1002/aur.3036.
Abstract
The fluent processing of faces can be challenging for autistic individuals. Here, we assessed neural sensitivity to rapid changes in subtle facial cues in 23 autistic men and 23 age- and IQ-matched non-autistic (NA) controls using frequency-tagging electroencephalography (EEG). In oddball paradigms examining the automatic and implicit discrimination of facial identity and facial expression, base-rate images were presented at 6 Hz, with every fifth image periodically replaced by an oddball image (i.e., a 1.2 Hz oddball frequency). These distinct frequency tags for base-rate and oddball stimuli allowed direct and objective quantification of the neural discrimination responses. We found no substantial differences in neural sensitivity between the groups, neither for facial identity discrimination nor for facial expression discrimination. Both groups also showed a clear face-inversion effect, with reduced brain responses for inverted versus upright faces. Furthermore, sad faces generally elicited significantly lower neural amplitudes than angry, fearful, and happy faces. The only minor group difference was the larger involvement of high-level right-hemisphere visual areas in NA men during facial expression processing. These findings are discussed from a developmental perspective, as they strikingly contrast with the robust face processing deficits observed in autistic children using identical EEG paradigms.
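A minimal sketch of the tagging arithmetic behind this paradigm (variable names and simulation parameters are ours): an impulse train of oddball onsets embedded in a 6 Hz stimulation stream concentrates its energy at 6/5 = 1.2 Hz and its harmonics, which is what makes the oddball discrimination response separable from the base-rate response in the EEG spectrum.

```python
import numpy as np

# Frequency-tagging oddball design: images at a 6 Hz base rate, every
# 5th image an oddball, so oddball-specific responses recur at 1.2 Hz.
base_rate_hz = 6.0
oddball_every = 5
oddball_rate_hz = base_rate_hz / oddball_every  # 6 / 5 = 1.2 Hz

# Simulate 60 s of stimulation as an impulse train of oddball onsets
# sampled at 600 Hz, then inspect its amplitude spectrum.
fs, duration_s = 600, 60
n_images = int(base_rate_hz * duration_s)
onsets = (np.arange(n_images) * fs / base_rate_hz).astype(int)
oddball_signal = np.zeros(fs * duration_s)
oddball_signal[onsets[oddball_every - 1::oddball_every]] = 1.0

spectrum = np.abs(np.fft.rfft(oddball_signal))
freqs = np.fft.rfftfreq(oddball_signal.size, d=1 / fs)

def power_at(f_hz):
    """Spectral magnitude at the bin closest to f_hz."""
    return spectrum[np.argmin(np.abs(freqs - f_hz))]

# Energy appears at 1.2 Hz (and harmonics) but not at, e.g., 1.0 Hz.
```

Because the oddball train is strictly periodic, its spectrum is a comb at multiples of 1.2 Hz; any response at those frequencies (excluding multiples of 6 Hz) can therefore be attributed to discrimination of the oddball stimuli.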
Affiliation(s)
- Stephanie Van der Donck
  - Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
  - Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
- Michelle Hendriks
  - Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
  - Research Unit Brain and Cognition, Leuven Brain Institute, KU Leuven, Leuven, Belgium
- Silke Vos
  - Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
  - Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
- Hans Op de Beeck
  - Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
  - Research Unit Brain and Cognition, Leuven Brain Institute, KU Leuven, Leuven, Belgium
- Bart Boets
  - Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
  - Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
8
Thomas M, Whittle S, Tian YE, van Rheenen TE, Zalesky A, Cropley VL. Pathways from threat exposure to psychotic symptoms in youth: The role of emotion recognition bias and brain structure. Schizophr Res 2023; 261:304-313. PMID: 37898031. DOI: 10.1016/j.schres.2023.10.007.
Abstract
BACKGROUND Research supports an association between threatening experiences in childhood and psychosis. Early threat exposure may disrupt the development of emotion recognition (specifically, producing a bias toward threat-related facial expressions) and of the brain structures subserving it, contributing to the development of psychosis. METHODS Using data from the Philadelphia Neurodevelopmental Cohort, we examined associations between threat exposure and both the misattribution of facial expressions to fear/anger in an emotion recognition task and gray matter volumes in key emotion processing regions. Our sample comprised youth with psychosis spectrum symptoms (N = 304), control youth (N = 787), and, to evaluate specificity, youth with internalizing symptoms (N = 92). The moderating effects of group and sex were examined. RESULTS Both the psychosis spectrum and internalizing groups had higher levels of threat exposure than controls. In the total sample, threat exposure was associated with lower left medial prefrontal cortex (mPFC) volume but not with misattributions to fear/anger. The effects of threat exposure did not differ significantly by group or sex. CONCLUSIONS These findings provide evidence for an effect of threat exposure on mPFC morphology but do not support an association between threat exposure and a recognition bias for threat-related expressions that is particularly pronounced in psychosis. Future research should investigate factors linking transdiagnostic alterations related to threat exposure with psychotic symptoms and attempt to clarify the mechanisms underpinning emotion recognition misattributions in threat-exposed youth.
Affiliation(s)
- Megan Thomas
  - Melbourne Neuropsychiatry Centre, Department of Psychiatry, The University of Melbourne & Melbourne Health, Melbourne, Australia
- Sarah Whittle
  - Melbourne Neuropsychiatry Centre, Department of Psychiatry, The University of Melbourne & Melbourne Health, Melbourne, Australia
- Ye E Tian
  - Melbourne Neuropsychiatry Centre, Department of Psychiatry, The University of Melbourne & Melbourne Health, Melbourne, Australia
- Tamsyn E van Rheenen
  - Melbourne Neuropsychiatry Centre, Department of Psychiatry, The University of Melbourne & Melbourne Health, Melbourne, Australia
  - Centre for Mental Health, School of Health Sciences, Swinburne University, Melbourne, Australia
- Andrew Zalesky
  - Melbourne Neuropsychiatry Centre, Department of Psychiatry, The University of Melbourne & Melbourne Health, Melbourne, Australia
- Vanessa L Cropley
  - Melbourne Neuropsychiatry Centre, Department of Psychiatry, The University of Melbourne & Melbourne Health, Melbourne, Australia
9
Olaya-Galindo MD, Vargas-Cifuentes OA, Vélez Van-Meerbeke A, Talero-Gutiérrez C. Establishing the Relationship Between Attention Deficit Hyperactivity Disorder and Emotional Facial Expression Recognition Deficit: A Systematic Review. J Atten Disord 2023; 27:1181-1195. PMID: 36843351. PMCID: PMC10466982. DOI: 10.1177/10870547231154901.
Abstract
OBJECTIVE In this review, we examined whether there is a deficit in facial emotion recognition (FER) in children, adolescents, and adults with attention deficit hyperactivity disorder (ADHD). BACKGROUND Emotional regulation is impaired in ADHD. Although a facial emotion recognition deficit has been described in this condition, the underlying causal mechanisms remain unclear. METHODS The search was performed in six databases in September 2022. Studies that assessed children, adolescents, or adults with isolated or comorbid ADHD and evaluated participants using a FER task were included. RESULTS Twelve of 385 studies were selected, with participants ranging in age from 6 to 37.1 years. A deficit in FER, either specific to ADHD or secondary to comorbid autism spectrum disorder, anxiety, and oppositional symptoms, was found. CONCLUSIONS There is a FER deficit in patients with ADHD. Adults showed improved recognition accuracy, reflecting partial compensation. ADHD symptoms and comorbidities appear to influence FER deficits.
Affiliation(s)
- Maria Daniela Olaya-Galindo
  - Neuroscience research group (NeURos), NeuroVitae Center for Neuroscience, School of Medicine and Health Sciences, Universidad del Rosario, Bogotá, Colombia
- Oscar Alberto Vargas-Cifuentes
  - Neuroscience research group (NeURos), NeuroVitae Center for Neuroscience, School of Medicine and Health Sciences, Universidad del Rosario, Bogotá, Colombia
- Alberto Vélez Van-Meerbeke
  - Neuroscience research group (NeURos), NeuroVitae Center for Neuroscience, School of Medicine and Health Sciences, Universidad del Rosario, Bogotá, Colombia
- Claudia Talero-Gutiérrez
  - Neuroscience research group (NeURos), NeuroVitae Center for Neuroscience, School of Medicine and Health Sciences, Universidad del Rosario, Bogotá, Colombia
10
Fan Z, Liu Z, Yang J, Yang J, Sun F, Tang S, Wu G, Guo S, Ouyang X, Tao H. Hypoactive Visual Cortex, Prefrontal Cortex and Insula during Self-Face Recognition in Adults with First-Episode Major Depressive Disorder. Biomedicines 2023; 11:2200. PMID: 37626697. PMCID: PMC10452386. DOI: 10.3390/biomedicines11082200.
Abstract
Self-face recognition is a vital aspect of self-referential processing, which is closely related to affective states. However, neuroimaging research on self-face recognition in adults with major depressive disorder is lacking. This study investigated alterations in brain activation during self-face recognition in adults with first-episode major depressive disorder (FEMDD) using functional magnetic resonance imaging (fMRI). FEMDD (n = 59) and healthy controls (HC, n = 36) performed a self-face recognition task during the fMRI scan. Differences in brain activation between the two groups were analyzed, and Pearson correlation analysis was used to evaluate the relationship between activation showing significant group differences and the severity of depressive symptoms and negative self-evaluation. FEMDD showed significantly decreased activation in the bilateral occipital cortex, bilateral fusiform gyrus, right inferior frontal gyrus, and right insula during the task compared with HC. No significant correlation was detected between this activation and the severity of depression or negative self-evaluation in FEMDD or HC. The results suggest involvement of a malfunctioning visual cortex, prefrontal cortex, and insula in the pathophysiology of self-face recognition in FEMDD, which may provide a novel therapeutic target for adults with FEMDD.
Affiliation(s)
- Zebin Fan
  - Department of Psychiatry, National Clinical Research Center for Mental Disorders, and National Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha 410011, China
- Zhening Liu
  - Department of Psychiatry, National Clinical Research Center for Mental Disorders, and National Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha 410011, China
- Jie Yang
  - Department of Psychiatry, National Clinical Research Center for Mental Disorders, and National Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha 410011, China
- Jun Yang
  - Department of Psychiatry, National Clinical Research Center for Mental Disorders, and National Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha 410011, China
- Fuping Sun
  - Department of Psychiatry, National Clinical Research Center for Mental Disorders, and National Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha 410011, China
- Shixiong Tang
  - Department of Radiology, The Second Xiangya Hospital of Central South University, Changsha 410011, China
- Guowei Wu
  - Department of Psychiatry, National Clinical Research Center for Mental Disorders, and National Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha 410011, China
- Shuixia Guo
  - Key Laboratory of Computing and Stochastic Mathematics (Ministry of Education), School of Mathematics and Statistics, Hunan Normal University, Changsha 410006, China
  - Key Laboratory of Applied Statistics and Data Science, College of Hunan Province, Hunan Normal University, Changsha 410006, China
- Xuan Ouyang
  - Department of Psychiatry, National Clinical Research Center for Mental Disorders, and National Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha 410011, China
- Haojuan Tao
  - Department of Psychiatry, National Clinical Research Center for Mental Disorders, and National Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha 410011, China
11
Della Longa L, Carnevali L, Farroni T. The role of affective touch in modulating emotion processing among preschool children. J Exp Child Psychol 2023; 235:105726. PMID: 37336064. DOI: 10.1016/j.jecp.2023.105726.
Abstract
Recognizing emotional expressions is a prerequisite for understanding others' feelings and intentions, a key component of social interactions that develops throughout childhood. In multisensory social environments, touch may be crucial for emotion processing, linking external sensory information with internal affective states. The current study investigated whether affective touch facilitates the recognition of emotional expressions throughout childhood. Preschool children (N = 121; 3- to 6-year-olds) were presented with different tactile stimulations followed by an emotion-matching task. Results revealed that affective touch fosters the recognition of negative emotions and speeds the association of positive emotions, highlighting the centrality of tactile experiences for socioemotional understanding. This research opens new perspectives on how to support emotion recognition, with potential consequences for the development of social functioning.
Affiliation(s)
- Letizia Della Longa: Department of Developmental Psychology and Socialization, University of Padova, 35131 Padova, Italy
- Laura Carnevali: Department of Developmental Psychology and Socialization, University of Padova, 35131 Padova, Italy
- Teresa Farroni: Department of Developmental Psychology and Socialization, University of Padova, 35131 Padova, Italy
12
Preißler L, Keck J, Krüger B, Munzert J, Schwarzer G. Recognition of emotional body language from dyadic and monadic point-light displays in 5-year-old children and adults. J Exp Child Psychol 2023; 235:105713. PMID: 37331307. DOI: 10.1016/j.jecp.2023.105713.
Abstract
Most child studies on emotion perception used faces and speech as emotion stimuli, but little is known about children's perception of emotions conveyed by body movements, that is, emotional body language (EBL). This study aimed to investigate whether processing advantages for positive emotions in children and negative emotions in adults found in studies on emotional face and term perception also occur in EBL perception. We also aimed to uncover which specific movement features of EBL contribute to emotion perception from interactive dyads compared with noninteractive monads in children and adults. We asked 5-year-old children and adults to categorize happy and angry point-light displays (PLDs), presented as pairs (dyads) and single actors (monads), in a button-press task. By applying representational similarity analyses, we determined intra- and interpersonal movement features of the PLDs and their relation to the participants' emotional categorizations. Results showed significantly higher recognition of happy PLDs in 5-year-olds and of angry PLDs in adults in monads but not in dyads. In both age groups, emotion recognition depended significantly on kinematic and postural movement features such as limb contraction and vertical movement in monads and dyads, whereas in dyads recognition also relied on interpersonal proximity measures such as interpersonal distance. Thus, EBL processing in monads seems to undergo a similar developmental shift from a positivity bias to a negativity bias, as was previously found for emotional faces and terms. Despite these age-specific processing biases, children and adults seem to use similar movement features in EBL processing.
Affiliation(s)
- Lucie Preißler: Department of Developmental Psychology, Justus Liebig University Giessen, 35394 Gießen, Germany
- Johannes Keck: Neuromotor Behavior Lab, Department of Sport Science, Justus Liebig University Giessen, 35394 Gießen, Germany
- Britta Krüger: Neuromotor Behavior Lab, Department of Sport Science, Justus Liebig University Giessen, 35394 Gießen, Germany
- Jörn Munzert: Neuromotor Behavior Lab, Department of Sport Science, Justus Liebig University Giessen, 35394 Gießen, Germany
- Gudrun Schwarzer: Department of Developmental Psychology, Justus Liebig University Giessen, 35394 Gießen, Germany
13
Wardle SG, Ewing L, Malcolm GL, Paranjape S, Baker CI. Children perceive illusory faces in objects as male more often than female. Cognition 2023; 235:105398. PMID: 36791506. PMCID: PMC10085858. DOI: 10.1016/j.cognition.2023.105398.
Abstract
Face pareidolia is the experience of seeing illusory faces in inanimate objects. While children experience face pareidolia, it is unknown whether they perceive gender in illusory faces, as their face evaluation system is still developing in the first decade of life. In a sample of 412 children and adults from 4 to 80 years of age, we found that, like adults, children perceived many illusory faces in objects to have a gender and had a strong bias to see them as male rather than female, regardless of their own gender identification. These results provide evidence that the male bias for face pareidolia emerges early in life, even before the ability to discriminate gender from facial cues alone is fully developed. Further, the existence of a male bias in children suggests that any social context that elicits the cognitive bias to see faces as male has remained relatively consistent across generations.
Affiliation(s)
- Susan G Wardle: Laboratory of Brain and Cognition, National Institutes of Health, Bethesda, MD, USA
- Louise Ewing: School of Psychology, University of East Anglia, UK
- Sanika Paranjape: Laboratory of Brain and Cognition, National Institutes of Health, Bethesda, MD, USA; Department of Psychological and Brain Sciences, George Washington University, Washington, DC, USA
- Chris I Baker: Laboratory of Brain and Cognition, National Institutes of Health, Bethesda, MD, USA
14
Dela Cruz KL, Kelsey CM, Tong X, Grossmann T. Infant and maternal responses to emotional facial expressions: A longitudinal study. Infant Behav Dev 2023; 71:101818. PMID: 36739815. PMCID: PMC10257770. DOI: 10.1016/j.infbeh.2023.101818.
Abstract
The current longitudinal study (N = 107) examined mothers' facial emotion recognition using reaction time and their infants' affect-based attention at 5, 7, and 14 months of age using eye-tracking. Our results, examining maternal and infant responses to angry, fearful and happy facial expressions, show that only maternal responses to angry facial expressions were robustly and positively linked across time points, indexing a consistent trait-like response to social threat among mothers. However, neither maternal responses to happy or fearful facial expressions nor infant responses to all three facial emotions show such consistency, pointing to the changeable nature of facial emotion processing, especially among infants. In general, infants' attention toward negative emotions (i.e., anger and fear) at earlier timepoints was linked to their affect-biased attention for these emotions at 14 months but showed greater dynamic change across time. Moreover, our results provide limited evidence for developmental continuity in processing negative emotions and for the bidirectional interplay of infant affect-biased attention and maternal facial emotion recognition. This pattern of findings suggests that infants' affect-biased attention to facial expressions of emotion is characterized by dynamic changes.
Affiliation(s)
- Kenn L Dela Cruz: Department of Psychology, University of Virginia, Charlottesville, VA, USA
- Caroline M Kelsey: Department of Pediatrics, Division of Developmental Medicine, Boston Children's Hospital, Boston, MA, USA; Department of Pediatrics, Harvard Medical School, Boston, MA, USA
- Xin Tong: Department of Psychology, University of Virginia, Charlottesville, VA, USA
- Tobias Grossmann: Department of Psychology, University of Virginia, Charlottesville, VA, USA
15
Rodger H, Sokhn N, Lao J, Liu Y, Caldara R. Developmental eye movement strategies for decoding facial expressions of emotion. J Exp Child Psychol 2023; 229:105622. PMID: 36641829. DOI: 10.1016/j.jecp.2022.105622.
Abstract
In our daily lives, we routinely look at the faces of others to try to understand how they are feeling. Few studies have examined the perceptual strategies that are used to recognize facial expressions of emotion, and none have attempted to isolate visual information use with eye movements throughout development. Therefore, we recorded the eye movements of children from 5 years of age up to adulthood during recognition of the six "basic emotions" to investigate when perceptual strategies for emotion recognition become mature (i.e., most adult-like). Using iMap4, we identified the eye movement fixation patterns for recognition of the six emotions across age groups in natural viewing and gaze-contingent (i.e., expanding spotlight) conditions. While univariate analyses failed to reveal significant differences in fixation patterns, more sensitive multivariate distance analyses revealed a U-shaped developmental trajectory with the eye movement strategies of the 17- to 18-year-old group most similar to adults for all expressions. A developmental dip in strategy similarity was found for each emotional expression revealing which age group had the most distinct eye movement strategy from the adult group: the 13- to 14-year-olds for sadness recognition; the 11- to 12-year-olds for fear, anger, surprise, and disgust; and the 7- to 8-year-olds for happiness. Recognition performance for happy, angry, and sad expressions did not differ significantly across age groups, but the eye movement strategies for these expressions diverged for each group. Therefore, a unique strategy was not a prerequisite for optimal recognition performance for these expressions. Our data provide novel insights into the developmental trajectories underlying facial expression recognition, a critical ability for adaptive social relations.
Affiliation(s)
- Helen Rodger: Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, 1700 Fribourg, Switzerland
- Nayla Sokhn: Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, 1700 Fribourg, Switzerland
- Junpeng Lao: Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, 1700 Fribourg, Switzerland
- Yingdi Liu: Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, 1700 Fribourg, Switzerland
- Roberto Caldara: Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, 1700 Fribourg, Switzerland
16
Martini M, Marzola E, Musso M, Brustolin A, Abbate-Daga G. Association of emotion recognition ability and interpersonal emotional competence in anorexia nervosa: A study with a multimodal dynamic task. Int J Eat Disord 2023; 56:407-417. PMID: 36373846. DOI: 10.1002/eat.23854.
Abstract
OBJECTIVE Interpersonal difficulties are evidenced in Anorexia Nervosa (AN) and are thought to contribute to disease onset and maintenance; however, research in the framework of emotional competence is currently limited. Previous studies have often only used static images for emotion recognition tasks, and evidence is lacking on the relationships between performance-based emotional abilities and self-reported intra- and interpersonal emotional traits. This study aimed to test multimodal dynamic emotion recognition ability in AN and analyze its correlation with the psychometric scores of self- and other-related emotional competence. METHOD A total of 268 participants (128 individuals with AN and 140 healthy controls) completed the Geneva Emotion Recognition Test, the Profile of Emotional Competence, the Reading the Mind in the Eyes Test, and measures of general and eating psychopathology. Scores were compared between the two groups. Linear mixed effects models were utilized to examine the relationship between emotion recognition ability and self-reported measures and clinical variables. RESULTS Individuals with AN showed significantly poorer recognition of emotions of both negative and positive valence and significantly lower scores in all emotional competence dimensions. Besides emotion type and group, linear mixed models evidenced significant effects of interpersonal comprehension on emotion recognition ability. DISCUSSION Individuals with AN show impairment in multimodal emotion recognition and report their difficulties accordingly. Notably, among all emotional competence dimensions, interpersonal comprehension emerges as a significant correlate of emotion recognition in others, and could represent a specific area of intervention in the treatment of individuals with AN.
PUBLIC SIGNIFICANCE In this study, we evidence that the ability to recognize the emotions displayed by others is related to the level of interpersonal emotional competence reported by individuals with anorexia nervosa. This result helps in understanding the social impairments in people with anorexia nervosa and could contribute to advancements in the application of the training of emotional competence in the treatment of this disorder.
Affiliation(s)
- Matteo Martini: Eating Disorders Center, Department of Neuroscience "Rita Levi Montalcini", University of Turin, Turin, Italy
- Enrica Marzola: Eating Disorders Center, Department of Neuroscience "Rita Levi Montalcini", University of Turin, Turin, Italy
- Maria Musso: Eating Disorders Center, Department of Neuroscience "Rita Levi Montalcini", University of Turin, Turin, Italy
- Annalisa Brustolin: Eating Disorders Center, Department of Neuroscience "Rita Levi Montalcini", University of Turin, Turin, Italy
- Giovanni Abbate-Daga: Eating Disorders Center, Department of Neuroscience "Rita Levi Montalcini", University of Turin, Turin, Italy
17
Cuzzocrea F, Gugliandolo MC, Cannavò M, Liga F. Emotion recognition in individuals wearing facemasks: a preliminary analysis of age-related differences. Curr Psychol 2023; 42:1-4. PMID: 36684462. PMCID: PMC9843093. DOI: 10.1007/s12144-023-04239-3.
Abstract
COVID-19 is severely affecting individuals' lives worldwide. Previous research warned that facial occlusion may impair facial emotion recognition, whilst other findings suggested that age-related differences may be relevant in recognizing emotions in others' faces. However, studies observing individuals' ability to interpret others' facial mimicry are heterogeneous, thus precluding the generalizability of the findings. This preliminary study examined age-related differences and the influence of different covering types (with and without face masks) in determining different levels of facial emotion recognition. A total of 131 participants were split into three age groups (10-14, 15-17, and 20-25 years) and asked to complete an emotion recognition task. Participants were better able to recognize facial emotions without any occlusion, and happiness was the most recognizable emotion. Moreover, the adolescent group performed better in recognizing anger and fear in stimuli depicting masked and unmasked faces. Current results suggest the importance of monitoring emotion recognition abilities in developing individuals during the COVID-19 pandemic.
Affiliation(s)
- Francesca Cuzzocrea: Dipartimento di Scienze della Salute, Università degli Studi “Magna Graecia” di Catanzaro, Catanzaro, Italy
- Marco Cannavò: Dipartimento di Scienze della Salute, Università degli Studi “Magna Graecia” di Catanzaro, Catanzaro, Italy
- Francesca Liga: Dipartimento di Medicina Clinica e Sperimentale, Università degli Studi di Messina, Messina, Italy
18
Li C, Wen C, Qiu Y. A Video Sequence Face Expression Recognition Method Based on Squeeze-and-Excitation and 3DPCA Network. Sensors (Basel) 2023; 23:823. PMID: 36679620. PMCID: PMC9861482. DOI: 10.3390/s23020823.
Abstract
Expression recognition is an important direction for computer understanding of human emotions and for human-computer interaction. However, for 3D data such as video sequences, traditional convolutional neural networks stretch the input into vectors, which not only leads to a dimensional explosion but also fails to retain structural information in 3D space, simultaneously increasing computational cost and lowering expression recognition accuracy. This paper proposes a video sequence face expression recognition method based on a Squeeze-and-Excitation and 3DPCA Network (SE-3DPCANet). The introduction of a 3DPCA algorithm in the convolution layer directly constructs tensor convolution kernels to extract the dynamic expression features of video sequences from the spatial and temporal dimensions, without weighting the convolution kernels of adjacent frames by shared weights. A Squeeze-and-Excitation network is introduced in the feature encoding layer to automatically learn the weights of local channel features in the tensor features, thus increasing the representation capability of the model and further improving recognition accuracy. The proposed method is validated on three video face expression datasets. Comparisons were made with other common expression recognition methods, achieving higher recognition rates while significantly reducing the time required for training.
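The SE-3DPCANet implementation itself is not reproduced here, but the channel gating the abstract describes (Squeeze-and-Excitation reweighting of feature channels) can be sketched in a few lines of NumPy. The shapes, the reduction ratio, and the function name below are illustrative assumptions, not the authors' code:

```python
import numpy as np

def se_gate(features, w_reduce, w_expand):
    """Illustrative Squeeze-and-Excitation channel gate (not the SE-3DPCANet code).

    features : (C, H, W) feature map from a convolutional layer.
    w_reduce : (C // r, C) bottleneck weights, where r is the reduction ratio.
    w_expand : (C, C // r) expansion weights.
    """
    # Squeeze: global average pooling collapses each channel to one scalar.
    z = features.mean(axis=(1, 2))                  # shape (C,)
    # Excitation: bottleneck MLP + sigmoid yields per-channel weights in (0, 1).
    s = np.maximum(w_reduce @ z, 0.0)               # ReLU, shape (C // r,)
    gates = 1.0 / (1.0 + np.exp(-(w_expand @ s)))   # sigmoid, shape (C,)
    # Scale: reweight each channel of the original feature map.
    return features * gates[:, None, None]
```

With all-zero weights every gate is sigmoid(0) = 0.5, so each channel passes at half strength; training would instead learn weights that amplify informative channels and suppress the rest.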
Affiliation(s)
- Chang Li: School of Automation, Guangdong University of Petrochemical Technology, Maoming 525000, China
- Chenglin Wen: School of Automation, Guangdong University of Petrochemical Technology, Maoming 525000, China
- Yiting Qiu: School of Automation, Hangzhou Dianzi University, Hangzhou 310018, China
19
Developmental trajectory of time perception from childhood to adolescence. Curr Psychol 2022. DOI: 10.1007/s12144-022-03526-9.
20
de Lissa P, Watanabe K, Gu L, Ishii T, Nakamura K, Kimura T, Sagasaki A, Caldara R. Race categorization in noise. Iperception 2022; 13:20416695221119530. PMID: 36061242. PMCID: PMC9437912. DOI: 10.1177/20416695221119530.
Abstract
People are typically faster to categorize the race of a face if it belongs to a race different from their own. This Other Race Categorization Advantage (ORCA) is thought to reflect an enhanced sensitivity to the visual race signals of other race faces, leading to faster response times. The current study investigated this sensitivity in a cross-cultural sample of Swiss and Japanese observers with a race categorization task using faces that had been parametrically degraded of visual structure, with normalized luminance and contrast. While Swiss observers exhibited an increasingly strong ORCA in both reaction time and accuracy as the face images were visually degraded up to 20% structural coherence, the Japanese observers manifested this pattern most distinctly when the faces were fully structurally-intact. Critically, for both observer groups, there was a clear accuracy effect at the 20% structural coherence level, indicating that the enhanced sensitivity to other race visual signals persists in significantly degraded stimuli. These results suggest that different cultural groups may rely on and extract distinct types of visual race signals during categorization, which may depend on the available visual information. Nevertheless, heavily degraded stimuli specifically favor the perception of other race faces, indicating that the visual system is tuned by experience and is sensitive to the detection of unfamiliar signals.
Affiliation(s)
- Li Gu: School of Innovation Design, Guangzhou Academy of Fine Arts, Guangzhou, China
- Tatsunori Ishii: Japan Women's University, Tokyo, Japan; Waseda University, Tokyo, Japan
- Koyo Nakamura: University of Vienna, Vienna, Austria; Japan Society for the Promotion of Science, Tokyo, Japan; Waseda University, Tokyo, Japan
21
Della Longa L, Nosarti C, Farroni T. Emotion Recognition in Preterm and Full-Term School-Age Children. Int J Environ Res Public Health 2022; 19:6507. PMID: 35682092. PMCID: PMC9180201. DOI: 10.3390/ijerph19116507.
Abstract
Children born preterm (<37 weeks’ gestation) show a specific vulnerability for socio-emotional difficulties, which may lead to an increased likelihood of developing behavioral and psychiatric problems in adolescence and adulthood. The accurate decoding of emotional signals from faces represents a fundamental prerequisite for early social interactions, allowing children to derive information about others’ feelings and intentions. The present study aims to explore possible differences between preterm and full-term children in the ability to detect emotional expressions, as well as possible relationships between this ability and socio-emotional skills and problem behaviors during everyday activities. We assessed 55 school-age children (n = 34 preterm and n = 21 full-term) with a cognitive battery that ensured comparable cognitive abilities between the two groups. Moreover, children were asked to identify emotional expressions from pictures of peers’ faces (Emotion Recognition Task). Finally, children’s emotional, social and behavioral outcomes were assessed with parent-reported questionnaires. The results revealed that preterm children were less accurate than full-term children in detecting positive emotional expressions and they showed poorer social and behavioral outcomes. Notably, correlational analyses showed a relationship between the ability to recognize emotional expressions and socio-emotional functioning. The present study highlights that early difficulties in decoding emotional signals from faces may be critically linked to emotional and behavioral regulation problems, with important implications for the development of social skills and effective interpersonal interactions.
Affiliation(s)
- Letizia Della Longa: Developmental Psychology and Socialization Department, University of Padova, 35131 Padova, Italy
- Chiara Nosarti: Department of Child and Adolescent Psychiatry, King’s College London, London SE5 8AF, UK
- Teresa Farroni: Developmental Psychology and Socialization Department, University of Padova, 35131 Padova, Italy
22
Sandre A, Morningstar M, Farrell-Reeves A, Dirks M, Weinberg A. Adolescents and young adults differ in their neural response to and recognition of adolescent and adult emotional faces. Psychophysiology 2022; 59:e14060. PMID: 35357699. DOI: 10.1111/psyp.14060.
Abstract
Peer relationships become increasingly important during adolescence. The success of these relationships may rely on the ability to attend to and decode subtle or ambiguous emotional expressions that are common in social interactions. However, most studies examining youths' processing and labeling of facial emotion have employed adult faces and faces that depict emotional extremes as stimuli. In this study, 40 adolescents and 40 young adults viewed blends of angry-neutral, fearful-neutral, and happy-neutral faces (e.g., 100% angry, 66% angry, 33% angry, neutral) portrayed by adolescent and adult actors as electroencephalogram (EEG) was recorded. Participants also labeled these faces according to the emotion expressed (i.e., angry, fearful, happy, or neutral). The late positive potential (LPP), an event-related potential (ERP) component that reflects sustained attention to motivationally salient information, was scored from the EEG following face presentation. Among adolescents, the LPP increased as peer-age faces moved from ambiguous (33%) to unambiguous (100%) emotional expression. These effects were not found when adolescents viewed emotional face blends portrayed by adult actors. Additionally, while both adolescents and young adults showed greater emotion labeling accuracy as faces increased in emotional intensity from ambiguous to unambiguous expression, adolescents did not show greater accuracy when labeling peer-age compared to adult-age faces. Together, these data suggest that adolescents attend more to subtle differences in peer-age emotional faces, but they do not label these emotional expressions more accurately than adults.
Affiliation(s)
- Aislinn Sandre: Department of Psychology, McGill University, Montreal, Quebec, Canada
- Melanie Dirks: Department of Psychology, McGill University, Montreal, Quebec, Canada
- Anna Weinberg: Department of Psychology, McGill University, Montreal, Quebec, Canada
23
Palix J, Abu-Akel A, Moulin V, Abbiati M, Gasser J, Hasler C, Marcot D, Mohr C, Dan-Glauser E. The Utility of Physiological Measures in Assessing the Empathic Skills of Incarcerated Violent Offenders. Int J Offender Ther Comp Criminol 2022; 66:98-122. PMID: 33567952. PMCID: PMC8609505. DOI: 10.1177/0306624x21994056.
Abstract
Since a lack of empathy is an important indicator of violent behaviors, researchers need consistent and valid measures of it. This study evaluated the practical significance of a potential physiological correlate of empathy compared to a traditional self-report questionnaire in 18 male violent offenders and 21 general population controls. Empathy skills were assessed with the Interpersonal Reactivity Index (IRI) questionnaire. Heart-rate variability (HRV) was assessed with an electrocardiogram. The RMSSD (root mean square of the successive beat-to-beat differences), an HRV index implicated in social cognition, was calculated. There were no group differences in IRI scores. However, RMSSD was lower in the offender group. Positive correlations between RMSSD and IRI subscales were found for controls only. We conclude that psychometric measures of empathy do not discriminate incarcerated violent offenders from controls, and that the incorporation of psychophysiological measures, such as HRV, could be an avenue for forensic research on empathy to establish translatable evidence-based information.
Affiliation(s)
- Julie Palix: Centre Hospitalier Universitaire Vaudois, Lausanne, Switzerland
- Ahmad Abu-Akel: Institute of Psychology, University of Lausanne, Switzerland
- Valérie Moulin: Centre Hospitalier Universitaire Vaudois, Lausanne, Switzerland
- Milena Abbiati: Centre Hospitalier Universitaire Vaudois, Lausanne, Switzerland
- Jacques Gasser: Centre Hospitalier Universitaire Vaudois, Lausanne, Switzerland
- Christine Mohr: Institute of Psychology, University of Lausanne, Switzerland
- Elise Dan-Glauser: Centre Hospitalier Universitaire Vaudois, Lausanne, Switzerland; Institute of Psychology, University of Lausanne, Switzerland
24
Galarneau E, Colasante T, Speidel R, Malti T. Correlates of children's sympathy: Recognition and regulation of sadness and anger. Soc Dev 2021. DOI: 10.1111/sode.12577.
Affiliation(s)
- Emma Galarneau: Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Centre for Child Development, Mental Health, and Policy, University of Toronto Mississauga, Ontario, Canada
- Tyler Colasante: Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Centre for Child Development, Mental Health, and Policy, University of Toronto Mississauga, Ontario, Canada
- Ruth Speidel: Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Centre for Child Development, Mental Health, and Policy, University of Toronto Mississauga, Ontario, Canada
- Tina Malti: Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Centre for Child Development, Mental Health, and Policy, University of Toronto Mississauga, Ontario, Canada
25
Barisnikov K, Thomasson M, Stutzmann J, Lejeune F. Sensitivity to Emotion Intensity and Recognition of Emotion Expression in Neurotypical Children. Children (Basel) 2021; 8:1108. PMID: 34943304. PMCID: PMC8700579. DOI: 10.3390/children8121108.
Abstract
This study assessed two components of face emotion processing, emotion recognition and sensitivity to the intensity of emotion expressions, and their relation in children aged 4 to 12 (N = 216). Results indicated a slower development in the accurate decoding of low-intensity expressions compared to high-intensity ones. Between ages 4 and 12, children discriminated high-intensity expressions better than low-intensity ones. The intensity of expression had a stronger impact on overall face expression recognition. High-intensity happiness was better recognized than low-intensity happiness up to age 11, while children aged 4 to 12 had difficulties discriminating between high- and low-intensity sadness. Our results suggest that sensitivity to low-intensity expressions acts as a complementary mediator between age and emotion expression recognition, while this was not the case for the recognition of high-intensity expressions. These results could help in the development of specific interventions for populations presenting socio-cognitive and emotion difficulties.
26
Vesker M, Bahn D, Kauschke C, Schwarzer G. Developmental Changes in Gaze Behavior and the Effects of Auditory Emotion Word Priming in Emotional Face Categorization. Multisens Res 2021; 35:1-21. PMID: 34534967. DOI: 10.1163/22134808-bja10063.
Abstract
Social interactions often require the simultaneous processing of emotions from facial expressions and speech. However, the development of the gaze behavior used for emotion recognition, and the effects of speech perception on the visual encoding of facial expressions, are less well understood. We therefore conducted a word-primed face categorization experiment, where participants from multiple age groups (six-year-olds, 12-year-olds, and adults) categorized target facial expressions as positive or negative after priming with valence-congruent or -incongruent auditory emotion words, or no words at all. We recorded our participants' gaze behavior during this task using an eye-tracker, and analyzed the data with respect to the fixation time toward the eyes and mouth regions of faces, as well as the time until participants made the first fixation within those regions (time to first fixation, TTFF). We found that the six-year-olds showed significantly higher accuracy in categorizing congruently primed faces compared to the other conditions. The six-year-olds also showed faster response times, shorter total fixation durations, and faster TTFF measures in all primed trials, regardless of congruency, as compared to unprimed trials. We also found that while adults looked first, and longer, at the eyes as compared to the mouth regions of target faces, children did not exhibit this gaze behavior. Our results thus indicate that young children are more sensitive than adults or older children to auditory emotion word primes during the perception of emotional faces, and that the distribution of gaze across the regions of the face changes significantly from childhood to adulthood.
Affiliation(s)
- Michael Vesker
- Department of Developmental Psychology, Justus-Liebig-Universität Gießen, 35394 Giessen, Germany
- Daniela Bahn
- Clinical Linguistics, Department of German Linguistics, Philipps-Universität Marburg, 35032 Marburg, Germany
- Christina Kauschke
- Clinical Linguistics, Department of German Linguistics, Philipps-Universität Marburg, 35032 Marburg, Germany
- Gudrun Schwarzer
- Department of Developmental Psychology, Justus-Liebig-Universität Gießen, 35394 Giessen, Germany
27
Rodger H, Lao J, Stoll C, Richoz AR, Pascalis O, Dye M, Caldara R. The recognition of facial expressions of emotion in deaf and hearing individuals. Heliyon 2021; 7:e07018. [PMID: 34041389] [PMCID: PMC8141778] [DOI: 10.1016/j.heliyon.2021.e07018] [Received: 02/03/2021] [Revised: 03/25/2021] [Accepted: 05/04/2021]
Abstract
During real-life interactions, facial expressions of emotion are perceived dynamically with multimodal sensory information. In the absence of auditory sensory channel inputs, it is unclear how facial expressions are recognized and internally represented by deaf individuals. Few studies have investigated facial expression recognition in deaf signers using dynamic stimuli, and none have included all six basic facial expressions of emotion (anger, disgust, fear, happiness, sadness, and surprise) with stimuli fully controlled for their low-level visual properties, leaving the question of whether or not a dynamic advantage for deaf observers exists unresolved. We hypothesised, in line with the enhancement hypothesis, that the absence of auditory sensory information might have forced the visual system to better process visual (unimodal) signals, and predicted that this greater sensitivity to visual stimuli would result in better recognition performance for dynamic compared to static stimuli, and for deaf signers compared to hearing non-signers in the dynamic condition. To this end, we performed a series of psychophysical studies with deaf signers with early-onset severe-to-profound deafness (dB loss >70) and hearing controls to estimate their ability to recognize the six basic facial expressions of emotion. Using static, dynamic, and shuffled (randomly permuted video frames of an expression) stimuli, we found that deaf observers showed similar categorization profiles and confusions across expressions compared to hearing controls (e.g., confusing surprise with fear). In contrast to our hypothesis, we found no recognition advantage for dynamic compared to static facial expressions for deaf observers. This observation shows that the decoding of dynamic facial expression emotional signals is not superior even in the deaf expert visual system, suggesting the existence of optimal signals in static facial expressions of emotion at the apex.
Deaf individuals match hearing individuals in the recognition of facial expressions of emotion.
Affiliation(s)
- Helen Rodger
- Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Junpeng Lao
- Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Chloé Stoll
- Laboratoire de Psychologie et de Neurocognition (CNRS-UMR5105), Université Grenoble-Alpes, France
- Olivier Pascalis
- Laboratoire de Psychologie et de Neurocognition (CNRS-UMR5105), Université Grenoble-Alpes, France
- Matthew Dye
- National Technical Institute for Deaf/Rochester Institute of Technology, Rochester, New York, USA
- Roberto Caldara
- Department of Psychology, University of Fribourg, Fribourg, Switzerland
28
Mayor Torres JM, Clarkson T, Hauschild KM, Luhmann CC, Lerner MD, Riccardi G. Facial Emotions Are Accurately Encoded in the Neural Signal of Those With Autism Spectrum Disorder: A Deep Learning Approach. Biological Psychiatry: Cognitive Neuroscience and Neuroimaging 2021; 7:688-695. [PMID: 33862256] [DOI: 10.1016/j.bpsc.2021.03.015] [Received: 03/03/2021] [Revised: 03/31/2021] [Accepted: 03/31/2021]
Abstract
BACKGROUND Individuals with autism spectrum disorder (ASD) exhibit frequent behavioral deficits in facial emotion recognition (FER). It remains unknown whether these deficits arise because facial emotion information is not encoded in their neural signal or because it is encoded but fails to translate to FER behavior (deployment). This distinction has functional implications, including constraining when differences in social information processing occur in ASD, and guiding interventions (i.e., developing prosthetic FER vs. reinforcing existing skills). METHODS We utilized a discriminative and contemporary machine learning approach, deep convolutional neural networks, to classify facial emotions viewed by individuals with and without ASD (N = 88) from concurrently recorded electroencephalography signals. RESULTS The convolutional neural network classified facial emotions with high accuracy for both ASD and non-ASD groups, even though individuals with ASD performed more poorly on the concurrent FER task. In fact, convolutional neural network accuracy was greater in the ASD group and was not related to behavioral performance. This pattern of results replicated across three independent participant samples. Moreover, feature importance analyses suggested that a late temporal window of neural activity (1000-1500 ms) may be uniquely important in facial emotion classification for individuals with ASD. CONCLUSIONS Our results reveal for the first time that facial emotion information is encoded in the neural signal of individuals with (and without) ASD. Thus, observed difficulties in behavioral FER associated with ASD likely arise from difficulties in decoding or deployment of facial emotion information within the neural signal. Interventions should focus on capitalizing on this intact encoding rather than promoting compensation or FER prostheses.
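The decoding logic described here (classifying the viewed stimulus class from concurrently recorded neural time series) can be caricatured with a far simpler model than the authors' deep CNN. The sketch below is offered purely as an assumption-laden illustration: it trains a tiny convolution-plus-logistic decoder on synthetic multichannel "EEG" epochs. The sampling rate, 10 Hz filter bank, and the late 1.0-1.5 s burst (loosely echoing the feature-importance window) are all invented; none of it is the authors' pipeline.

```python
# Hypothetical sketch (NOT the study's model): decode a binary stimulus class
# from synthetic multichannel epochs using a temporal-convolution front end,
# mean pooling of squared responses, and a logistic readout, in NumPy only.
import numpy as np

rng = np.random.default_rng(0)
fs = 100                                    # sampling rate in Hz (assumed)
n_trials, n_chan, n_times = 200, 8, 150     # 1.5 s epochs (assumed)
t = np.arange(n_times) / fs

# Synthetic data: class 1 carries a 10 Hz burst in the late 1.0-1.5 s window
y = rng.integers(0, 2, n_trials)
X = rng.normal(0, 1, (n_trials, n_chan, n_times))
burst = np.sin(2 * np.pi * 10 * t) * (t >= 1.0)
X[y == 1] += 0.8 * burst                    # added to every channel of class 1

# "Convolutional layer": a quadrature pair of 10 Hz kernels (one cycle each)
k = np.arange(int(0.1 * fs)) / fs
kernels = np.stack([np.sin(2 * np.pi * 10 * k), np.cos(2 * np.pi * 10 * k)])

def features(X):
    # one feature per (channel, kernel): mean pooled squared filter response
    out = []
    for kern in kernels:
        conv = np.apply_along_axis(lambda v: np.convolve(v, kern, "valid"), 2, X)
        out.append((conv ** 2).mean(axis=2))
    return np.concatenate(out, axis=1)      # shape (n_trials, n_chan * 2)

F = features(X)
F = (F - F.mean(0)) / F.std(0)              # standardize features

# Logistic readout trained by plain gradient descent
w, b = np.zeros(F.shape[1]), 0.0
for _ in range(500):
    z = np.clip(F @ w + b, -30, 30)
    grad = 1.0 / (1.0 + np.exp(-z)) - y
    w -= 0.1 * (F.T @ grad) / n_trials
    b -= 0.1 * grad.mean()

acc = ((F @ w + b > 0).astype(int) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

A real EEG decoder would of course use held-out data, many more trials, and learned (rather than hand-picked) filters; the point here is only the epochs-to-features-to-classifier shape of such a pipeline.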
Affiliation(s)
- Juan Manuel Mayor Torres
- Department of Information Engineering and Computer Science, University of Trento, Povo Trento, Italy
- Tessa Clarkson
- Department of Psychology, Temple University, Philadelphia, Pennsylvania
- Christian C Luhmann
- Department of Psychology, Stony Brook University, Stony Brook, New York
- Institute for Advanced Computational Science, Stony Brook University, Stony Brook, New York
- Matthew D Lerner
- Department of Psychology, Stony Brook University, Stony Brook, New York
- Department of Psychology, University of Virginia, Charlottesville, Virginia
- Giuseppe Riccardi
- Department of Information Engineering and Computer Science, University of Trento, Povo Trento, Italy
29
Patoilo MS, Berman ME, Coccaro EF. Emotion attribution in intermittent explosive disorder. Compr Psychiatry 2021; 106:152229. [PMID: 33662604] [DOI: 10.1016/j.comppsych.2021.152229] [Received: 10/02/2020] [Revised: 12/29/2020] [Accepted: 01/15/2021]
Abstract
BACKGROUND Accurate recognition of the emotions of others is an important part of healthy neurological development and promotes positive psychosocial adaptation. Differences in emotion recognition may be associated with the presence of emotional biases and can alter perception, thus influencing overall social cognition abilities. The present study aims to extend our collective understanding of emotion attribution abnormalities in individuals with intermittent explosive disorder (IED). METHODS Two hundred and forty-two adults participated, separated into three groups: those diagnosed with IED according to DSM-5 criteria, psychiatric controls (PC), and healthy controls (HC). Participants completed a modified version of the Emotional Attribution Task, in which they attributed an emotion to the main character of a short vignette. RESULTS Participants with IED correctly identified anger stories, and misattributed anger to non-anger stories, significantly more often than PC and HC participants. They were also significantly less likely than HC participants to correctly identify sad stories. LIMITATIONS We utilized self-report assessments in a community-recruited sample. Replication in a clinical sample is suggested. CONCLUSIONS Findings from this study support the validity of IED as a diagnostic entity and provide important information about how individuals with psychiatric disorders perceive and experience emotional cues.
Affiliation(s)
- Michaela S Patoilo
- Department of Psychology, Mississippi State, Starkville, MS, United States of America
- Mitchell E Berman
- Department of Psychology, Mississippi State, Starkville, MS, United States of America
- Emil F Coccaro
- Clinical Neuroscience and Psychotherapeutics Research Unit, Department of Psychiatry and Behavioral Health, The Ohio State University Wexner Medical Center, Columbus, OH, United States of America
30
Chronic early trauma impairs emotion recognition and executive functions in youth; specifying biobehavioral precursors of risk and resilience. Dev Psychopathol 2021; 34:1339-1352. [PMID: 33779536] [DOI: 10.1017/s0954579421000067]
Abstract
Exposure to chronic early trauma carries lasting effects on children's well-being and adaptation. Guided by models of resilience, we assessed the interplay of biological, emotional, cognitive, and relational factors in shaping two regulatory outcomes in trauma-exposed youth: emotion recognition (ER) and executive functions (EF). A unique war-exposed cohort was followed from early childhood to early adolescence. At preadolescence (11-13 years), ER and EF were assessed and respiratory sinus arrhythmia (RSA), a biomarker of parasympathetic regulation, was quantified. Mother-child dyadic reciprocity, the child's avoidance symptoms, and cortisol (CT) were measured in early childhood. Trauma-exposed youth displayed impaired ER and EF abilities. Conditional process analysis described two differential indirect paths leading from early trauma to regulatory outcomes. ER was mediated by avoidance symptoms in early childhood and modulated by cortisol, such that this path was evident only for preadolescents with high, but not low, CT. In comparison, EF was mediated by the degree of dyadic reciprocity experienced in early childhood and modulated by RSA, and was observed only among youth with lower RSA. Findings pinpoint trauma-related disruptions to key regulatory support systems in preadolescence as mediated by early-childhood relational, clinical, and physiological factors and highlight the need to specify biobehavioral precursors of resilience toward targeted early interventions.
31
Improving emotion recognition is associated with subsequent mental health and well-being in children with severe behavioural problems. Eur Child Adolesc Psychiatry 2021; 30:1769-1777. [PMID: 32997168] [PMCID: PMC8558267] [DOI: 10.1007/s00787-020-01652-y] [Received: 06/09/2020] [Accepted: 09/20/2020]
Abstract
Impaired emotion recognition is a transdiagnostic risk factor for a range of psychiatric disorders. It has been argued that improving emotion recognition may lead to improvements in behaviour and mental health, but supportive evidence is limited. We assessed emotion recognition and mental health following a brief and targeted computerised emotion recognition training in children referred into an intervention program because of severe family adversity and behavioural problems (n = 62; aged 7-10). While all children continued to receive their usual interventions, only children impaired in emotion recognition (n = 40) received the emotion training. Teachers blind to whether or not children had received the training rated children's mental health problems before and 6 months after the training. Participants who received the emotion training significantly improved their recognition of negative and neutral facial expressions. Although both groups showed improved behaviour at follow-up, the reduction in behavioural problems was only significant in children who received the emotion training. Post-training emotion recognition scores predicted mental health problems 6 months later independently of initial emotion recognition ability and severity of behavioural problems. The results are consistent with the view that targeting emotion recognition can improve longer term functioning in individuals with disruptive behaviour, although further research using fully randomised designs is needed before causal conclusions can be drawn with confidence.
32
Jelili S, Halayem S, Taamallah A, Ennaifer S, Rajhi O, Moussa M, Ghazzei M, Nabli A, Ouanes S, Abbes Z, Hajri M, Fakhfakh R, Bouden A. Impaired Recognition of Static and Dynamic Facial Emotions in Children With Autism Spectrum Disorder Using Stimuli of Varying Intensities, Different Genders, and Age Ranges Faces. Front Psychiatry 2021; 12:693310. [PMID: 34489754] [PMCID: PMC8417587] [DOI: 10.3389/fpsyt.2021.693310] [Received: 04/10/2021] [Accepted: 07/26/2021]
Abstract
A large body of research on facial emotion recognition (FER) in autism spectrum disorder (ASD) has been published over the past several years. However, these studies have mainly used static, high-intensity stimuli, including adult and/or child facial emotions. The current study investigated FER in children with ASD using an innovative task composed of a combination of static (114 pictures) and dynamic (36 videos) subtests, including child, adolescent, and adult male and female faces, with high, medium, and low intensities of the basic facial emotions, plus a neutral expression. The ASD group consisted of 45 Tunisian verbal children, and the control group consisted of 117 Tunisian typically developing children. Both groups were aged 7-12 years. After adjusting for sex, age, mental age, and school grade, the ASD group scored lower than controls on all tests, except for the recognition of happiness and fear in the static subtest and the recognition of happiness, fear, and sadness in the dynamic subtest (p ≥ 0.05). In the ASD group, the total scores of both the static and the dynamic subtests were positively correlated with school grade (p < 0.001), but not with age or mental age. Children with ASD performed better at recognizing facial emotions in children than in adults and adolescents, on both videos and photos (p < 0.001). Impairments in FER can have a negative impact on a child's social development. Thus, the creation of new intervention instruments aiming to improve emotion recognition strategies in individuals with ASD at an early stage seems fundamental.
Affiliation(s)
- Selima Jelili
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Soumeyya Halayem
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Amal Taamallah
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Selima Ennaifer
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Olfa Rajhi
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Mohamed Moussa
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Melek Ghazzei
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Ahmed Nabli
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Sami Ouanes
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Department of Psychiatry, Hamad Medical Corporation, Doha, Qatar
- Zeineb Abbes
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Malek Hajri
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Asma Bouden
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
33
Normative data for a set of faces from the Karolinska Directed Emotional Faces in a Brazilian sample [Dados normativos de um conjunto de faces do Karolinska Directed Emotional Faces em uma amostra brasileira]. PSICO 2020. [DOI: 10.15448/1980-8623.2020.3.34083]
Abstract
The aim of this research was to obtain normative data for a set of faces from the Karolinska Directed Emotional Faces (KDEF) in a Brazilian sample. A non-probabilistic (convenience) sample of 100 participants from the city of João Pessoa-PB was used, aged between 18 and 62 years (M = 21.6, SD = 6.2) and mostly female (76%). Results showed a mean accuracy of 76.2%: expressions of happiness (94.7%) and surprise (90.3%) were the most easily identified emotions, and fear (40.65%) was the most difficult. Regarding intensity and valence measures, disgust, followed by surprise, received the most intense ratings, and happiness was the only emotion with a high positive valence. These findings were very similar to those reported in previous research, providing subjective rating norms better suited to the characteristics of the Brazilian population.
34
Mei G, Li Y, Chen S, Cen M, Bao M. Lower recognition thresholds for sad facial expressions in subthreshold depression: a longitudinal study. Psychiatry Res 2020; 294:113499. [PMID: 33068912] [DOI: 10.1016/j.psychres.2020.113499] [Received: 01/21/2020] [Accepted: 10/04/2020]
Abstract
Subthreshold depression (StD) is more prevalent worldwide than major depressive disorder (MDD). Previous studies have indicated that depression is associated with impaired perception of facial expressions. However, for individuals with StD, whether perceptual sensitivity toward facial expressions is altered, and whether such alterations stabilize over time, remain largely unknown. Using the QUEST psychometric procedure, we assessed recognition thresholds for five facial expressions (angry, fearful, happy, sad, and neutral) in individuals with StD and non-depressed controls. These subjects were retested after approximately 2-month intervals. At the initial assessment, individuals with StD demonstrated lower recognition thresholds (i.e., stronger sensitivity) only for sadness compared to non-depressed controls. At the follow-up assessment, we divided the StD group into two subgroups: non-remitted and remitted. For the former, lower recognition thresholds for sadness alone were again found; for the latter, there was no significant difference. More importantly, individuals displaying lower recognition thresholds for sadness at the initial assessment were less likely to improve in depressive symptoms at the follow-up assessment. These results indicate that the altered perceptual sensitivity toward sad expressions in individuals with StD is associated with the current clinical state.
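QUEST, named in this abstract, is a Bayesian adaptive staircase: it maintains a posterior over candidate thresholds and places each trial at the current best estimate. A minimal, hypothetical sketch with a simulated observer follows; the Weibull parameters, grid, and trial count are made-up assumptions, not the study's settings:

```python
# Hypothetical sketch of a QUEST-style Bayesian staircase (NOT the study's
# code): the posterior over candidate thresholds is updated after each trial,
# and the next stimulus intensity is placed at the posterior mode.
import numpy as np

rng = np.random.default_rng(2)
true_threshold = 0.35                         # simulated observer's threshold
thresholds = np.linspace(0.01, 1.0, 100)      # candidate-threshold grid
log_post = np.zeros_like(thresholds)          # flat prior (log scale)

def p_correct(intensity, threshold, slope=8.0, guess=0.2, lapse=0.02):
    """Weibull psychometric function: P(correct) at a given intensity."""
    p = 1.0 - np.exp(-(intensity / threshold) ** slope)
    return guess + (1.0 - guess - lapse) * p

for _ in range(80):
    test = thresholds[np.argmax(log_post)]            # mode placement rule
    correct = rng.random() < p_correct(test, true_threshold)
    likelihood = p_correct(test, thresholds)          # per-candidate P(correct)
    log_post += np.log(likelihood if correct else 1.0 - likelihood)

estimate = thresholds[np.argmax(log_post)]
print(f"estimated threshold: {estimate:.2f} (simulated true value {true_threshold})")
```

Production implementations (e.g., the Quest functions in Psychtoolbox) add informative priors and quantile or mean placement rules on top of this basic update loop.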
Affiliation(s)
- Gaoxing Mei
- School of Psychology, Guizhou Normal University, Guiyang, PR China
- Yufeng Li
- School of Psychology, Guizhou Normal University, Guiyang, PR China
- Shiyu Chen
- School of Psychology, Guizhou Normal University, Guiyang, PR China
- Mofen Cen
- School of Psychology, Guizhou Normal University, Guiyang, PR China
- Min Bao
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, PR China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, PR China
- State Key Laboratory of Brain and Cognitive Science, Beijing, PR China
35
Samaey C, Van der Donck S, van Winkel R, Boets B. Facial Expression Processing Across the Autism-Psychosis Spectra: A Review of Neural Findings and Associations With Adverse Childhood Events. Front Psychiatry 2020; 11:592937. [PMID: 33281648] [PMCID: PMC7691238] [DOI: 10.3389/fpsyt.2020.592937] [Received: 08/08/2020] [Accepted: 10/09/2020]
Abstract
Autism spectrum disorder (ASD) and primary psychosis are classified as distinct neurodevelopmental disorders, yet they display overlapping epidemiological, environmental, and genetic components as well as endophenotypic similarities. For instance, both disorders are characterized by impairments in facial expression processing, a crucial skill for effective social communication, and both disorders display an increased prevalence of adverse childhood events (ACE). This narrative review provides a brief summary of findings from neuroimaging studies investigating facial expression processing in ASD and primary psychosis, with a focus on the commonalities and differences between these disorders. Individuals with ASD and primary psychosis activate the same brain regions as healthy controls during facial expression processing, albeit to a different extent. Overall, both groups display altered activation in the fusiform gyrus and amygdala as well as altered connectivity among the broader face processing network, probably indicating reduced facial expression processing abilities. Furthermore, delayed or reduced N170 responses have been reported in ASD and primary psychosis, but the significance of these findings is questioned, and alternative frequency-tagging electroencephalography (EEG) measures are currently being explored to capture facial expression processing impairments more selectively. Face perception is an innate process, but it is also guided by visual learning and social experiences. Extreme environmental factors, such as adverse childhood events, can disrupt normative development and alter facial expression processing. ACE are hypothesized to induce altered neural facial expression processing, in particular a hyperactive amygdala response toward negative expressions. Future studies should account for the comorbidity among ASD, primary psychosis, and ACE when assessing facial expression processing in these clinical groups, as it may explain some of the inconsistencies and confounds reported in the field.
Affiliation(s)
- Celine Samaey
- Department of Neurosciences, Center for Clinical Psychiatry, KU Leuven, Leuven, Belgium
- Stephanie Van der Donck
- Department of Neurosciences, Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
- Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
- Ruud van Winkel
- Department of Neurosciences, Center for Clinical Psychiatry, KU Leuven, Leuven, Belgium
- University Psychiatric Center (UPC), KU Leuven, Leuven, Belgium
- Bart Boets
- Department of Neurosciences, Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
- Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
36
Vandewouw MM, Choi E, Hammill C, Arnold P, Schachar R, Lerch JP, Anagnostou E, Taylor MJ. Emotional face processing across neurodevelopmental disorders: a dynamic faces study in children with autism spectrum disorder, attention deficit hyperactivity disorder and obsessive-compulsive disorder. Transl Psychiatry 2020; 10:375. [PMID: 33139709] [PMCID: PMC7608673] [DOI: 10.1038/s41398-020-01063-2] [Received: 12/19/2019] [Revised: 04/14/2020] [Accepted: 04/21/2020]
Abstract
Autism spectrum disorder (ASD) is classically associated with poor face processing skills, yet evidence suggests that those with obsessive-compulsive disorder (OCD) and attention deficit hyperactivity disorder (ADHD) also have difficulties understanding emotions. We determined the neural underpinnings of dynamic emotional face processing across these three clinical paediatric groups, including developmental trajectories, compared with typically developing (TD) controls. We studied 279 children aged 5-19 years; 57 were excluded due to excessive motion during fMRI, leaving 222 participants: 87 with ASD, 44 with ADHD, 42 with OCD, and 49 TD. Groups were sex- and age-matched. Dynamic faces (happy, angry) and dynamic flowers were presented in 18 pseudo-randomized blocks while fMRI data were collected with a 3T MRI. Group-by-age interactions and group-difference contrasts were analysed for faces vs. flowers and for happy vs. angry faces. TD children demonstrated different activity patterns across the four contrasts; these patterns were more limited and distinct for the neurodevelopmental disorder (NDD) groups. Processing happy and angry faces compared to flowers yielded similar activation in occipital regions in the NDDs compared to TDs. Processing happy compared to angry faces showed an age-by-group interaction in the superior frontal gyrus, increasing with age for ASD and OCD and decreasing for TDs. Children with ASD, ADHD and OCD differentiated less between dynamic faces and dynamic flowers, with most of the effects seen in the occipital and temporal regions, suggesting that the emotional difficulties shared across NDDs may be partly attributed to shared atypical visual information processing.
Affiliation(s)
- Marlee M Vandewouw
- Department of Diagnostic Imaging, Hospital for Sick Children, Toronto, Canada
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Canada
- EunJung Choi
- Department of Diagnostic Imaging, Hospital for Sick Children, Toronto, Canada
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Canada
- Bloorview Research Institute, University of Toronto, 150 Kilgour Road, Toronto, Canada
- Christopher Hammill
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Canada
- Paul Arnold
- Mathison Centre for Mental Health Research & Education, Hotchkiss Brain Institute, Cumming School of Medicine, University of Calgary, Alberta, Canada
- Russell Schachar
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Canada
- Department of Psychiatry, Hospital for Sick Children, Toronto, Canada
- Jason P Lerch
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Canada
- Wellcome Centre for Integrative Neuroimaging, FMRIB, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, UK
- Department of Medical Biophysics, University of Toronto, Toronto, Canada
- Evdokia Anagnostou
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Canada
- Bloorview Research Institute, University of Toronto, 150 Kilgour Road, Toronto, Canada
- Margot J Taylor
- Department of Diagnostic Imaging, Hospital for Sick Children, Toronto, Canada
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Canada
- Department of Psychology, University of Toronto, Toronto, Canada
- Department of Medical Imaging, University of Toronto, Toronto, Canada
37
Cooper S, Hobson CW, van Goozen SH. Facial emotion recognition in children with externalising behaviours: A systematic review. Clin Child Psychol Psychiatry 2020; 25:1068-1085. [PMID: 32713184] [DOI: 10.1177/1359104520945390]
Abstract
Difficulties in facial emotion recognition (FER) are associated with a range of mental health and antisocial presentations in adolescents and adults. Externalising behaviours in children are often one of the earliest signs of risk for the development of such difficulties. This article systematically reviews the evidence (from both group and correlational studies) for whether there is a relationship between FER and externalising behaviours in pre-adolescent children (aged 12 and under), both across and within externalising behaviour domains (hyperactivity, conduct problems, callous-unemotional traits, and aggression). Four electronic databases were searched, producing 1,296 articles. Articles were included if they used validated measures of FER and externalising behaviours. Sixteen articles met criteria for inclusion in the review. Overall, the results suggested that FER problems are present in ADHD, conduct problem, and callous-unemotional presentations, and in samples of children with higher levels of externalising problems rather than in community samples. However, there was no consistent evidence for specific emotions being implicated in the studies reviewed. Clinically, the findings suggest that FER difficulties are commonly associated with externalising behaviours, and hence this review offers some support for FER deficits being a relevant target of intervention for externalising behaviours. However, more longitudinal studies that control for other variables which might underlie FER difficulties (e.g. IQ or basic Theory of Mind abilities) are required to inform our knowledge of whether FER difficulties are a causal factor in externalising behaviours.
Affiliation(s)
- Sara Cooper
- School of Psychology, Cardiff University, Cardiff, UK
38
Barisnikov K, Theurel A, Lejeune F. Emotion knowledge in neurotypical children and in those with Down syndrome. Appl Neuropsychol Child 2020; 11:197-211. [PMID: 32579087] [DOI: 10.1080/21622965.2020.1777131]
Abstract
This research aimed to assess two components of emotion knowledge (EK): receptive EK, with face emotion identification and matching tasks, and emotion situation knowledge, with the emotion attribution task (EAT). Study 1 assessed the development of EK in 265 neurotypical (NT) children (4-11 years), divided into four age groups. Overall, results showed a significant improvement of EK with age in the NT population for all three tasks, especially between the ages of 4/5 and 6/7. Children were less successful at the EAT than at the other two tasks, indicating that receptive EK develops earlier than emotion situation knowledge. The presence of visual context (EAT) did not improve children's overall facial emotion recognition, especially for anger and sadness, even though these emotions were well recognized in isolated facial expressions (emotion identification). Study 2 compared EK between 32 children with Down syndrome (DS; CA: M = 13 years, SD = 2.13) and 32 NT children (CA: M = 5.3 years, SD = 1.36) matched on a vocabulary task. Children with DS had more difficulties with EK than NT children: they had lower performance on the identification and EAT tasks, while exhibiting performance similar to their NT controls on the emotion matching task. Moreover, good ability to identify emotion expressions seems to be a prerequisite for successful face-context recognition in NT children, but not in children with DS. The difficulties encountered by children with DS could result from executive dysfunction when dealing with complex visual information, in addition to emotion processing difficulties.
Affiliation(s)
- Koviljka Barisnikov
- Child Clinical Neuropsychology Unit, FPSE, University of Geneva, Geneva, Switzerland
- Fleur Lejeune
- Child Clinical Neuropsychology Unit, FPSE, University of Geneva, Geneva, Switzerland
39
Álvarez-Pato VM, Sánchez CN, Domínguez-Soberanes J, Méndoza-Pérez DE, Velázquez R. A Multisensor Data Fusion Approach for Predicting Consumer Acceptance of Food Products. Foods 2020; 9:E774. [PMID: 32545344] [PMCID: PMC7353528] [DOI: 10.3390/foods9060774]
Abstract
Sensory experiences play an important role in consumer response, purchase decision, and loyalty towards food products. Consumer studies when launching new food products must incorporate physiological response assessment to be more precise and, thus, increase their chances of success in the market. This paper introduces a novel sensory analysis system that incorporates facial emotion recognition (FER), galvanic skin response (GSR), and cardiac pulse to determine consumer acceptance of food samples. Taste and smell experiments were conducted with 120 participants, recording facial images, biometric signals, and reported liking when trying a set of pleasant and unpleasant flavors and odors. Data fusion and analysis with machine learning models allowed the acceptance elicited by the samples to be predicted. Results confirm that FER alone is not sufficient to determine consumers' acceptance. However, when combined with GSR and, to a lesser extent, with pulse signals, acceptance prediction can be improved. This research targets predicting consumers' acceptance without the continuous use of liking scores. In addition, the findings of this work may be used to explore the relationships between facial expressions and physiological reactions for non-rational decision-making when interacting with new food products.
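The entry's multisensor idea can be illustrated with a minimal late-fusion sketch in plain Python. The modality names, per-modality scores, and weights below are hypothetical placeholders, not values or methods from the study, which used trained machine learning models rather than fixed weights:

```python
# Minimal late-fusion sketch (all features and weights hypothetical):
# each modality yields an acceptance score in [0, 1]; a weighted sum
# combines them, mirroring the idea that FER alone underdetermines
# acceptance but gains accuracy when fused with GSR and pulse.

MODALITY_WEIGHTS = {"fer": 0.4, "gsr": 0.4, "pulse": 0.2}  # assumed weights

def fuse_scores(scores: dict) -> float:
    """Weighted average of per-modality acceptance scores."""
    total = sum(MODALITY_WEIGHTS[m] * scores[m] for m in MODALITY_WEIGHTS)
    return total / sum(MODALITY_WEIGHTS.values())

def predict_acceptance(scores: dict, threshold: float = 0.5) -> bool:
    """Binary liked/disliked decision from the fused score."""
    return fuse_scores(scores) >= threshold

sample = {"fer": 0.7, "gsr": 0.6, "pulse": 0.4}
print(predict_acceptance(sample))  # fused score 0.60 -> True
```

In the study the fusion weights were effectively learned from data; a fixed weighted sum is only the simplest stand-in for that step.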
Affiliation(s)
- Víctor M. Álvarez-Pato
- Facultad de Ingeniería, Universidad Panamericana, Aguascalientes 20290, Mexico
- Claudia N. Sánchez
- Facultad de Ingeniería, Universidad Panamericana, Aguascalientes 20290, Mexico
- Julieta Domínguez-Soberanes
- Escuela de Negocios Gastronómicos, Universidad Panamericana, Aguascalientes 20290, Mexico
- David E. Méndoza-Pérez
- Escuela de Negocios Gastronómicos, Universidad Panamericana, Aguascalientes 20290, Mexico
- Ramiro Velázquez
- Facultad de Ingeniería, Universidad Panamericana, Aguascalientes 20290, Mexico
40
Griffiths S, Goh SKY, Norbury CF. Early language competence, but not general cognitive ability, predicts children's recognition of emotion from facial and vocal cues. PeerJ 2020; 8:e9118. [PMID: 32435540] [PMCID: PMC7227654] [DOI: 10.7717/peerj.9118]
Abstract
The ability to accurately identify and label emotions in the self and others is crucial for successful social interactions and good mental health. In the current study we tested the longitudinal relationship between early language skills and recognition of facial and vocal emotion cues in a representative UK population cohort with diverse language and cognitive skills (N = 369), including a large sample of children that met criteria for Developmental Language Disorder (DLD, N = 97). Language skills, but not non-verbal cognitive ability, at age 5–6 predicted emotion recognition at age 10–12. Children that met the criteria for DLD showed a large deficit in recognition of facial and vocal emotion cues. The results highlight the importance of language in supporting identification of emotions from non-verbal cues. Impairments in emotion identification may be one mechanism by which language disorder in early childhood predisposes children to later adverse social and mental health outcomes.
Affiliation(s)
- Sarah Griffiths
- Psychology and Language Sciences, University College London, London, United Kingdom
- Shaun Kok Yew Goh
- Psychology and Language Sciences, University College London, London, United Kingdom
- Centre for Research in Child Development, Office of Educational Research, National Institute of Education, Nanyang Technological University, Singapore, Singapore
- Courtenay Frazier Norbury
- Psychology and Language Sciences, University College London, London, United Kingdom
- Department of Special Needs Education, University of Oslo, Oslo, Norway
41
Abstract
OBJECTIVE Impairments in facial emotion recognition are an underlying factor of deficits in emotion regulation and interpersonal difficulties in mental disorders and are evident in eating disorders (EDs). METHODS We used a computerized psychophysical paradigm to parametrically manipulate the quantity of signal in facial expressions of emotion (QUEST threshold-seeking algorithm). This was used to measure emotion recognition in 308 adult women (anorexia nervosa [n = 61], bulimia nervosa [n = 58], healthy controls [n = 130], and mixed mental disorders [mixed, n = 59]). The M (SD) age was 22.84 (3.90) years. The aims were to establish recognition thresholds defining how much information a person needs to recognize a facial emotion expression and to identify deficits in EDs compared with healthy and clinical controls. The stimuli included six basic emotion expressions (fear, anger, disgust, happiness, sadness, surprise), plus a neutral expression. RESULTS Happiness was discriminated at the lowest threshold and fear at the highest threshold by all groups. There were no differences in thresholds between groups, except between the mixed and the bulimia nervosa groups for the expression of disgust (F(3,302) = 5.97, p = .001, η² = .056). Emotional clarity, ED pathology, and depressive symptoms did not predict performance (R² change ≤ .010, F(1,305) ≤ 5.74, p ≥ .079). The confusion matrix did not reveal specific biases in any group. CONCLUSIONS Overall, within-subject effects were as expected, whereas between-subject effects were marginal and psychopathology did not influence emotion recognition. Facial emotion recognition abilities were similar in women experiencing EDs, women experiencing mixed mental disorders, and healthy controls. Although basic facial emotion recognition processes seem to be intact, dysfunctional aspects such as misinterpretation might be important in emotion regulation problems. CLINICAL TRIAL REGISTRATION NUMBER DRKS-ID: DRKS00005709.
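QUEST, the threshold-seeking procedure named above, is a Bayesian adaptive staircase: it keeps a posterior over candidate thresholds and places each trial at the currently most probable one. The following is a toy grid-posterior analogue in Python, not the study's implementation; the logistic psychometric function and its slope, guess, and lapse values are illustrative assumptions (real QUEST typically uses a Weibull function):

```python
import math
import random

def p_correct(intensity, threshold, slope=10.0, guess=0.5, lapse=0.02):
    # Logistic psychometric function for a two-alternative forced-choice task.
    p = 1.0 / (1.0 + math.exp(-slope * (intensity - threshold)))
    return guess + (1.0 - guess - lapse) * p

def estimate_threshold(true_threshold=0.6, n_trials=200, seed=1):
    rng = random.Random(seed)
    grid = [i / 100 for i in range(101)]   # candidate thresholds in [0, 1]
    log_post = [0.0] * len(grid)           # flat prior (log scale)
    intensity = 0.5                        # starting stimulus level
    for _ in range(n_trials):
        # Simulate an observer whose true threshold we are estimating.
        correct = rng.random() < p_correct(intensity, true_threshold)
        # Bayesian update of the posterior over candidate thresholds.
        for i, t in enumerate(grid):
            p = p_correct(intensity, t)
            log_post[i] += math.log(p if correct else 1.0 - p)
        # QUEST-style placement rule: test at the posterior mode.
        intensity = grid[max(range(len(grid)), key=lambda i: log_post[i])]
    return intensity  # posterior-mode threshold estimate

print(round(estimate_threshold(), 2))
```

The estimate converges toward the simulated observer's threshold as trials accumulate, which is what lets such procedures measure "how much information a person needs" with relatively few trials.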
42
Nagels L, Gaudrain E, Vickers D, Matos Lopes M, Hendriks P, Başkent D. Development of vocal emotion recognition in school-age children: The EmoHI test for hearing-impaired populations. PeerJ 2020; 8:e8773. [PMID: 32274264] [PMCID: PMC7130108] [DOI: 10.7717/peerj.8773]
Abstract
Traditionally, emotion recognition research has primarily used pictures and videos, while audio test materials are not always readily available or are not of good quality, which may be particularly important for studies with hearing-impaired listeners. Here we present a vocal emotion recognition test with pseudospeech productions from multiple speakers expressing three core emotions (happy, angry, and sad): the EmoHI test. The high sound quality recordings make the test suitable for use with populations of children and adults with normal or impaired hearing. Here we present normative data for vocal emotion recognition development in normal-hearing (NH) school-age children using the EmoHI test. Furthermore, we investigated cross-language effects by testing NH Dutch and English children, and the suitability of the EmoHI test for hearing-impaired populations, specifically for prelingually deaf Dutch children with cochlear implants (CIs). Our results show that NH children's performance improved significantly with age from the youngest age group onwards (4-6 years: 48.9%, on average). However, NH children's performance did not reach adult-like values (adults: 94.1%) even for the oldest age group tested (10-12 years: 81.1%). Additionally, the effect of age on NH children's development did not differ across languages. All except one CI child performed at or above chance level, showing the suitability of the EmoHI test. In addition, seven out of 14 CI children performed within the NH age-appropriate range, and nine out of 14 CI children did so when performance was adjusted for hearing age, measured from their age at CI implantation. However, CI children showed great variability in their performance, ranging from ceiling (97.2%) to below chance-level performance (27.8%), which could not be explained by chronological age alone. The strong and consistent development in performance with age, the lack of significant differences across the tested languages for NH children, and the above-chance performance of most CI children affirm the usability and versatility of the EmoHI test.
Affiliation(s)
- Leanne Nagels
- Center for Language and Cognition Groningen (CLCG), University of Groningen, Groningen, The Netherlands
- Department of Otorhinolaryngology/Head and Neck Surgery, University Medical Center Groningen, Groningen, The Netherlands
- Etienne Gaudrain
- Department of Otorhinolaryngology/Head and Neck Surgery, University Medical Center Groningen, Groningen, The Netherlands
- CNRS, Lyon Neuroscience Research Center, Université de Lyon, Lyon, France
- Deborah Vickers
- Cambridge Hearing Group, Clinical Neurosciences Department, University of Cambridge, Cambridge, United Kingdom
- Marta Matos Lopes
- Hearbase Ltd, The Hearing Specialists, Kent, United Kingdom
- The Ear Institute, University College London, London, United Kingdom
- Petra Hendriks
- Center for Language and Cognition Groningen (CLCG), University of Groningen, Groningen, The Netherlands
- Deniz Başkent
- Department of Otorhinolaryngology/Head and Neck Surgery, University Medical Center Groningen, Groningen, The Netherlands
43
Willner CJ, Jetha MK, Segalowitz SJ, Gatzke-Kopp LM. Neurophysiological evidence for distinct biases in emotional face processing associated with internalizing and externalizing symptoms in children. Biol Psychol 2020; 150:107829. [PMID: 31790713] [PMCID: PMC7007849] [DOI: 10.1016/j.biopsycho.2019.107829]
Abstract
Attentional bias to threat has been implicated in both internalizing and externalizing disorders. This study utilizes event-related potentials to examine early stages of perceptual attention to threatening (angry or fearful) versus neutral faces among a sample of 200 children ages 6-8 years from a low-income, urban community. Although both internalizing and externalizing symptoms were associated with processing biases, the nature of the bias differed between these two symptom domains. Internalizing symptoms were associated with heightened early attentional selection (P1) and later perceptual processing (P2) of fearful faces. In contrast, externalizing symptoms were associated with reduced early attentional selection (P1) of fearful faces and enhanced perceptual processing (P2) of neutral faces, possibly indicative of a hostile interpretation bias for ambiguous social cues. These results provide insight into the distinct cognitive-affective processes that may contribute to the etiology and maintenance of internalizing and externalizing psychopathology.
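ERP components such as the P1 and P2 discussed above are typically quantified as the mean voltage in a post-stimulus time window. A minimal sketch of that windowing step in plain Python; the window bounds, sampling rate, and toy waveform are illustrative assumptions, not this study's parameters:

```python
# Sketch of extracting ERP component amplitudes by mean voltage in a
# time window (window bounds here are hypothetical; studies define
# their P1/P2 windows from their own grand-average data).

def mean_amplitude(waveform, times_ms, window):
    lo, hi = window
    vals = [v for v, t in zip(waveform, times_ms) if lo <= t <= hi]
    return sum(vals) / len(vals)

# Toy single-channel ERP sampled every 10 ms from stimulus onset.
times = list(range(0, 400, 10))
erp = [0.0] * len(times)
erp[times.index(100)] = 4.0   # P1-like positive peak near 100 ms
erp[times.index(200)] = 6.0   # P2-like positive peak near 200 ms

p1 = mean_amplitude(erp, times, (80, 130))    # mean µV in the P1 window
p2 = mean_amplitude(erp, times, (150, 250))   # mean µV in the P2 window
print(round(p1, 2), round(p2, 2))  # -> 0.67 0.55
```

Comparing such window means across conditions (e.g., fearful vs. neutral faces) and correlating the difference with symptom scores is the usual route to the bias effects reported in this entry.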
Affiliation(s)
- Cynthia J Willner
- The Pennsylvania State University, Department of Human Development and Family Studies, 228 Health and Human Development Building, University Park, PA, 16802, United States
- Michelle K Jetha
- Cape Breton University, Department of Psychology, 1250 Grand Lake Road, Sydney, Nova Scotia, B1P 6L2, Canada
- Sidney J Segalowitz
- Brock University, Department of Psychology, 1812 Sir Isaac Brock Way, St. Catharines, ON, L2S 3A1, Canada
- Lisa M Gatzke-Kopp
- The Pennsylvania State University, Department of Human Development and Family Studies, 228 Health and Human Development Building, University Park, PA, 16802, United States
44
Meinhardt-Injac B, Daum MM, Meinhardt G. Theory of mind development from adolescence to adulthood: Testing the two-component model. Br J Dev Psychol 2020; 38:289-303. [PMID: 31960462] [DOI: 10.1111/bjdp.12320]
Abstract
The ability to infer mental and affective states of others is crucial for social functioning. This ability, denoted as Theory of Mind (ToM), develops rapidly during childhood, yet results on its development across adolescence and into young adulthood are rare. In the present study, we tested the two-component model, measuring age-related changes in social-perceptual and social-cognitive ToM in a sample of 267 participants between 11 and 25 years of age. Additionally, we measured language, reasoning, and inhibitory control as major covariates. Participants inferred mental states from non-verbal cues in a social-perceptual task (Eye Test) and from stories with faux pas in a social-cognitive task (Faux Pas Test). Results showed substantial improvement across adolescence in both ToM measures and in the covariates. Analysis with linear mixed models (LMM) revealed specific age-related growth for the social-perceptual component, while the age-related increase of the social-cognitive component fully aligned with the increase of the covariates. These results support the distinction between ToM components and indicate that adolescence is a crucial period for developing social-perceptual ToM abilities.
Statement of contribution
What is already known on this subject? To date, much research has been dedicated to Theory of Mind (ToM) development in early and middle childhood. However, only a few studies have examined development of ToM in adolescence. Studies so far suggest age-related differences in ToM between adolescents and young adults.
What this study adds: The study offers several methodological advantages, including a large sample size with a continuous distribution of age (age 11-25) and the use of a comprehensive test battery to assess ToM and covariates (language, executive functions, reasoning). The results provide evidence for asymmetries in the development of the two ToM components (social-perceptual and social-cognitive; the two-component account) across the studied age range: the social-perceptual component showed specific development, while the age-related increase of the social-cognitive component fully aligned with the increase of the covariates. Adolescence is a crucial period for developing social-perceptual ToM abilities.
Affiliation(s)
- Bozana Meinhardt-Injac
- Catholic University of Applied Science Berlin (KHSB), Berlin, Germany
- Department of Psychology, Johannes Gutenberg University, Mainz, Germany
- Moritz M Daum
- Department of Psychology, University of Zurich, Switzerland
- Neuroscience Center Zurich, University of Zurich and ETH Zurich, Switzerland
- Günter Meinhardt
- Department of Psychology, Johannes Gutenberg University, Mainz, Germany
45
The coupling between face and emotion recognition from early adolescence to young adulthood. Cogn Dev 2020. [DOI: 10.1016/j.cogdev.2020.100851]
46
López-Morales H, Zabaletta V, Vivas L, López MC. Reconocimiento de Expresiones Faciales Emocionales. Diferencias en el Desarrollo [Recognition of Emotional Facial Expressions: Developmental Differences]. Psicologia: Teoria e Pesquisa 2020. [DOI: 10.1590/0102.3772e3626]
Abstract
This study aimed to characterize facial emotion recognition in a child and adolescent population. A digital adaptation of the Pictures of Facial Affect test was administered to 147 participants aged 9 to 18 years. The results showed a negative association between age and the hit rate for happiness, and a positive association for disgust and fear. There was also a significant effect of age on response times for all emotions except fear. The results suggest that emotion recognition becomes faster with increasing age; however, this is reflected in improved recognition accuracy only for disgust and fear. The importance of these emotions in adolescence is discussed.
47
Baccolo E, Macchi Cassia V. Age-Related Differences in Sensitivity to Facial Trustworthiness: Perceptual Representation and the Role of Emotional Development. Child Dev 2019; 91:1529-1547. [PMID: 31769004] [DOI: 10.1111/cdev.13340]
Abstract
The ability to discriminate social signals from faces is a fundamental component of human social interactions whose developmental origins are still debated. In this study, 5-year-old (N = 29) and 7-year-old children (N = 31) and adults (N = 34) made perceptual similarity and trustworthiness judgments on a set of female faces varying in level of expressed trustworthiness. All groups represented perceived similarity of the faces as a function of trustworthiness intensity, but such representation becomes more fine-grained with development. Moreover, 5-year-olds' accuracy in choosing the more trustworthy face in a pair varied as a function of children's score at the Test of Emotion Comprehension, suggesting that the ability to perform face-to-trait inferences is related to the development of emotional understanding.
48
Stewart E, Catroppa C, Gonzalez L, Gill D, Webster R, Lawson J, Sabaz M, Mandalis A, Barton B, McLean S, Lah S. Facial emotion perception and social competence in children (8 to 16 years old) with genetic generalized epilepsy and temporal lobe epilepsy. Epilepsy Behav 2019; 100:106301. [PMID: 31133510] [DOI: 10.1016/j.yebeh.2019.04.054]
Abstract
Facial emotion perception (FEP) impairments are common in adults with epilepsy and associated with impaired psychosocial functioning. Research into the presence of FEP deficits in children with epilepsy and the functional implications of these deficits is limited. The primary aims of this study were to assess FEP abilities in children (8 to 16 years old) with genetic generalized epilepsy (GGE) and temporal lobe epilepsy (TLE) and examine whether FEP is related to everyday social functioning. Forty-four children (8 to 16 years) with epilepsy (22 GGE, 22 TLE) and 22 typically developing controls completed the Pictures of Facial Affect (POFA) battery to assess FEP and a brief test of intellectual functioning (intelligence quotient [IQ]). Parents completed questionnaires assessing social competence of their child. Neurologists completed the Global Assessment of Severity of Epilepsy (GASE) scale as a measure of overall epilepsy severity. Demographic and clinical information was obtained from medical records and clinical interviews with parents. Findings revealed significant, overall FEP impairments and reduced social competence in children with GGE and TLE compared to controls. The magnitude of FEP impairment (i.e., across all emotions) was comparable in the two epilepsy groups, yet different emotions were impaired in each group: children with GGE were impaired in recognizing anger and disgust, whereas children with TLE were impaired in sadness and disgust, compared to controls. Contrary to expectations, total FEP accuracy was not significantly correlated with social competence in either epilepsy group. In conclusion, children with GGE and TLE have significant impairments recognizing emotional expressions on faces. Further research is needed to examine whether underlying FEP impairments relate to social and emotional functioning in children with epilepsy.
Affiliation(s)
- Elizabeth Stewart
- School of Psychology, The University of Sydney, 94-100 Mallett Street, Camperdown, Sydney, Australia; ARC Centre of Excellence in Cognition and its Disorders, Sydney, Australia
- Cathy Catroppa
- Murdoch Children's Research Institute, 50 Flemington Road, Parkville, Melbourne, Australia
- Linda Gonzalez
- Murdoch Children's Research Institute, 50 Flemington Road, Parkville, Melbourne, Australia
- Deepak Gill
- T.Y. Nelson Department of Neurology and Neurosurgery, The Children's Hospital at Westmead, Corner Hawkesbury Road and Hainsworth Street, Westmead, Sydney, Australia
- Richard Webster
- T.Y. Nelson Department of Neurology and Neurosurgery, The Children's Hospital at Westmead, Corner Hawkesbury Road and Hainsworth Street, Westmead, Sydney, Australia
- John Lawson
- Department of Neurology, Sydney Children's Hospital, High Street Randwick, Sydney, Australia
- Mark Sabaz
- Department of Psychology, Sydney Children's Hospital, High Street Randwick, Sydney, Australia
- Anna Mandalis
- Department of Psychology, Sydney Children's Hospital, High Street Randwick, Sydney, Australia
- Belinda Barton
- Children's Hospital Education Research Institute, The Children's Hospital at Westmead, Corner Hawkesbury Road and Hainsworth Street, Westmead, Sydney, Australia
- Samantha McLean
- T.Y. Nelson Department of Neurology and Neurosurgery, The Children's Hospital at Westmead, Corner Hawkesbury Road and Hainsworth Street, Westmead, Sydney, Australia
- Suncica Lah
- School of Psychology, The University of Sydney, 94-100 Mallett Street, Camperdown, Sydney, Australia; ARC Centre of Excellence in Cognition and its Disorders, Sydney, Australia
49
Stoll C, Rodger H, Lao J, Richoz AR, Pascalis O, Dye M, Caldara R. Quantifying Facial Expression Intensity and Signal Use in Deaf Signers. J Deaf Stud Deaf Educ 2019; 24:346-355. [PMID: 31271428] [DOI: 10.1093/deafed/enz023]
Abstract
We live in a world of rich dynamic multisensory signals. Hearing individuals rapidly and effectively integrate multimodal signals to decode biologically relevant facial expressions of emotion. Yet, it remains unclear how facial expressions are decoded by deaf adults in the absence of an auditory sensory channel. We thus compared early and profoundly deaf signers (n = 46) with hearing nonsigners (n = 48) on a psychophysical task designed to quantify their recognition performance for the six basic facial expressions of emotion. Using neutral-to-expression image morphs and noise-to-full signal images, we quantified the intensity and signal levels required by observers to achieve expression recognition. Using Bayesian modeling, we found that deaf observers require more signal and intensity to recognize disgust, while reaching comparable performance for the remaining expressions. Our results provide a robust benchmark for the intensity and signal use in deafness and novel insights into the differential coding of facial expressions of emotion between hearing and deaf individuals.
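The neutral-to-expression morph continua described above can be illustrated with simple linear pixel interpolation, a common way to build such stimuli; the study's actual morphing software is not specified here, and the toy pixel values below are hypothetical:

```python
# Sketch of a neutral-to-expression morph continuum via linear pixel
# interpolation (a common morphing approach; the cited study's exact
# morphing method may differ).

def morph(neutral, expression, alpha):
    """Blend two same-sized images: alpha=0 -> neutral, alpha=1 -> full expression."""
    return [(1 - alpha) * n + alpha * e for n, e in zip(neutral, expression)]

neutral_px = [100, 100, 100]   # toy 3-pixel "images" (grey levels)
angry_px = [180, 60, 140]
half = morph(neutral_px, angry_px, 0.5)   # 50% expression intensity
print(half)  # [140.0, 80.0, 120.0]
```

Stepping alpha from 0 to 1 yields the intensity levels at which observers' recognition thresholds can be measured, as in this entry's psychophysical task.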
Affiliation(s)
- Chloé Stoll
- Laboratoire de Psychologie et de Neurocognition (CNRS-UMR5105), Université Grenoble-Alpes
- Laboratory for Investigative Neurophysiology, Centre Hospitalier Universitaire Vaudois and University of Lausanne
- Helen Rodger
- Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg
- Junpeng Lao
- Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg
- Anne-Raphaëlle Richoz
- Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg
- Olivier Pascalis
- Laboratoire de Psychologie et de Neurocognition (CNRS-UMR5105), Université Grenoble-Alpes
- Matthew Dye
- National Technical Institute for Deaf/Rochester Institute of Technology
- Roberto Caldara
- Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg
50
Colasante T, Gao X, Malti T. Aware and tuned to care: Children with better distress recognition and higher sympathy anticipate more guilt after harming others. Br J Dev Psychol 2019; 37:600-610. [PMID: 31509269] [DOI: 10.1111/bjdp.12305]
Abstract
Helping children recognize the distress of their victims and feel sympathy may facilitate the optimal socialization of ethical guilt. With a sample of 150 eight-year-olds, we tested the main and interactive relations of distress recognition and sympathy to ethical guilt after hypothetically stealing and pushing. Better fear recognition and higher sympathy were uniquely associated with higher ethical guilt. The link between fear recognition and ethical guilt was stronger in children with higher sympathy. Beyond their unique contributions, distress recognition and sympathy may work in concert to facilitate ethical guilt after harming others.
Statement of contribution
What is already known on this subject? Children are thought to express more guilt if they recognize their victims' distress and feel sympathy for them. However, there is little evidence for the direct roles of distress recognition and sympathy in children's guilt, and none for their joint contribution.
What this study adds: The link between fear recognition and guilt was stronger in children with higher sympathy. Sympathy may help children harness and translate the awareness afforded by distress recognition into feelings of accountability and regret. This study was the first to clarify the main and additive roles of sympathy and distress recognition in children's anticipation of guilt after harming others. Promoting distress recognition and sympathy may represent a viable two-step approach to inducing guilt in children after they violate others' welfare.
Affiliation(s)
- Tyler Colasante
- Department of Psychology, University of Toronto Mississauga, Mississauga, Ontario, Canada
- Centre for Child Development, Mental Health, and Policy, Mississauga, Ontario, Canada
- Xiaoqing Gao
- Center for Psychological Sciences, Zhejiang University, Hangzhou, China
- Tina Malti
- Department of Psychology, University of Toronto Mississauga, Mississauga, Ontario, Canada
- Centre for Child Development, Mental Health, and Policy, Mississauga, Ontario, Canada
- Department of Psychiatry, University of Toronto, Ontario, Canada