1
Wang Y, Luo Q, Zhang Y, Zhao K. Synchrony or asynchrony: development of facial expression recognition from childhood to adolescence based on large-scale evidence. Front Psychol 2024; 15:1379652. PMID: 38725946; PMCID: PMC11079229; DOI: 10.3389/fpsyg.2024.1379652.
Abstract
The development of facial expression recognition ability in children is crucial for their emotional cognition and social interactions. In this study, 510 children aged between 6 and 15 years completed a two-alternative forced-choice facial expression recognition task. The findings indicated that recognition of the six basic facial expressions reaches a relatively stable, mature level at around 8-9 years of age. Model fitting further showed that children improved most in recognizing expressions of disgust, closely followed by fear, whereas recognition of happiness and sadness improved more slowly across age groups. Regarding gender differences, girls exhibited an overall recognition advantage. Further model fitting revealed that boys improved more markedly in recognizing expressions of disgust, fear, and anger, while girls improved more markedly in recognizing expressions of surprise, sadness, and happiness. These findings suggest a synchronous developmental trajectory of facial expression recognition from childhood to adolescence, likely shaped by socialization processes interacting with brain maturation.
Affiliation(s)
- Yihan Wang
  - State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Qian Luo
  - State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Yuanmeng Zhang
  - College of Letters and Science, University of California, Berkeley, Berkeley, CA, United States
- Ke Zhao
  - State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
2
Fan XR, Wang YS, Chang D, Yang N, Rong MJ, Zhang Z, He Y, Hou X, Zhou Q, Gong ZQ, Cao LZ, Dong HM, Nie JJ, Chen LZ, Zhang Q, Zhang JX, Zhang L, Li HJ, Bao M, Chen A, Chen J, Chen X, Ding J, Dong X, Du Y, Feng C, Feng T, Fu X, Ge LK, Hong B, Hu X, Huang W, Jiang C, Li L, Li Q, Li S, Liu X, Mo F, Qiu J, Su XQ, Wei GX, Wu Y, Xia H, Yan CG, Yan ZX, Yang X, Zhang W, Zhao K, Zhu L, Zuo XN. A longitudinal resource for population neuroscience of school-age children and adolescents in China. Sci Data 2023; 10:545. PMID: 37604823; PMCID: PMC10442366; DOI: 10.1038/s41597-023-02377-8.
Abstract
During the past decade, cognitive neuroscience has been calling for population diversity to address the challenge of validity and generalizability, ushering in a new era of population neuroscience. The developing Chinese Color Nest Project (devCCNP, 2013-2022) is the first ten-year stage of the lifespan CCNP (2013-2032), a two-stage project focusing on brain-mind development. The project aims to create and share a large-scale, longitudinal and multimodal dataset of typically developing children and adolescents (ages 6.0-17.9 at enrolment) in the Chinese population. The devCCNP houses not only phenotypes measured by demographic, biophysical, psychological, behavioural, cognitive, affective, and ocular-tracking assessments but also neurotypes measured with magnetic resonance imaging (MRI) of brain morphometry, resting-state function, naturalistic viewing function and diffusion structure. This Data Descriptor introduces the first data release of devCCNP, comprising a total of 864 visits from 479 participants. We provide details of the experimental design, sampling strategies, and technical validation of the devCCNP resource, and we demonstrate and discuss the potential of a multicohort longitudinal design to depict normative brain growth curves from the perspective of developmental population neuroscience. The devCCNP resource is shared as part of the "Chinese Data-sharing Warehouse for In-vivo Imaging Brain" in the Chinese Color Nest Project (CCNP) - Lifespan Brain-Mind Development Data Community ( https://ccnp.scidb.cn ) at the Science Data Bank.
Affiliation(s)
- Xue-Ru Fan
  - State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, 100875, China
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - Developmental Population Neuroscience Research Center, International Data Group/McGovern Institute for Brain Research, Beijing Normal University, Beijing, 100875, China
- Yin-Shan Wang
  - State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, 100875, China
  - Developmental Population Neuroscience Research Center, International Data Group/McGovern Institute for Brain Research, Beijing Normal University, Beijing, 100875, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Da Chang
  - State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, 100875, China
  - Developmental Population Neuroscience Research Center, International Data Group/McGovern Institute for Brain Research, Beijing Normal University, Beijing, 100875, China
- Ning Yang
  - State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, 100875, China
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - Developmental Population Neuroscience Research Center, International Data Group/McGovern Institute for Brain Research, Beijing Normal University, Beijing, 100875, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Meng-Jie Rong
  - State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, 100875, China
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - Developmental Population Neuroscience Research Center, International Data Group/McGovern Institute for Brain Research, Beijing Normal University, Beijing, 100875, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Zhe Zhang
  - College of Education, Hebei Normal University, Shijiazhuang, 050024, China
- Ye He
  - School of Artificial Intelligence, Beijing University of Posts and Telecommunications, Beijing, 100876, China
- Xiaohui Hou
  - Laboratory of Cognitive Neuroscience and Education, School of Education Science, Nanning Normal University, Nanning, 530299, China
- Quan Zhou
  - State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, 100875, China
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - Developmental Population Neuroscience Research Center, International Data Group/McGovern Institute for Brain Research, Beijing Normal University, Beijing, 100875, China
- Zhu-Qing Gong
  - State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, 100875, China
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - Developmental Population Neuroscience Research Center, International Data Group/McGovern Institute for Brain Research, Beijing Normal University, Beijing, 100875, China
- Li-Zhi Cao
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Hao-Ming Dong
  - State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, 100875, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
  - Changping Laboratory, Beijing, 102206, China
- Jing-Jing Nie
  - State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, 100875, China
  - Developmental Population Neuroscience Research Center, International Data Group/McGovern Institute for Brain Research, Beijing Normal University, Beijing, 100875, China
- Li-Zhen Chen
  - State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, 100875, China
  - Developmental Population Neuroscience Research Center, International Data Group/McGovern Institute for Brain Research, Beijing Normal University, Beijing, 100875, China
- Qing Zhang
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Jia-Xin Zhang
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Lei Zhang
  - School of Government, Shanghai University of Political Science and Law, Shanghai, 201701, China
- Hui-Jie Li
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Min Bao
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Antao Chen
  - School of Psychology, Research Center for Exercise and Brain Science, Shanghai University of Sport, Shanghai, 200438, China
  - Faculty of Psychology, Southwest University, Chongqing, 400715, China
- Jing Chen
  - School of Psychology, Research Center for Exercise and Brain Science, Shanghai University of Sport, Shanghai, 200438, China
  - Faculty of Psychology, Southwest University, Chongqing, 400715, China
- Xu Chen
  - Faculty of Psychology, Southwest University, Chongqing, 400715, China
- Jinfeng Ding
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Xue Dong
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Yi Du
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Chen Feng
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Tingyong Feng
  - Faculty of Psychology, Southwest University, Chongqing, 400715, China
- Xiaolan Fu
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
- Li-Kun Ge
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Bao Hong
  - NYU-ECNU Institute of Brain and Cognitive Science at New York University Shanghai, Shanghai, 200062, China
  - School of Psychology and Cognitive Science, East China Normal University, Shanghai, 200062, China
- Xiaomeng Hu
  - Department of Psychology, Renmin University of China, Beijing, 100872, China
- Wenjun Huang
  - NYU-ECNU Institute of Brain and Cognitive Science at New York University Shanghai, Shanghai, 200062, China
  - School of Psychology and Cognitive Science, East China Normal University, Shanghai, 200062, China
- Chao Jiang
  - Beijing Key Laboratory of Learning and Cognition, School of Psychology, Capital Normal University, Beijing, 100048, China
- Li Li
  - NYU-ECNU Institute of Brain and Cognitive Science at New York University Shanghai, Shanghai, 200062, China
  - Faculty of Arts and Science, New York University Shanghai, Shanghai, 200122, China
- Qi Li
  - Beijing Key Laboratory of Learning and Cognition, School of Psychology, Capital Normal University, Beijing, 100048, China
- Su Li
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Xun Liu
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Fan Mo
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
- Jiang Qiu
  - Faculty of Psychology, Southwest University, Chongqing, 400715, China
- Xue-Quan Su
  - Laboratory of Cognitive Neuroscience and Education, School of Education Science, Nanning Normal University, Nanning, 530299, China
- Gao-Xia Wei
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Yiyang Wu
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Haishuo Xia
  - Faculty of Psychology, Southwest University, Chongqing, 400715, China
- Chao-Gan Yan
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Zhi-Xiong Yan
  - Laboratory of Cognitive Neuroscience and Education, School of Education Science, Nanning Normal University, Nanning, 530299, China
- Xiaohong Yang
  - Department of Psychology, Renmin University of China, Beijing, 100872, China
- Wenfang Zhang
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Ke Zhao
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - School of Psychology and Cognitive Science, East China Normal University, Shanghai, 200062, China
- Liqi Zhu
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
- Xi-Nian Zuo
  - State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, 100875, China
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
  - Developmental Population Neuroscience Research Center, International Data Group/McGovern Institute for Brain Research, Beijing Normal University, Beijing, 100875, China
  - CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, 100101, China
  - Laboratory of Cognitive Neuroscience and Education, School of Education Science, Nanning Normal University, Nanning, 530299, China
  - School of Education, Hunan University of Science and Technology, Xiangtan, 411201, China
  - National Basic Science Data Center, Beijing, 100190, China
3
Rodger H, Sokhn N, Lao J, Liu Y, Caldara R. Developmental eye movement strategies for decoding facial expressions of emotion. J Exp Child Psychol 2023; 229:105622. PMID: 36641829; DOI: 10.1016/j.jecp.2022.105622.
Abstract
In our daily lives, we routinely look at the faces of others to try to understand how they are feeling. Few studies have examined the perceptual strategies that are used to recognize facial expressions of emotion, and none have attempted to isolate visual information use with eye movements throughout development. Therefore, we recorded the eye movements of children from 5 years of age up to adulthood during recognition of the six "basic emotions" to investigate when perceptual strategies for emotion recognition become mature (i.e., most adult-like). Using iMap4, we identified the eye movement fixation patterns for recognition of the six emotions across age groups in natural viewing and gaze-contingent (i.e., expanding spotlight) conditions. While univariate analyses failed to reveal significant differences in fixation patterns, more sensitive multivariate distance analyses revealed a U-shaped developmental trajectory with the eye movement strategies of the 17- to 18-year-old group most similar to adults for all expressions. A developmental dip in strategy similarity was found for each emotional expression revealing which age group had the most distinct eye movement strategy from the adult group: the 13- to 14-year-olds for sadness recognition; the 11- to 12-year-olds for fear, anger, surprise, and disgust; and the 7- to 8-year-olds for happiness. Recognition performance for happy, angry, and sad expressions did not differ significantly across age groups, but the eye movement strategies for these expressions diverged for each group. Therefore, a unique strategy was not a prerequisite for optimal recognition performance for these expressions. Our data provide novel insights into the developmental trajectories underlying facial expression recognition, a critical ability for adaptive social relations.
Affiliation(s)
- Helen Rodger
  - Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, 1700 Fribourg, Switzerland
- Nayla Sokhn
  - Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, 1700 Fribourg, Switzerland
- Junpeng Lao
  - Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, 1700 Fribourg, Switzerland
- Yingdi Liu
  - Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, 1700 Fribourg, Switzerland
- Roberto Caldara
  - Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, 1700 Fribourg, Switzerland
4
Della Longa L, Nosarti C, Farroni T. Emotion Recognition in Preterm and Full-Term School-Age Children. Int J Environ Res Public Health 2022; 19:6507. PMID: 35682092; PMCID: PMC9180201; DOI: 10.3390/ijerph19116507.
Abstract
Children born preterm (<37 weeks’ gestation) show a specific vulnerability to socio-emotional difficulties, which may lead to an increased likelihood of developing behavioral and psychiatric problems in adolescence and adulthood. The accurate decoding of emotional signals from faces represents a fundamental prerequisite for early social interactions, allowing children to derive information about others’ feelings and intentions. The present study aims to explore possible differences between preterm and full-term children in the ability to detect emotional expressions, as well as possible relationships between this ability and socio-emotional skills and problem behaviors during everyday activities. We assessed 55 school-age children (n = 34 preterm and n = 21 full-term) with a cognitive battery that ensured comparable cognitive abilities between the two groups. Moreover, children were asked to identify emotional expressions from pictures of peers’ faces (Emotion Recognition Task). Finally, children’s emotional, social and behavioral outcomes were assessed with parent-reported questionnaires. The results revealed that preterm children were less accurate than full-term children in detecting positive emotional expressions, and they showed poorer social and behavioral outcomes. Notably, correlational analyses showed a relationship between the ability to recognize emotional expressions and socio-emotional functioning. The present study highlights that early difficulties in decoding emotional signals from faces may be critically linked to emotional and behavioral regulation problems, with important implications for the development of social skills and effective interpersonal interactions.
Affiliation(s)
- Letizia Della Longa
  - Developmental Psychology and Socialization Department, University of Padova, 35131 Padova, Italy
- Chiara Nosarti
  - Department of Child and Adolescent Psychiatry, King’s College London, London SE5 8AF, UK
- Teresa Farroni
  - Developmental Psychology and Socialization Department, University of Padova, 35131 Padova, Italy
5
Qu Z, Yang R, Gao L, Han Y, Su Y, Cui T, Zhang X. Social avoidance motivation tendency linked to face processing ability among 6- to 12-year-old children. Cogn Dev 2022. DOI: 10.1016/j.cogdev.2022.101178.
6
Romani-Sponchiado A, Maia CP, Torres CN, Tavares I, Arteche AX. Emotional face expressions recognition in childhood: developmental markers, age and sex effect. Cogn Process 2022; 23:467-477. PMID: 35362838; DOI: 10.1007/s10339-022-01086-1.
Abstract
Recognizing emotional facial expressions in others is a valuable non-verbal communication skill, and it is particularly relevant throughout childhood, when children's language skills are not yet fully developed but their first interactions with peers have just started. This study investigated developmental markers of emotional facial expression recognition in children and the effects of age and sex. A total of 90 children took part, split into three age groups: 6-7 years (n = 30), 8-9 years (n = 30), and 10-11 years (n = 30). Participants were shown 38 photos, at two exposure times (500 ms and 1000 ms), of children expressing happiness, sadness, anger, disgust, fear and surprise at three intensities, plus images of neutral faces. Happiness was the easiest expression to recognize, followed by disgust and surprise. As expected, the 10-11-year-old group showed the highest mean accuracy, whereas the 6-7-year-old group showed the lowest. The data did not support a female advantage.
Affiliation(s)
- Aline Romani-Sponchiado
  - Psychology Department, Pontifical Catholic University of Rio Grande Do Sul (PUCRS), Av. Ipiranga 6681, Building 11, 9th Floor, Porto Alegre, RS, 90619-900, Brazil
- Cíntia Pacheco Maia
  - Psychology Department, Pontifical Catholic University of Rio Grande Do Sul (PUCRS), Av. Ipiranga 6681, Building 11, 9th Floor, Porto Alegre, RS, 90619-900, Brazil
- Carol Nunes Torres
  - Psychology Department, Pontifical Catholic University of Rio Grande Do Sul (PUCRS), Av. Ipiranga 6681, Building 11, 9th Floor, Porto Alegre, RS, 90619-900, Brazil
- Inajá Tavares
  - Psychology Department, Pontifical Catholic University of Rio Grande Do Sul (PUCRS), Av. Ipiranga 6681, Building 11, 9th Floor, Porto Alegre, RS, 90619-900, Brazil
- Adriane Xavier Arteche
  - Psychology Department, Pontifical Catholic University of Rio Grande Do Sul (PUCRS), Av. Ipiranga 6681, Building 11, 9th Floor, Porto Alegre, RS, 90619-900, Brazil
7
Zagury-Orly I, Kroeck MR, Soussand L, Li Cohen A. Face-Processing Performance is an Independent Predictor of Social Affect as Measured by the Autism Diagnostic Observation Schedule Across Large-Scale Datasets. J Autism Dev Disord 2022; 52:674-688. PMID: 33743118; PMCID: PMC9747289; DOI: 10.1007/s10803-021-04971-4.
Abstract
Face-processing deficits, while not required for the diagnosis of autism spectrum disorder (ASD), have been associated with impaired social skills, a core feature of ASD; however, the strength and prevalence of this relationship remain unclear. Across 445 participants from the NIMH Data Archive, we examined the relationship between Benton Face Recognition Test (BFRT) performance and Autism Diagnostic Observation Schedule-Social Affect (ADOS-SA) scores. Lower BFRT scores (worse face-processing performance) were associated with higher ADOS-SA scores (greater ASD severity), a relationship that held after controlling for other factors associated with face processing, i.e., age, sex, and IQ. These findings underscore the utility of face discrimination, not just recognition of facial emotion, as a key covariate for the severity of symptoms that characterize ASD.
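The covariate-adjusted association described in this abstract can be illustrated with a small residualization sketch. This is an illustrative stand-in, not the authors' actual analysis pipeline: the variable names and simulated data are hypothetical, and the study may have used a different regression framework.

```python
import numpy as np

def partial_correlation(x, y, covariates):
    """Pearson correlation between x and y after regressing both
    on an intercept plus the covariate columns (residualization)."""
    Z = np.column_stack([np.ones(len(x)), covariates])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical simulated data: both scores share an age effect,
# plus a genuine negative link between face performance and severity.
rng = np.random.default_rng(0)
age = rng.uniform(5.0, 18.0, 200)
face_score = 0.5 * age + rng.standard_normal(200)
severity = -0.3 * face_score + 0.2 * age + rng.standard_normal(200)
r = partial_correlation(face_score, severity, age[:, None])
```

With this construction, `r` stays negative even though both raw scores increase with age, which is the "held after controlling for covariates" pattern the abstract reports.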
Affiliation(s)
- Ivry Zagury-Orly
  - Department of Neurology, Boston Children’s Hospital, Harvard Medical School, Boston, MA, USA
  - Faculty of Medicine, Université de Montréal, Montreal, QC, Canada
- Mallory R. Kroeck
  - Department of Neurology, Boston Children’s Hospital, Harvard Medical School, Boston, MA, USA
- Louis Soussand
  - Department of Neurology, Boston Children’s Hospital, Harvard Medical School, Boston, MA, USA
- Alexander Li Cohen
  - Department of Neurology, Boston Children’s Hospital, Harvard Medical School, Boston, MA, USA
  - Computational Radiology Laboratory, Department of Radiology, Boston Children’s Hospital, Harvard Medical School, Boston, MA, USA
  - Center for Brain Circuit Therapeutics, Brigham and Women’s Hospital, Harvard Medical School, Boston, MA, USA
8
Ramos-Loyo J, Olguín-Rodríguez PV, Espinosa-Denenea SE, Llamas-Alonso LA, Rivera-Tello S, Müller MF. EEG functional brain connectivity strengthens with age during attentional processing to faces in children. Front Netw Physiol 2022; 2:890906. PMID: 36926063; PMCID: PMC10013043; DOI: 10.3389/fnetp.2022.890906.
Abstract
Studying functional connectivity may generate clues to the maturational changes that occur in children, as expressed in the dynamical organization of the functional network assessed with electroencephalographic recordings (EEG). In the present study, we compared the EEG functional connectivity patterns, estimated by linear cross-correlations of electrical brain activity, of three groups of children (6, 8, and 10 years of age) performing odd-ball tasks containing facial stimuli, chosen for their importance in everyday socioemotional contexts. In the first task, the children were asked to identify the sex of the faces; in the second, to identify happy expressions. We estimated the stable correlation pattern (SCP) from the average cross-correlation matrix obtained separately for the resting state and the task conditions, and quantified the similarity of these average matrices across conditions. Accuracy improved with age. Although the topology of the SCPs was highly similar across all ages, the two older groups showed higher correlations between regions associated with the attentional and face-processing networks than the youngest group. Only in the youngest group did the similarity metric decrease during the sex condition. In general, correlation values strengthened with age and during task performance compared to rest. Our findings indicate that children show a spatially extended, stable brain network organization similar to that reported in adults. Lower similarity scores between several regions in the youngest children might indicate a lesser ability to cope with tasks. With increasing age, the brain regions associated with the attention and face networks showed higher synchronization across regions, modulated by task demands.
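The SCP approach described in this abstract can be sketched in a few lines: average the per-epoch channel-by-channel correlation matrices, then compare two such averages. This is a minimal illustration assuming epoched multichannel data and zero-lag Pearson correlations with a correlation-of-upper-triangles similarity; the study's exact preprocessing and similarity metric may differ.

```python
import numpy as np

def stable_correlation_pattern(epochs):
    """Average zero-lag cross-correlation matrix over EEG epochs.

    epochs: array of shape (n_epochs, n_channels, n_samples).
    Returns an (n_channels, n_channels) matrix."""
    return np.mean([np.corrcoef(ep) for ep in epochs], axis=0)

def scp_similarity(scp_a, scp_b):
    """Similarity of two SCPs: Pearson correlation of their
    upper-triangular (off-diagonal) entries."""
    iu = np.triu_indices_from(scp_a, k=1)
    return np.corrcoef(scp_a[iu], scp_b[iu])[0, 1]
```

For instance, an SCP computed from resting-state epochs could be compared with one computed from task epochs via `scp_similarity`, mirroring the rest-versus-task comparisons above.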
Affiliation(s)
- Julieta Ramos-Loyo
  - Instituto de Neurociencias, Universidad de Guadalajara, Guadalajara, Jalisco, México
- Paola V Olguín-Rodríguez
  - Instituto de Ciencias Nucleares, Universidad Nacional Autónoma de México, Ciudad de México, México
  - Centro de Ciencias de La Complejidad, Universidad Nacional Autónoma de México, Ciudad de México, México
- Sergio Rivera-Tello
  - Instituto de Neurociencias, Universidad de Guadalajara, Guadalajara, Jalisco, México
- Markus F Müller
  - Centro de Ciencias de La Complejidad, Universidad Nacional Autónoma de México, Ciudad de México, México
  - Centro de Investigación en Ciencias, Universidad Autónoma del Estado de Morelos, Cuernavaca, Morelos, México
  - Centro Internacional de Ciencias A. C., Cuernavaca, Morelos, México
9
Galarneau E, Colasante T, Speidel R, Malti T. Correlates of children's sympathy: Recognition and regulation of sadness and anger. Soc Dev 2021. DOI: 10.1111/sode.12577.
Affiliation(s)
- Emma Galarneau
  - Department of Psychology, University of Toronto, Toronto, Ontario, Canada
  - Centre for Child Development, Mental Health, and Policy, University of Toronto Mississauga, Mississauga, Ontario, Canada
- Tyler Colasante
  - Department of Psychology, University of Toronto, Toronto, Ontario, Canada
  - Centre for Child Development, Mental Health, and Policy, University of Toronto Mississauga, Mississauga, Ontario, Canada
- Ruth Speidel
  - Department of Psychology, University of Toronto, Toronto, Ontario, Canada
  - Centre for Child Development, Mental Health, and Policy, University of Toronto Mississauga, Mississauga, Ontario, Canada
- Tina Malti
  - Department of Psychology, University of Toronto, Toronto, Ontario, Canada
  - Centre for Child Development, Mental Health, and Policy, University of Toronto Mississauga, Mississauga, Ontario, Canada
10
Taamallah A, Halayem S, Rajhi O, Ghazzai M, Moussa M, Touati M, Ayadi HBY, Ouanes S, Abbes ZS, Hajri M, Jelili S, Fakhfakh R, Bouden A. Validation of the Tunisian Test for Facial Emotions Recognition: Study in Children From 7 to 12 Years Old. Front Psychol 2021; 12:643749. PMID: 34880800; PMCID: PMC8645551; DOI: 10.3389/fpsyg.2021.643749.
Abstract
Background: Facial expressions transmit information about emotional states, facilitating communication and regulation in interpersonal relationships. Accurate recognition of facial emotions is essential for social adaptation and is impaired in children with autism spectrum disorder (ASD). The aim of our study was to validate the "Recognition of Facial Emotions: Tunisian Test for Children" among Tunisian children, in order to later assess facial emotion recognition in children with ASD. Methods: We conducted a cross-sectional study among neurotypical children from the general population. The final version of our test consisted of a static subtest of 114 photographs and a dynamic subtest of 36 videos expressing the six basic emotions (happiness, anger, sadness, disgust, fear, and surprise), presented by actors of different ages and genders. The test items were coded according to Ekman's Facial Action Coding System. The validation study addressed content validity, construct validity, and reliability. Results: We included 116 neurotypical children aged 7 to 12 years (54 boys and 62 girls). The reliability analysis showed good internal consistency for each subtest: Cronbach's alpha was 0.88 for the static subtest and 0.85 for the dynamic subtest. Exploratory factor analysis of the emotion and intensity items showed that the distribution of items into subdomains matched their theoretical distribution. Age was significantly correlated with the mean overall score on both subtests (p < 10⁻³), whereas gender was not significantly correlated with the overall score (p = 0.15). High-intensity photographs were better recognized, and happiness was the most recognized emotion in both subtests. The overall score was significantly higher on the dynamic subtest than on the static subtest (p < 10⁻³).
Conclusion: This work provides clinicians with a reliable tool for assessing recognition of facial emotions in typically developing children.
Affiliation(s)
- Soumeyya Halayem
- Hôpital Razi, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Sami Ouanes
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Department of Psychiatry, Hamad Medical Corporation, Doha, Qatar
- Zeineb S. Abbes
- Hôpital Razi, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Melek Hajri
- Hôpital Razi, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Selima Jelili
- Hôpital Razi, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Asma Bouden
- Hôpital Razi, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
11
Vesker M, Bahn D, Kauschke C, Schwarzer G. Developmental Changes in Gaze Behavior and the Effects of Auditory Emotion Word Priming in Emotional Face Categorization. Multisens Res 2021; 35:1-21. PMID: 34534967; DOI: 10.1163/22134808-bja10063
Abstract
Social interactions often require the simultaneous processing of emotions from facial expressions and speech. However, the development of the gaze behavior used for emotion recognition, and the effects of speech perception on the visual encoding of facial expressions, are less well understood. We therefore conducted a word-primed face categorization experiment, in which participants from multiple age groups (six-year-olds, 12-year-olds, and adults) categorized target facial expressions as positive or negative after priming with valence-congruent or -incongruent auditory emotion words, or no words at all. We recorded our participants' gaze behavior during this task using an eye-tracker, and analyzed the data with respect to the fixation time toward the eyes and mouth regions of faces, as well as the time until participants made their first fixation within those regions (time to first fixation, TTFF). We found that the six-year-olds showed significantly higher accuracy in categorizing congruently primed faces compared to the other conditions. The six-year-olds also showed faster response times, shorter total fixation durations, and faster TTFF measures in all primed trials, regardless of congruency, as compared to unprimed trials. We also found that while adults looked first, and longer, at the eyes as compared to the mouth regions of target faces, children did not exhibit this gaze behavior. Our results thus indicate that young children are more sensitive than adults or older children to auditory emotion word primes during the perception of emotional faces, and that the distribution of gaze across the regions of the face changes significantly from childhood to adulthood.
Affiliation(s)
- Michael Vesker
- Department of Developmental Psychology, Justus-Liebig-Universität Gießen, 35394 Giessen, Germany
- Daniela Bahn
- Clinical Linguistics, Department of German Linguistics, Philipps-Universität Marburg, 35032 Marburg, Germany
- Christina Kauschke
- Clinical Linguistics, Department of German Linguistics, Philipps-Universität Marburg, 35032 Marburg, Germany
- Gudrun Schwarzer
- Department of Developmental Psychology, Justus-Liebig-Universität Gießen, 35394 Giessen, Germany
12
Kuć J, Szarejko KD, Gołȩbiewska M. Smiling, Yawning, Jaw Functional Limitations and Oral Behaviors With Respect to General Health Status in Patients With Temporomandibular Disorder-Myofascial Pain With Referral. Front Neurol 2021; 12:646293. PMID: 34108927; PMCID: PMC8182059; DOI: 10.3389/fneur.2021.646293
Abstract
Background: The temporomandibular joint is one of the most important joints in the human body. It enables numerous orofacial functions such as mastication, swallowing, breathing, speech, emotional communication, and facial expression. The aim of the study was to evaluate the prevalence of jaw functional limitations and oral behaviors with respect to general health status in patients with temporomandibular joint disorders (myofascial pain with referral). Materials and methods: The study group consisted of 50 individuals (37 females and 13 males) with complete natural dentition; the mean age was 23.36 years (standard error ± 0.30). All subjects underwent clinical examination and were diagnosed with myofascial pain with referral according to the Diagnostic Criteria for Temporomandibular Disorders. The survey comprised the Jaw Functional Limitation Scale-8 (JFLS-8), Jaw Functional Limitation Scale-20 (JFLS-20), Patient Health Questionnaire-4 (PHQ-4), Patient Health Questionnaire-9 (PHQ-9), Generalized Anxiety Disorder-7 (GAD-7), Patient Health Questionnaire-15 (PHQ-15), and Oral Behaviors Checklist (OBC). Results: The most common functional problems in the entire study group were chewing tough food and yawning. With respect to gender, statistically significant differences were noted for chewing tough food and smiling (p = 0.015451 and p = 0.035978, respectively); after Bonferroni correction and the Benjamini-Hochberg procedure, however, these differences were not statistically significant. There were no statistically significant differences in mastication, mandibular mobility, verbal and emotional communication, or global limitations (p > 0.05). Over half (56%) of the respondents had depression of varying severity, somatic symptoms of different severity were found in 78% of the patients, and 44% of the respondents declared anxiety disorders.
The mean Oral Behaviors Checklist score (OBC = 27.18) indicated a high tendency toward developing craniomandibular disorders. Conclusion: Patients with myofascial pain with referral demonstrated a disturbed biopsychosocial profile. Restrictions in yawning and smiling, as well as limitations in mastication, mobility, and verbal and emotional communication, appear to be significant predictors of craniomandibular dysfunction. Depression, stress, and somatic disorders are important factors predisposing patients to the occurrence of myofascial pain with referral. The progression of oral behaviors may indicate a role of somatosensory amplification.
Affiliation(s)
- Joanna Kuć
- Department of Prosthodontics, Medical University of Bialystok, Białystok, Poland
- Maria Gołȩbiewska
- Department of Dental Techniques, Medical University of Bialystok, Białystok, Poland
13
Kawahara M, Sauter DA, Tanaka A. Culture shapes emotion perception from faces and voices: changes over development. Cogn Emot 2021; 35:1175-1186. PMID: 34000966; DOI: 10.1080/02699931.2021.1922361
Abstract
The perception of multisensory emotion cues is affected by culture. For example, East Asians rely more on vocal than on facial affective cues, compared with Westerners. However, it is unknown whether these cultural differences exist in childhood, and if not, which processing style children exhibit. The present study tested East Asian and Western children, as well as adults from both cultural backgrounds, to probe cross-cultural similarities and differences at different ages, and to establish the weighting of each modality at each age. Participants were simultaneously shown a face and a voice expressing either congruent or incongruent emotions, and were asked to judge whether the person was happy or angry. Replicating previous research, East Asian adults relied more on vocal cues than did Western adults. Young children from both cultural groups, however, behaved like Western adults, relying primarily on visual information. The proportion of responses based on vocal cues increased with age in East Asian, but not Western, participants. These results suggest that culture is an important factor in developmental changes in the perception of facial and vocal affective information.
Affiliation(s)
- Misako Kawahara
- Department of Psychology, Tokyo Woman's Christian University, Tokyo, Japan
- Japan Society for the Promotion of Science, Kojimachi Business Center Building, Tokyo, Japan
- Disa A Sauter
- Department of Psychology, University of Amsterdam, Amsterdam, Netherlands
- Akihiro Tanaka
- Department of Psychology, Tokyo Woman's Christian University, Tokyo, Japan
14
Jelili S, Halayem S, Taamallah A, Ennaifer S, Rajhi O, Moussa M, Ghazzei M, Nabli A, Ouanes S, Abbes Z, Hajri M, Fakhfakh R, Bouden A. Impaired Recognition of Static and Dynamic Facial Emotions in Children With Autism Spectrum Disorder Using Stimuli of Varying Intensities, Different Genders, and Age Ranges Faces. Front Psychiatry 2021; 12:693310. PMID: 34489754; PMCID: PMC8417587; DOI: 10.3389/fpsyt.2021.693310
Abstract
A large body of research on facial emotion recognition (FER) in autism spectrum disorder (ASD) has been published over the past several years. However, these studies have mainly used static, high-intensity stimuli, including adult and/or child facial emotions. The current study investigated FER in children with ASD using an innovative task composed of a combination of static (114 pictures) and dynamic (36 videos) subtests, including child, adolescent, and adult male and female faces with high, medium, and low intensities of the basic facial emotions, plus a neutral expression. The ASD group consisted of 45 verbal Tunisian children, and the control group consisted of 117 typically developing Tunisian children; both groups were aged 7-12 years. After adjusting for sex, age, mental age, and school grade, the ASD group scored lower than controls on all tests except the recognition of happiness and fear in the static subtest and the recognition of happiness, fear, and sadness in the dynamic subtest (p ≥ 0.05). In the ASD group, the total scores on both the static and dynamic subtests were positively correlated with school grade (p < 0.001), but not with age or mental age. Children with ASD recognized facial emotions better in children's faces than in adults' and adolescents' faces, on both videos and photos (p < 0.001). Impairments in FER can have a negative impact on a child's social development. Thus, the creation of new intervention instruments aiming to improve emotion recognition strategies in individuals with ASD at an early stage seems fundamental.
Affiliation(s)
- Selima Jelili
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Soumeyya Halayem
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Amal Taamallah
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Selima Ennaifer
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Olfa Rajhi
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Mohamed Moussa
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Melek Ghazzei
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Ahmed Nabli
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Sami Ouanes
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Department of Psychiatry, Hamad Medical Corporation, Doha, Qatar
- Zeineb Abbes
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Malek Hajri
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
- Asma Bouden
- Department of Child and Adolescent Psychiatry, Razi Hospital, Manouba, Tunisia
- Faculty of Medicine, Tunis El Manar University, Tunis, Tunisia
15
Samaey C, Van der Donck S, van Winkel R, Boets B. Facial Expression Processing Across the Autism-Psychosis Spectra: A Review of Neural Findings and Associations With Adverse Childhood Events. Front Psychiatry 2020; 11:592937. PMID: 33281648; PMCID: PMC7691238; DOI: 10.3389/fpsyt.2020.592937
Abstract
Autism spectrum disorder (ASD) and primary psychosis are classified as distinct neurodevelopmental disorders, yet they display overlapping epidemiological, environmental, and genetic components as well as endophenotypic similarities. For instance, both disorders are characterized by impairments in facial expression processing, a crucial skill for effective social communication, and both disorders display an increased prevalence of adverse childhood events (ACE). This narrative review provides a brief summary of findings from neuroimaging studies investigating facial expression processing in ASD and primary psychosis with a focus on the commonalities and differences between these disorders. Individuals with ASD and primary psychosis activate the same brain regions as healthy controls during facial expression processing, albeit to a different extent. Overall, both groups display altered activation in the fusiform gyrus and amygdala as well as altered connectivity among the broader face processing network, probably indicating reduced facial expression processing abilities. Furthermore, delayed or reduced N170 responses have been reported in ASD and primary psychosis, but the significance of these findings is questioned, and alternative frequency-tagging electroencephalography (EEG) measures are currently explored to capture facial expression processing impairments more selectively. Face perception is an innate process, but it is also guided by visual learning and social experiences. Extreme environmental factors, such as adverse childhood events, can disrupt normative development and alter facial expression processing. ACE are hypothesized to induce altered neural facial expression processing, in particular a hyperactive amygdala response toward negative expressions. 
Future studies should account for the comorbidity among ASD, primary psychosis, and ACE when assessing facial expression processing in these clinical groups, as it may explain some of the inconsistencies and confounds reported in the field.
Affiliation(s)
- Celine Samaey
- Department of Neurosciences, Center for Clinical Psychiatry, KU Leuven, Leuven, Belgium
- Stephanie Van der Donck
- Department of Neurosciences, Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
- Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
- Ruud van Winkel
- Department of Neurosciences, Center for Clinical Psychiatry, KU Leuven, Leuven, Belgium
- University Psychiatric Center (UPC), KU Leuven, Leuven, Belgium
- Bart Boets
- Department of Neurosciences, Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
- Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
16
Thompson A, Steinbeis N. Computational modelling of attentional bias towards threat in paediatric anxiety. Dev Sci 2020; 24:e13055. PMID: 33098719; PMCID: PMC8244064; DOI: 10.1111/desc.13055
Abstract
Computational modelling can be used to precisely characterize the cognitive processes involved in attentional biases towards threat, yet so far has only been applied in the context of adult anxiety. Furthermore, studies investigating attentional biases in childhood anxiety have largely used tasks that conflate automatic and controlled attentional processes. By using a perceptual load paradigm, we separately investigate contributions from automatic and controlled processes to attentional biases towards negative stimuli and their association with paediatric anxiety. We also use computational modelling to investigate these mechanisms in children for the first time. In a sample of 60 children (aged 5-11 years) we used a perceptual load task specifically adapted for children, in order to investigate attentional biases towards fearful (compared with happy and neutral) faces. Outcome measures were reaction time and percentage accuracy. We applied a drift diffusion model to investigate the precise cognitive mechanisms involved. The load effect was associated with significant differences in response time, accuracy, and the diffusion modelling parameters drift rate and extra-decisional time. Greater anxiety was associated with greater accuracy and a higher drift rate on the fearful face trials. This was specific to the high load condition. These findings suggest that attentional biases towards fearful faces in childhood anxiety are driven by increased perceptual sensitivity towards fear in automatic attentional systems. Our findings from computational modelling suggest that current attention bias modification treatments should target perceptual encoding directly rather than processes occurring afterwards.
Affiliation(s)
- Abigail Thompson
- Department of Clinical, Educational and Health Psychology, UCL, London, UK
- Nikolaus Steinbeis
- Department of Clinical, Educational and Health Psychology, UCL, London, UK
17
Padhy SK, Rina K, Sarkar S. Smile, grimace or grin? Recalibrating psychiatrist-patient interaction in the era of face masks. Asian J Psychiatr 2020; 53:102389. PMID: 32890982; PMCID: PMC7451215; DOI: 10.1016/j.ajp.2020.102389
Affiliation(s)
- Susanta Kumar Padhy
- Department of Psychiatry, All India Institute of Medical Sciences, Bhubaneswar, 751019, India
- Kumari Rina
- Department of Psychiatry, All India Institute of Medical Sciences, Bhubaneswar, 751019, India
- Siddharth Sarkar
- Department of Psychiatry and National Drug Dependence Treatment Centre, All India Institute of Medical Sciences, New Delhi, 110029, India
18
Vestergaard M, Kongerslev MT, Thomsen MS, Mathiesen BB, Harmer CJ, Simonsen E, Miskowiak KW. Women With Borderline Personality Disorder Show Reduced Identification of Emotional Facial Expressions and a Heightened Negativity Bias. J Pers Disord 2020; 34:677-698. PMID: 30689504; DOI: 10.1521/pedi_2019_33_409
Abstract
Individuals with borderline personality disorder (BPD) frequently display impairments in the identification of emotional facial expressions paralleled by a negativity bias. However, it remains unclear whether misperception of facial expressions is a key psychopathological marker of BPD. To address this question, the authors examined 43 women diagnosed with BPD and 56 healthy female controls using an emotion face identification task and a face dot-probe task together with measures on psychopathology. Compared to controls, women with BPD showed impaired identification of disgusted and angry faces concurrent with a bias to misclassify faces as angry, and a faster preconscious vigilance for fearful relative to happy facial expressions. Increased severity of borderline symptoms and global psychopathology in BPD patients were associated with reduced ability to identify angry facial expressions and a stronger negativity bias to anger. The findings indicate that BPD patients who misperceive face emotions have the greatest mental health issues.
Affiliation(s)
- Mickey T Kongerslev
- Psychiatric Research Unit, Psychiatry Region Zealand, Denmark
- Department of Psychology, University of Southern Denmark, Denmark
- Psychiatric Clinic, Psychiatry Roskilde, Region Zealand, Denmark
- Marianne S Thomsen
- Psychiatric Research Unit, Psychiatry Region Zealand, Denmark
- Psychiatric Clinic, Psychiatry Roskilde, Region Zealand, Denmark
- Department of Psychology, University of Copenhagen, Denmark
- Erik Simonsen
- Psychiatric Research Unit, Psychiatry Region Zealand, Denmark
- Institute of Clinical Medicine, Faculty of Health and Medical Sciences, University of Copenhagen, Denmark
- Kamilla W Miskowiak
- Department of Psychology, University of Copenhagen, Denmark
- Copenhagen Affective Disorder Research Centre, Psychiatric Center Copenhagen, Rigshospitalet, Copenhagen, Denmark
19
Van der Donck S, Dzhelyova M, Vettori S, Mahdi SS, Claes P, Steyaert J, Boets B. Rapid neural categorization of angry and fearful faces is specifically impaired in boys with autism spectrum disorder. J Child Psychol Psychiatry 2020; 61:1019-1029. PMID: 32003011; PMCID: PMC7496330; DOI: 10.1111/jcpp.13201
Abstract
Background: Difficulties with facial expression processing may be associated with the characteristic social impairments in individuals with autism spectrum disorder (ASD). Emotional face processing in ASD has been investigated in an abundance of behavioral and EEG studies, yielding, however, mixed and inconsistent results. Methods: We combined fast periodic visual stimulation (FPVS) with EEG to assess the neural sensitivity for implicitly detecting briefly presented facial expressions among a stream of neutral faces in 23 boys with ASD and 23 matched typically developing (TD) boys. Neutral faces with different identities were presented at 6 Hz, periodically interleaved with an expressive face (angry, fearful, happy, or sad, in separate sequences) as every fifth image (i.e., a 1.2 Hz oddball frequency). These distinguishable frequency tags for neutral and expressive stimuli allowed direct and objective quantification of the expression-categorization responses, requiring only four 60 s recording sequences per condition. Results: Both groups showed equal neural synchronization to the general face stimulation and similar neural responses to happy and sad faces. However, the ASD group displayed significantly reduced responses to angry and fearful faces compared to TD boys. At the individual-subject level, these neural responses predicted membership of the ASD group with an accuracy of 87%. Whereas TD participants showed a significantly lower sensitivity to sad faces than to the other expressions, ASD participants showed an equally low sensitivity to all expressions. Conclusions: Our results indicate an emotion-specific processing deficit rather than a general emotion-processing problem: boys with ASD are less sensitive than TD boys at rapidly and implicitly detecting angry and fearful faces. The implicit, fast, and straightforward nature of FPVS-EEG opens new perspectives for clinical diagnosis.
Affiliation(s)
- Stephanie Van der Donck
- Department of Neurosciences, Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
- Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
- Milena Dzhelyova
- Institute of Research in Psychological Sciences, Institute of Neuroscience, University of Louvain, Louvain-la-Neuve, Belgium
- Sofie Vettori
- Department of Neurosciences, Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
- Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
- Soha Sadat Mahdi
- Department of Neurosciences, Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
- Medical Imaging Research Center (MIRC), UZ Leuven, Leuven, Belgium
- Peter Claes
- Medical Imaging Research Center (MIRC), UZ Leuven, Leuven, Belgium
- Department of Electrical Engineering (ESAT/PSI), KU Leuven, Leuven, Belgium
- Department of Human Genetics, KU Leuven, Leuven, Belgium
- Jean Steyaert
- Department of Neurosciences, Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
- Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
- Bart Boets
- Department of Neurosciences, Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
- Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
20
Garcia SE, Tully EC. Children's recognition of happy, sad, and angry facial expressions across emotive intensities. J Exp Child Psychol 2020; 197:104881. PMID: 32559635; DOI: 10.1016/j.jecp.2020.104881
Abstract
Sharing emotional experiences is a key task that requires accurate recognition of peers' emotions during middle childhood. Existing research suggests that children are proficient at discerning emotion from facial expressions during middle childhood, but this research has focused on recognition of adults' intense emotional expressions. In this study, facial emotion recognition for children's happy, sad, and angry expressions across low, medium, and high intensities was measured in a sample of 7- to 10-year-old children (N = 80; 53% female) to quantify overall accurate recognition as well as inaccuracies, including identifying an emotion as present when it is not (false alarms) and failing to identify an emotion when present (miss rate). Children's recognition accuracy for low-threshold happiness, sadness, and anger was quite poor but improved in a cubic fashion as expression intensity increased, with dramatic improvements across medium-intensity expressions, and little further improvement across high-intensity expressions. A positivity bias was evident; children were more accurate at recognizing happiness than at recognizing sadness and anger, rarely failed to identify happiness when present, and tended to mislabel expressions as happy rather than as angry or sad. Children were generally better at recognizing anger compared with sadness but were more accurate at recognizing subtle sadness compared with anger, which appeared to be due to children missing subtle anger when present. The findings are discussed with regard to the functionality of others' happiness for signaling positive socializing opportunities, anger for signaling threatening interactions, and sadness for prompting prosocial action and with regard to how children's facial emotion recognition may affect general socioemotional development.
Affiliation(s)
- Sarah E Garcia
- Department of Psychology, Georgia State University, Atlanta, GA 30302, USA
- Erin C Tully
- Department of Psychology, Georgia State University, Atlanta, GA 30302, USA
21
López-Morales H, Zabaletta V, Vivas L, López MC. Reconocimiento de Expresiones Faciales Emocionales. Diferencias en el Desarrollo [Recognition of Emotional Facial Expressions: Developmental Differences]. Psicologia: Teoria e Pesquisa 2020. DOI: 10.1590/0102.3772e3626
Abstract
This study characterized facial emotion recognition in a child and adolescent population. A digital adaptation of the Pictures of Facial Affect test was administered to 147 participants aged 9 to 18 years. The results showed a negative association between age and the hit rate for happiness, and a positive association for disgust and fear. In addition, there was a significant effect of age on response times for all emotions except fear. The results suggest that emotion recognition becomes faster with age; however, this is reflected in improved recognition accuracy only for disgust and fear. The importance of these emotions during adolescence is discussed.
|
22
|
Peper JS, Burke SM, Wierenga LM. Sex differences and brain development during puberty and adolescence. Handb Clin Neurol 2020; 175:25-54. PMID: 33008529. DOI: 10.1016/b978-0-444-64123-6.00003-5.
Abstract
Whether sex differences in behavior are related to sex differences in brain development has been a longstanding topic of debate. Presumably, sex differences can provide critically important leads for explaining the etiology of various illnesses that (i) show large sex differences in prevalence and (ii) have an origin before or during adolescence. The general aim of this chapter is to provide an overview of scientific studies on sex differences in normative brain and behavioral development across puberty and adolescence, including the (sex) hormone-driven transition phase of puberty. Moreover, we describe the literature on brain and behavioral development in gender dysphoria, a severe and persistent incongruence between the self-identified gender and the assigned sex at birth. From the literature it becomes clear there is evidence for a specific link between pubertal maturation and developmental changes in arousal, motivation, and emotion. However, this link is rather similar between boys and girls. Moreover, although there is substantial evidence for sex differences in mean brain structure, these have not always been linked to sex differences in behavior, cognition, or psychopathology. Furthermore, there is little evidence for sex differences in brain development; thus, studies so far have been unable to explain sex differences in cognition. Suggestions for future research and methodologic considerations are provided.
Affiliation(s)
- Jiska S Peper
- Department of Psychology, Leiden University, Leiden, The Netherlands
- Sarah M Burke
- Department of Psychology, Leiden University, Leiden, The Netherlands
- Lara M Wierenga
- Department of Psychology, Leiden University, Leiden, The Netherlands
|
23
|
Abstract
This study examined socio-emotional skills in children with autism spectrum disorders (ASD) compared to typically developing (TD) children, using a facial emotion recognition (FER) task featuring familiar and unfamiliar faces. Results showed that the TD children were more proficient on the FER overall, whereas ASD children recognized familiar expressions more precisely than unfamiliar ones. Further, ASD children did not differ from TD children in recognizing happy expressions but were less skilled at recognizing negative expressions. Findings suggest that ASD children possess more adept FER abilities than previously thought, especially for important social others. Ultimately, a task featuring an array of positive and negative familiar and unfamiliar expressions may provide a more comprehensive assessment of socio-emotional abilities in ASD children.
|
24
|
Children with facial paralysis due to Moebius syndrome exhibit reduced autonomic modulation during emotion processing. J Neurodev Disord 2019; 11:12. PMID: 31291910; PMCID: PMC6617955. DOI: 10.1186/s11689-019-9272-2.
Abstract
BACKGROUND Facial mimicry is crucial in the recognition of others' emotional state. Thus, the observation of others' facial expressions activates the same neural representation of that affective state in the observer, along with related autonomic and somatic responses. What happens, therefore, when someone cannot mimic others' facial expressions? METHODS We investigated whether psychophysiological emotional responses to others' facial expressions were impaired in 13 children (9 years) with Moebius syndrome (MBS), an extremely rare neurological disorder (1/250,000 live births) characterized by congenital facial paralysis. We inspected autonomic responses and vagal regulation through facial cutaneous thermal variations and by the computation of respiratory sinus arrhythmia (RSA). These parameters provide measures of emotional arousal and show the autonomic adaptation to others' social cues. Physiological responses in children with MBS were recorded during dynamic facial expression observation and were compared to those of a control group (16 non-affected children, 9 years). RESULTS There were significant group effects on thermal patterns and RSA, with lower values in children with MBS. We also observed a mild deficit in emotion recognition in these patients. CONCLUSION Results support "embodied" theory, whereby the congenital inability to produce facial expressions induces alterations in the processing of facial expression of emotions. Such alterations may constitute a risk for emotion dysregulation.
|
25
|
Covic A, von Steinbüchel N, Kiese-Himmel C. Emotion Recognition in Kindergarten Children. Folia Phoniatr Logop 2019; 72:273-281. PMID: 31256156. DOI: 10.1159/000500589.
Abstract
BACKGROUND/AIMS Recognition and understanding of emotions are essential skills in nonverbal communication and in everyday social functioning. These are already evident in infancy. We aimed to compare how young children recognize facial emotional expressions from static faces versus vocal emotional expressions from speech prosody. METHODS Participants were 313 kindergarten children (162 girls, mean age = 51.01, SD 9.65 months; range 36-72). The design consisted of a visual and an auditory block (with 45 randomized trials each). Children were seated in front of a 14-inch laptop monitor and received visual stimuli (photos of faces) or auditory stimuli (spoken sentences) via loudspeakers. RESULTS Recognizing emotions from looking at static faces was found to be easier compared to interpreting emotions transmitted by speech prosody alone. The ability to interpret emotions from both faces and speech prosody increased with age. It was easier to identify a "happy" emotion from a facial expression than an "angry" or "sad" one, whereas a "sad" emotion could be more easily recognized from speech prosody alone than facial imagery alone. Girls were significantly better than boys in identifying "sad" facial expressions. CONCLUSION The results of the study are discussed in terms of educational implications for nonverbal communication.
Affiliation(s)
- Amra Covic
- Institute of Medical Psychology and Medical Sociology, University of Medicine Göttingen, Göttingen, Germany
- Nicole von Steinbüchel
- Institute of Medical Psychology and Medical Sociology, University of Medicine Göttingen, Göttingen, Germany
- Christiane Kiese-Himmel
- Institute of Medical Psychology and Medical Sociology, University of Medicine Göttingen, Göttingen, Germany; Phoniatric and Pediatric Audiological Psychology, University of Medicine Göttingen, Göttingen, Germany
|
26
|
More than blindsight: Case report of a child with extraordinary visual capacity following perinatal bilateral occipital lobe injury. Neuropsychologia 2019; 128:178-186. DOI: 10.1016/j.neuropsychologia.2017.11.017.
|
27
|
A Deep-Learning Model for Subject-Independent Human Emotion Recognition Using Electrodermal Activity Sensors. Sensors 2019; 19:1659. PMID: 30959956; PMCID: PMC6479880. DOI: 10.3390/s19071659.
Abstract
One of the main objectives of Active and Assisted Living (AAL) environments is to ensure that elderly and/or disabled people perform/live well in their immediate environments; this can be monitored by, among other means, the recognition of emotions based on minimally intrusive sensors such as Electrodermal Activity (EDA) sensors. However, designing a learning system or building a machine-learning model to recognize human emotions while training the system on one group of persons and testing it on an entirely new group is still a serious challenge in the field, as the test group may exhibit different emotion patterns. Accordingly, the purpose of this paper is to contribute to the field of human emotion recognition by proposing a Convolutional Neural Network (CNN) architecture which ensures promising robustness-related results for both subject-dependent and subject-independent human emotion recognition. The CNN model was trained using grid search, a hyperparameter optimization technique, to fine-tune the parameters of the proposed architecture. The overall concept's performance is validated and stress-tested using the MAHNOB and DEAP datasets. The results demonstrate a promising robustness improvement across various evaluation metrics: accuracy increased to 78% and 82% for subject-independent classification on MAHNOB and DEAP, respectively, and to 81% and 85% for subject-dependent classification (4 classes/labels). The work shows clearly that, using solely non-intrusive EDA sensors, robust classification of human emotion is possible even without involving additional physiological signals.
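The grid-search step this abstract describes can be sketched in a few lines: enumerate every hyperparameter combination, score each on held-out data, and keep the best. The parameter names and the scoring stub below are illustrative assumptions, not the paper's actual CNN or search space:

```python
import itertools
import random

# Hypothetical search space; the paper's real grid for its CNN is not given here.
grid = {
    "learning_rate": [1e-3, 1e-4],
    "batch_size": [32, 64],
    "kernel_size": [3, 5],
}

def validation_score(params, rng):
    # Stand-in for "train the CNN on EDA recordings, score on held-out
    # subjects"; a real implementation would return validation accuracy.
    return rng.random()

rng = random.Random(0)
results = []
for combo in itertools.product(*grid.values()):
    params = dict(zip(grid, combo))
    results.append((validation_score(params, rng), params))

# Keep the combination with the best validation score.
best_score, best_params = max(results, key=lambda r: r[0])
```

In practice, `validation_score` trains and evaluates the model once per combination, which is why exhaustive grid search is usually reserved for small search spaces like this 2x2x2 one.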
|
28
|
De Stefani E, Nicolini Y, Belluardo M, Ferrari PF. Congenital facial palsy and emotion processing: The case of Moebius syndrome. Genes Brain Behav 2019; 18:e12548. PMID: 30604920. DOI: 10.1111/gbb.12548.
Abstract
According to the Darwinian perspective, facial expressions of emotions evolved to quickly communicate emotional states and serve adaptive functions that promote social interactions. Embodied cognition theories suggest that we understand others' emotions by reproducing the perceived expression in our own facial musculature (facial mimicry), and that the mere observation of a facial expression can evoke the corresponding emotion in the perceiver. Consequently, the inability to form facial expressions would affect the experience of emotional understanding. In this review, we aimed to provide an account of the link between the lack of emotion production and the mechanisms of emotion processing. We address this issue by considering Moebius syndrome, a rare neurological disorder that primarily affects the muscles controlling facial expressions. Individuals with Moebius syndrome are born with facial paralysis and an inability to form facial expressions. This makes them the ideal population for studying whether facial mimicry is necessary for emotion understanding. Here, we discuss the ambiguous/mixed behavioral results on emotion recognition deficits in Moebius syndrome, which suggest the need to investigate further aspects of emotional processing, such as the physiological responses associated with emotional experience during developmental age.
Affiliation(s)
- Elisa De Stefani
- Department of Medicine and Surgery, University of Parma, Parma, Italy
- Ylenia Nicolini
- Department of Medicine and Surgery, University of Parma, Parma, Italy
- Mauro Belluardo
- Department of Medicine and Surgery, University of Parma, Parma, Italy
- Pier Francesco Ferrari
- Department of Medicine and Surgery, University of Parma, Parma, Italy; Institut des Sciences Cognitives Marc Jeannerod, CNRS, Université de Lyon, Lyon, France
|
29
|
Verpaalen IAM, Bijsterbosch G, Mobach L, Bijlstra G, Rinck M, Klein AM. Validating the Radboud faces database from a child's perspective. Cogn Emot 2019; 33:1531-1547. PMID: 30744534. DOI: 10.1080/02699931.2019.1577220.
Abstract
Facial expressions play a central role in diverse areas of psychology. However, facial stimuli are often validated only by adults, and there are no face databases validated by school-aged children. Validation by children is important because children are still developing emotion recognition skills and may have different perceptions than adults. Therefore, in this study, we validated the adult Caucasian faces of the Radboud Faces Database (RaFD) in 8- to 12-year-old children (N = 652). Additionally, children rated valence, clarity, and model attractiveness. Emotion recognition rates were relatively high (72%, compared to 82% in the original validation by adults). Recognition accuracy was highest for happiness, below average for fear and disgust, and lowest for contempt. Children showed roughly the same emotion recognition pattern as adults, but were less accurate in distinguishing similar emotions. As expected, 10- to 12-year-old children generally had higher emotion recognition accuracy than 8- and 9-year-olds. Overall, girls slightly outperformed boys. More nuanced differences in these gender and age effects on recognition rates were visible per emotion. The current study provides researchers with recommendations on how to use the RaFD adult pictures in child studies. Researchers can select appropriate stimuli for their research using the validation data available online.
Affiliation(s)
- Iris A M Verpaalen
- Behavioural Science Institute, Radboud University, Nijmegen, The Netherlands
- Geraly Bijsterbosch
- Behavioural Science Institute, Radboud University, Nijmegen, The Netherlands
- Lynn Mobach
- Behavioural Science Institute, Radboud University, Nijmegen, The Netherlands
- Gijsbert Bijlstra
- Behavioural Science Institute, Radboud University, Nijmegen, The Netherlands
- Mike Rinck
- Behavioural Science Institute, Radboud University, Nijmegen, The Netherlands
- Anke M Klein
- Behavioural Science Institute, Radboud University, Nijmegen, The Netherlands; Developmental Psychology, Universiteit van Amsterdam, Amsterdam, The Netherlands
|
30
|
Prada M, Garrido MV, Camilo C, Rodrigues DL. Subjective ratings and emotional recognition of children's facial expressions from the CAFE set. PLoS One 2018; 13:e0209644. PMID: 30589868; PMCID: PMC6307702. DOI: 10.1371/journal.pone.0209644.
Abstract
Access to validated stimuli depicting children's facial expressions is useful for different research domains (e.g., developmental, cognitive or social psychology). Yet, such databases are scarce in comparison to others portraying adult models, and validation procedures are typically restricted to emotional recognition accuracy. This work presents subjective ratings for a sub-set of 283 photographs selected from the Child Affective Facial Expression set (CAFE [1]). Extending beyond the original emotion recognition accuracy norms [2], our main goal was to validate this database across eight subjective dimensions related to the model (e.g., attractiveness, familiarity) or the specific facial expression (e.g., intensity, genuineness), using a sample from a different nationality (N = 450 Portuguese participants). We also assessed emotion recognition (forced-choice task with seven options: anger, disgust, fear, happiness, sadness, surprise and neutral). Overall results show that most photographs were rated as highly clear, genuine and intense facial expressions. The models were rated as both moderately familiar and likely to belong to the in-group, obtaining high attractiveness and arousal ratings. Results also showed that, similarly to the original study, the facial expressions were accurately recognized. Normative and raw data are available as supplementary material at https://osf.io/mjqfx/.
Affiliation(s)
- Marília Prada
- Department of Social and Organizational Psychology, Instituto Universitário de Lisboa (ISCTE-IUL), CIS-IUL, Lisboa, Portugal
- Margarida V. Garrido
- Department of Social and Organizational Psychology, Instituto Universitário de Lisboa (ISCTE-IUL), CIS-IUL, Lisboa, Portugal
- Cláudia Camilo
- Department of Social and Organizational Psychology, Instituto Universitário de Lisboa (ISCTE-IUL), CIS-IUL, Lisboa, Portugal
- David L. Rodrigues
- Department of Social and Organizational Psychology, Instituto Universitário de Lisboa (ISCTE-IUL), CIS-IUL, Lisboa, Portugal
|
31
|
Grossheinrich N, Firk C, Schulte-Rüther M, von Leupoldt A, Konrad K, Huestegge L. Looking While Unhappy: A Mood-Congruent Attention Bias Toward Sad Adult Faces in Children. Front Psychol 2018; 9:2577. PMID: 30618993; PMCID: PMC6312126. DOI: 10.3389/fpsyg.2018.02577.
Abstract
A negative mood-congruent attention bias has been consistently observed, for example, in clinical studies on major depression. This bias is assumed to be dysfunctional in that it supports maintaining a sad mood, whereas a potentially adaptive role has largely been neglected. Previous experiments involving sad mood induction techniques found a negative mood-congruent attention bias specifically for young individuals, explained by an adaptive need for information transfer in the service of mood regulation. In the present study we investigated the attentional bias in typically developing children (aged 6–12 years) when happy and sad moods were induced. Crucially, we manipulated the age (adult vs. child) of the displayed pairs of facial expressions depicting sadness, anger, fear and happiness. The results indicate that sad children indeed exhibited a mood specific attention bias toward sad facial expressions. Additionally, this bias was more pronounced for adult faces. Results are discussed in the context of an information gain which should be stronger when looking at adult faces due to their more expansive life experience. These findings bear implications for both research methods and future interventions.
Affiliation(s)
- Nicola Grossheinrich
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry and Psychotherapy, University Hospital of the RWTH Aachen, Aachen, Germany; Neurophysiological Section, Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, Medical Faculty, University of Cologne, Cologne, Germany; Department of Social Sciences, Institute of Health Research and Social Psychiatry, Catholic University of Applied Sciences of North Rhine-Westphalia, Cologne, Germany
- Christine Firk
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry and Psychotherapy, University Hospital of the RWTH Aachen, Aachen, Germany
- Martin Schulte-Rüther
- Translational Brain Medicine in Psychiatry and Neurology, Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, JARA Brain Translational Medicine, University Hospital of the RWTH Aachen, Jülich, Germany
- Kerstin Konrad
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry and Psychotherapy, University Hospital of the RWTH Aachen, Aachen, Germany; JARA-Brain Institute II Molecular Neuroscience and Neuroimaging, Research Centre Jülich, Jülich, Germany
- Lynn Huestegge
- Institute of Psychology, University of Würzburg, Würzburg, Germany; Institute of Psychology, RWTH Aachen University, Aachen, Germany
|
32
|
Quantifying facial expression signal and intensity use during development. J Exp Child Psychol 2018; 174:41-59. DOI: 10.1016/j.jecp.2018.05.005.
|
33
|
Developmental changes in the categorical processing of positive and negative facial expressions. PLoS One 2018; 13:e0201521. PMID: 30075000; PMCID: PMC6075754. DOI: 10.1371/journal.pone.0201521.
Abstract
Categorical biases in the processing of emotional facial expression have been the subject of much debate in the literature. Opposing views on this topic claim either that positive or negative facial expressions enjoy improved processing in the human brain. The developmental changes in the processing advantages of positive and negative facial expressions are also disputed, with studies using varying paradigms showing seemingly contradictory results. Therefore, to further investigate the development of categorical processing and extraction of emotional information from faces, we tested 6-, 9-, and 12-year-old children, as well as adults, on their ability to categorize various facial expressions as positive or negative as quickly as possible. This was a simplified paradigm designed to explicitly contrast the processing efficiency of positive and negative facial expressions on the broader level of those emotional valence categories, rather than specific single emotional expressions. Our results show an early age processing advantage for positive facial expressions, which disappears in adults who show no such differences in the case of response time measures. In the case of accuracy measures, the early advantage for positive facial expressions gradually disappears and is reversed into a negativity advantage in adults. These findings demonstrate that category-based positive and negative processing advantages are strongly modulated by age over the course of development, and can exhibit opposite effects depending on the developmental stage of the participant.
|
34
|
Mancini G, Biolcati R, Agnoli S, Andrei F, Trombini E. Recognition of Facial Emotional Expressions Among Italian Pre-adolescents, and Their Affective Reactions. Front Psychol 2018; 9:1303. PMID: 30123150; PMCID: PMC6085998. DOI: 10.3389/fpsyg.2018.01303.
Abstract
The recognition of emotional facial expressions is a central aspect of effective interpersonal communication. This study aims to investigate whether changes occur during preadolescence in emotion recognition ability and in the affective reactions (self-assessed by participants through valence and arousal ratings) associated with viewing basic facial expressions (N = 396; 206 girls; aged 11-14 years; Mage = 12.73, SD = 0.91). Our results confirmed that happiness is the best recognized emotion during preadolescence. However, a significant decrease in recognition accuracy across age emerged for fear expressions. Moreover, the affective reactions elicited by viewing happy facial expressions were the most pleasant and arousing compared to the other emotional expressions. On the contrary, viewing sadness was associated with the most negative affective reactions. Our results also revealed a developmental change in participants' affective reactions to the stimuli. Implications are discussed by taking into account the role of emotion recognition as one of the main factors involved in emotional development.
Affiliation(s)
- Giacomo Mancini
- Department of Education, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Roberta Biolcati
- Department of Education, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Sergio Agnoli
- Marconi Institute for Creativity, Alma Mater Studiorum University of Bologna, Sasso Marconi, Italy
- Federica Andrei
- Department of Psychology, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Elena Trombini
- Department of Psychology, Alma Mater Studiorum University of Bologna, Bologna, Italy
|
35
|
Effects of early institutionalization on emotion processing in 12-year-old youth. Dev Psychopathol 2018; 29:1749-1761. PMID: 29162181. DOI: 10.1017/s0954579417001377.
Abstract
We examined facial emotion recognition in 12-year-olds in a longitudinally followed sample of children with and without exposure to early life psychosocial deprivation (institutional care). Half of the institutionally reared children were randomized into foster care homes during the first years of life. Facial emotion recognition was examined in a behavioral task using morphed images. This same task had been administered when children were 8 years old. Neutral facial expressions were morphed with happy, sad, angry, and fearful emotional facial expressions, and children were asked to identify the emotion of each face, which varied in intensity. Consistent with our previous report, we show that some areas of emotion processing, involving the recognition of happy and fearful faces, are affected by early deprivation, whereas other areas, involving the recognition of sad and angry faces, appear to be unaffected. We also show that early intervention can have a lasting positive impact, normalizing developmental trajectories of processing negative emotions (fear) into the late childhood/preadolescent period.
|
36
|
Abstract
FANchild (French Affective Norms for Children) provides norms of valence and arousal for a large corpus of French words (N = 720) rated by 908 French children and adolescents (ages 7, 9, 11, and 13). The ratings were made using the Self-Assessment Manikin (Lang, 1980). Because it combines evaluations of arousal and valence and includes ratings provided by 7-, 9-, 11-, and 13-year-olds, this database complements and extends existing French-language databases. Good response reliability was observed in each of the four age groups. Despite a significant level of consensus, we found age differences in both the valence and arousal ratings: Seven- and 9-year-old children gave higher mean valence and arousal ratings than did the other age groups. Moreover, the tendency to judge words positively (i.e., positive bias) decreased with age. This age- and sex-related database will enable French-speaking researchers to study how the emotional character of words influences their cognitive processing, and how this influence evolves with age. FANchild is available at https://www.researchgate.net/profile/Catherine_Monnier/contributions .
|
37
|
Vesker M, Bahn D, Kauschke C, Tschense M, Degé F, Schwarzer G. Auditory Emotion Word Primes Influence Emotional Face Categorization in Children and Adults, but Not Vice Versa. Front Psychol 2018; 9:618. PMID: 29765346; PMCID: PMC5938388. DOI: 10.3389/fpsyg.2018.00618.
Abstract
In order to assess how the perception of audible speech and facial expressions influence one another in the perception of emotions, and how this influence might change over the course of development, we conducted two cross-modal priming experiments with three age groups of children (6-, 9-, and 12-year-olds), as well as college-aged adults. In Experiment 1, 74 children and 24 adult participants were tasked with categorizing photographs of emotional faces as positive or negative as quickly as possible after being primed with emotion words presented via audio in valence-congruent and valence-incongruent trials. In Experiment 2, 67 children and 24 adult participants carried out a similar categorization task, but with faces acting as visual primes and emotion words acting as auditory targets. The results of Experiment 1 showed that participants made more errors when categorizing positive faces primed by negative words versus positive words, and that 6-year-old children are particularly sensitive to positive word primes, giving faster correct responses regardless of target valence. Meanwhile, the results of Experiment 2 did not show any congruency effects for priming by facial expressions. Thus, audible emotion words seem to exert an influence on the emotional categorization of faces, while faces do not seem to influence the categorization of emotion words in a significant way.
Affiliation(s)
- Michael Vesker
- Department of Developmental Psychology, Justus-Liebig-Universität Gießen, Giessen, Germany
- Daniela Bahn
- Clinical Linguistics, Department of German Linguistics, Philipps-Universität Marburg, Marburg, Germany
- Christina Kauschke
- Clinical Linguistics, Department of German Linguistics, Philipps-Universität Marburg, Marburg, Germany
- Monika Tschense
- Clinical Linguistics, Department of German Linguistics, Philipps-Universität Marburg, Marburg, Germany
- Franziska Degé
- Department of Developmental Psychology, Justus-Liebig-Universität Gießen, Giessen, Germany
- Gudrun Schwarzer
- Department of Developmental Psychology, Justus-Liebig-Universität Gießen, Giessen, Germany
|
38
|
Facial emotion recognition in children with or without Attention Deficit/Hyperactivity Disorder: Impact of comorbidity. Encephale 2018; 45:114-120. PMID: 29580701. DOI: 10.1016/j.encep.2018.01.006.
Abstract
OBJECTIVES This study sought to assess facial emotion recognition deficit in children with Attention Deficit/Hyperactivity Disorder (ADHD) and to test the hypothesis that it is increased by comorbid features. METHOD Forty children diagnosed with ADHD were compared with 40 typically developing children, all aged 7 to 11 years, on a computerized facial emotion recognition task (based on the Pictures of Facial Affect). Data from parents' ratings of ADHD and comorbid symptoms (on the Conners' Revised Parent Rating Scale) were also collected. RESULTS Children with ADHD obtained significantly fewer correct answers than typically developing controls on the emotional task, while they performed similarly on the control task. Recognition of sadness was especially impaired in children with ADHD. While ADHD symptoms were slightly related to facial emotion recognition deficit, oppositional symptoms were related to a decrease in the number of correct answers on sadness and surprise recognition. CONCLUSION Facial emotion recognition deficit in children with ADHD might reflect an impaired emotional process during childhood. Moreover, Oppositional Defiant Disorder seems to be a risk factor for difficulties in emotion recognition, especially in children with ADHD.
39
Guarnera M, Hichy Z, Cascio M, Carrubba S, Buccheri SL. Facial Expressions and the Ability to Recognize Emotions from the Eyes or Mouth: A Comparison Between Children and Adults. The Journal of Genetic Psychology 2017; 178:309-318. [DOI: 10.1080/00221325.2017.1361377]
Affiliation(s)
- Maria Guarnera
- Department of Human and Social Sciences, Università degli Studi di Enna Kore, Enna, Italy
- Zira Hichy
- Dipartimento di Processi Formativi, University of Catania, Catania, Italy
- Maura Cascio
- Department of Human and Social Sciences, Università degli Studi di Enna Kore, Enna, Italy
- Stefano Carrubba
- School of Psychology, Social Work and Human Sciences, University of West London, Slough, United Kingdom
- Stefania L. Buccheri
- Department of Human and Social Sciences, Università degli Studi di Enna Kore, Enna, Italy
40
Wanting it Too Much: An Inverse Relation Between Social Motivation and Facial Emotion Recognition in Autism Spectrum Disorder. Child Psychiatry Hum Dev 2016; 47:890-902. [PMID: 26743637 PMCID: PMC4936965 DOI: 10.1007/s10578-015-0620-5]
Abstract
This study examined social motivation and early-stage face perception as frameworks for understanding impairments in facial emotion recognition (FER) in a well-characterized sample of youth with autism spectrum disorders (ASD). Early-stage face perception (N170 event-related potential latency) was recorded while participants completed a standardized FER task, while social motivation was obtained via parent report. Participants with greater social motivation exhibited poorer FER, while those with shorter N170 latencies exhibited better FER for child angry face stimuli. Social motivation partially mediated the relationship between a faster N170 and better FER. These effects were all robust to variations in IQ, age, and ASD severity. These findings argue against theories implicating social motivation as uniformly valuable for individuals with ASD, and augment models suggesting a close link between early-stage face perception, social motivation, and FER in this population. Broader implications for models and development of FER in ASD are discussed.
41
Dalkıran M, Gultekin G, Yuksek E, Varsak N, Gul H, Kıncır Z, Tasdemir A, Emul M. Facial emotion recognition in psychiatrists and influences of their therapeutic identification on that ability. Compr Psychiatry 2016; 69:30-35. [PMID: 27423342 DOI: 10.1016/j.comppsych.2016.04.008]
Abstract
OBJECTIVES Although emotional cues such as facial emotion expressions seem to be important in social interaction, psychiatrists receive no specific training in reading them. Here, we aimed to investigate psychiatrists' facial emotion recognition ability and its relation to their therapeutic identification (psychotherapy vs. psychopharmacology oriented) and to whether they were adult or child-adolescent psychiatrists. METHODS A facial emotion recognition test, constructed from a set of photographs (happy, sad, fearful, angry, surprised, disgusted, and neutral faces) from Ekman and Friesen, was administered to 130 psychiatrists. RESULTS Psychotherapy-oriented adult psychiatrists were significantly better at recognizing sad facial emotion (p=.003) than psychopharmacologists, while no significant differences by therapeutic orientation were detected among child-adolescent psychiatrists (for each, p>.05). Adult psychiatrists were significantly better at recognizing fearful (p=.012) and disgusted (p=.003) facial emotions than child-adolescent psychiatrists, while the latter were better at recognizing angry facial emotion (p=.008). CONCLUSION For the first time, we have shown differences in psychiatrists' facial emotion recognition ability according to therapeutic identification and to being an adult or child-adolescent psychiatrist. It would be valuable to investigate how these differences, or training in facial emotion recognition, would affect the quality of patient-clinician interaction and treatment-related outcomes.
Affiliation(s)
- Mihriban Dalkıran
- Department of Psychiatry, Sisli Etfal Education and Research Hospital, Istanbul, Turkey
- Gozde Gultekin
- Department of Psychiatry, Medical School of Cerrahpasa, Istanbul University, Istanbul, Turkey
- Erhan Yuksek
- Clinic of Psychiatry, Viransehir State Hospital, Sanlıurfa, Turkey
- Nalan Varsak
- Department of Psychiatry, Konya Education and Research Hospital, Konya, Turkey
- Hesna Gul
- Clinic of Psychiatry, Kahramanmaras State Hospital, Kahramanmaras, Turkey
- Zeliha Kıncır
- Department of Psychiatry, Bakırkoy Mental Health and Neurological Diseases Education and Research Hospital, Istanbul, Turkey
- Akif Tasdemir
- Department of Psychiatry, Bakırkoy Mental Health and Neurological Diseases Education and Research Hospital, Istanbul, Turkey
- Murat Emul
- Department of Psychiatry, Medical School of Cerrahpasa, Istanbul University, Istanbul, Turkey
42
Balas B, Huynh C, Saville A, Schmidt J. Orientation biases for facial emotion recognition during childhood and adulthood. J Exp Child Psychol 2015; 140:171-183. [DOI: 10.1016/j.jecp.2015.07.006]
43
Roy-Charland A, Perron M, Young C, Boulard J, Chamberland JA. The Confusion of Fear and Surprise: A Developmental Study of the Perceptual-Attentional Limitation Hypothesis Using Eye Movements. The Journal of Genetic Psychology 2015; 176:281-298. [PMID: 26244819 DOI: 10.1080/00221325.2015.1066301]
Abstract
The goal of the present study was to test the Perceptual-Attentional Limitation Hypothesis in children and adults by manipulating the distinctiveness between expressions and recording eye movements. Children aged 3-5 and 9-11 years, as well as adults, were presented with pairs of expressions and required to identify a target emotion. Children aged 3-5 were less accurate than those aged 9-11 and adults. All children viewed the pictures longer than adults did but did not spend more time attending to the relevant cues. For all participants, accuracy in recognizing fear was lower than for surprise when the distinctive cue was in the brow only; they also took longer and spent more time in both the mouth and brow zones than when a cue was in the mouth or in both areas. Adults and children aged 9-11 made more comparisons between the expressions when fear comprised a single distinctive cue in the brow than when the distinctive cue was in the mouth only or when both cues were present. Children aged 3-5 made more comparisons in the brow-only condition than when both cues were present. The results extend the Perceptual-Attentional Limitation Hypothesis by showing the importance of both decoder and stimulus characteristics, and of the interaction between them.
44
Wiggins JL, Adleman NE, Kim P, Oakes AH, Hsu D, Reynolds RC, Chen G, Pine DS, Brotman MA, Leibenluft E. Developmental differences in the neural mechanisms of facial emotion labeling. Soc Cogn Affect Neurosci 2015; 11:172-181. [PMID: 26245836 DOI: 10.1093/scan/nsv101]
Abstract
Adolescence is a time of increased risk for the onset of psychological disorders associated with deficits in face emotion labeling. We used functional magnetic resonance imaging (fMRI) to examine age-related differences in brain activation while adolescents and adults labeled the emotion on fearful, happy and angry faces of varying intensities [0% (i.e. neutral), 50%, 75%, 100%]. Adolescents and adults did not differ on accuracy to label emotions. In the superior temporal sulcus, ventrolateral prefrontal cortex and middle temporal gyrus, adults show an inverted-U-shaped response to increasing intensities of fearful faces and a U-shaped response to increasing intensities of happy faces, whereas adolescents show the opposite patterns. In addition, adults, but not adolescents, show greater inferior occipital gyrus activation to negative (angry, fearful) vs positive (happy) emotions. In sum, when subjects classify subtly varying facial emotions, developmental differences manifest in several 'ventral stream' brain regions. Charting the typical developmental course of the brain mechanisms of socioemotional processes, such as facial emotion labeling, is an important focus for developmental psychopathology research.
Affiliation(s)
- Jillian Lee Wiggins
- Emotion and Development Branch, National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA
- Nancy E Adleman
- Emotion and Development Branch, National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA; Department of Psychology, The Catholic University of America, Washington, D.C. 20064, USA
- Pilyoung Kim
- Emotion and Development Branch, National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA; Department of Psychology, University of Denver, Denver, CO 80208, USA
- Allison H Oakes
- Emotion and Development Branch, National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA
- Derek Hsu
- Emotion and Development Branch, National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA
- Richard C Reynolds
- Scientific and Statistical Computing Core, National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA
- Gang Chen
- Scientific and Statistical Computing Core, National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA
- Daniel S Pine
- Emotion and Development Branch, National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA
- Melissa A Brotman
- Emotion and Development Branch, National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA
- Ellen Leibenluft
- Emotion and Development Branch, National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA
45
MacNamara A, Vergés A, Kujawa A, Fitzgerald KD, Monk CS, Phan KL. Age-related changes in emotional face processing across childhood and into young adulthood: Evidence from event-related potentials. Dev Psychobiol 2015. [PMID: 26220144 DOI: 10.1002/dev.21341]
Abstract
Socio-emotional processing is an essential part of development, and age-related changes in its neural correlates can be observed. The late positive potential (LPP) is a measure of motivated attention that can be used to assess emotional processing; however, changes in the LPP elicited by emotional faces have not been assessed across a wide age range in childhood and young adulthood. We used an emotional face matching task to examine behavior and event-related potentials (ERPs) in 33 youth aged 7-19 years. Younger children were slower when performing the matching task. The LPP elicited by emotional faces but not control stimuli (geometric shapes) decreased with age; by contrast, an earlier ERP (the P1) decreased with age for both faces and shapes, suggesting increased efficiency of early visual processing. Results indicate age-related attenuation in emotional processing that may stem from greater efficiency and regulatory control when performing a socio-emotional task.
Affiliation(s)
- Annmarie MacNamara
- Department of Psychiatry, University of Illinois at Chicago, Chicago, IL
- Alvaro Vergés
- Department of Psychiatry, University of Illinois at Chicago, Chicago, IL
- Autumn Kujawa
- Department of Psychiatry, University of Illinois at Chicago, Chicago, IL
- K Luan Phan
- Departments of Psychiatry, Psychology and Anatomy and Cell Biology, and the Graduate Program in Neuroscience, University of Illinois at Chicago, Chicago, IL
46
Lawrence K, Campbell R, Skuse D. Age, gender, and puberty influence the development of facial emotion recognition. Front Psychol 2015; 6:761. [PMID: 26136697 PMCID: PMC4468868 DOI: 10.3389/fpsyg.2015.00761]
Abstract
Our ability to differentiate between simple facial expressions of emotion develops between infancy and early adulthood, yet few studies have explored the developmental trajectory of emotion recognition using a single methodology across a wide age-range. We investigated the development of emotion recognition abilities through childhood and adolescence, testing the hypothesis that children's ability to recognize simple emotions is modulated by chronological age, pubertal stage and gender. In order to establish norms, we assessed 478 children aged 6-16 years, using the Ekman-Friesen Pictures of Facial Affect. We then modeled these cross-sectional data in terms of competence in accurate recognition of the six emotions studied, when the positive correlation between emotion recognition and IQ was controlled. Significant linear trends were seen in children's ability to recognize facial expressions of happiness, surprise, fear, and disgust; there was improvement with increasing age. In contrast, for sad and angry expressions there is little or no change in accuracy over the age range 6-16 years; near-adult levels of competence are established by middle-childhood. In a sampled subset, pubertal status influenced the ability to recognize facial expressions of disgust and anger; there was an increase in competence from mid to late puberty, which occurred independently of age. A small female advantage was found in the recognition of some facial expressions. The normative data provided in this study will aid clinicians and researchers in assessing the emotion recognition abilities of children and will facilitate the identification of abnormalities in a skill that is often impaired in neurodevelopmental disorders. If emotion recognition abilities are a good model with which to understand adolescent development, then these results could have implications for the education, mental health provision and legal treatment of teenagers.
Affiliation(s)
- Kate Lawrence
- Department of Psychology, St Mary's University, Twickenham, London, UK
- Ruth Campbell
- Deafness Cognition and Language Centre, University College London, London, UK
- David Skuse
- Behavioural and Brain Sciences Unit, UCL Institute of Child Health, London, UK
47
Guarnera M, Hichy Z, Cascio MI, Carrubba S. Facial Expressions and Ability to Recognize Emotions From Eyes or Mouth in Children. Europe's Journal of Psychology 2015; 11:183-196. [PMID: 27247651 PMCID: PMC4873105 DOI: 10.5964/ejop.v11i2.890]
Abstract
This research aims to contribute to the literature on the ability to recognize anger, happiness, fear, surprise, sadness, disgust and neutral emotions from facial information. By investigating children’s performance in detecting these emotions from a specific face region, we were interested to know whether children would show differences in recognizing these expressions from the upper or lower face, and if any difference between specific facial regions depended on the emotion in question. For this purpose, a group of 6-7 year-old children was selected. Participants were asked to recognize emotions by using a labeling task with three stimulus types (region of the eyes, of the mouth, and full face). The findings seem to indicate that children correctly recognize basic facial expressions when pictures represent the whole face, except for a neutral expression, which was recognized from the mouth, and sadness, which was recognized from the eyes. Children are also able to identify anger from the eyes as well as from the whole face. With respect to gender differences, there is no female advantage in emotional recognition. The results indicate a significant interaction ‘gender x face region’ only for anger and neutral emotions.
Affiliation(s)
- Maria Guarnera
- Faculty of Human and Social Sciences, University of Enna "KORE", Enna, Italy
- Zira Hichy
- Department of Educational Sciences, University of Catania, Catania, Italy
- Maura I Cascio
- Faculty of Human and Social Sciences, University of Enna "KORE", Enna, Italy
- Stefano Carrubba
- School of Psychology, Social Work and Human Sciences, University of West London, London, United Kingdom
48
Rodger H, Vizioli L, Ouyang X, Caldara R. Mapping the development of facial expression recognition. Dev Sci 2015; 18:926-939. [DOI: 10.1111/desc.12281]
Affiliation(s)
- Helen Rodger
- Department of Psychology, University of Fribourg, Switzerland
- Luca Vizioli
- Department of Psychology, University of Fribourg, Switzerland
- Xinyi Ouyang
- Department of Psychology, University of Fribourg, Switzerland
- Roberto Caldara
- Department of Psychology, University of Fribourg, Switzerland
49
Olszanowski M, Pochwatko G, Kuklinski K, Scibor-Rylski M, Lewinski P, Ohme RK. Warsaw set of emotional facial expression pictures: a validation study of facial display photographs. Front Psychol 2015; 5:1516. [PMID: 25601846 PMCID: PMC4283518 DOI: 10.3389/fpsyg.2014.01516]
Abstract
Emotional facial expressions play a critical role in theories of emotion and figure prominently in research on almost every aspect of emotion. This article provides a background for a new database of basic emotional expressions. The goal in creating this set was to provide high quality photographs of genuine facial expressions. Thus, after proper training, participants were inclined to express "felt" emotions. The novel approach taken in this study was also used to establish whether a given expression was perceived as intended by untrained judges. The judgment task for perceivers was designed to be sensitive to subtle changes in meaning caused by the way an emotional display was evoked and expressed. Consequently, this allowed us to measure the purity and intensity of emotional displays, which are parameters that validation methods used by other researchers do not capture. The final set is comprised of those pictures that received the highest recognition marks (e.g., accuracy with intended display) from independent judges, totaling 210 high quality photographs of 30 individuals. Descriptions of the accuracy, intensity, and purity of displayed emotion as well as FACS AU's codes are provided for each picture. Given the unique methodology applied to gathering and validating this set of pictures, it may be a useful tool for research using face stimuli. The Warsaw Set of Emotional Facial Expression Pictures (WSEFEP) is freely accessible to the scientific community for non-commercial use by request at http://www.emotional-face.org.
Affiliation(s)
- Michal Olszanowski
- Department of Psychology, University of Social Sciences and Humanities, Warsaw, Poland
- Krzysztof Kuklinski
- Department of Psychology, University of Social Sciences and Humanities, Warsaw, Poland
- Michal Scibor-Rylski
- Department of Psychology, University of Social Sciences and Humanities, Warsaw, Poland
- Peter Lewinski
- Department of Communication, University of Amsterdam, Amsterdam, Netherlands
- Rafal K. Ohme
- Faculty in Wroclaw, University of Social Sciences and Humanities, Wroclaw, Poland
50
Gonzalez-Gadea ML, Herrera E, Parra M, Gomez Mendez P, Baez S, Manes F, Ibanez A. Emotion recognition and cognitive empathy deficits in adolescent offenders revealed by context-sensitive tasks. Front Hum Neurosci 2014; 8:850. [PMID: 25374529 PMCID: PMC4204464 DOI: 10.3389/fnhum.2014.00850]
Abstract
Emotion recognition and empathy abilities require the integration of contextual information in real-life scenarios. Previous reports have explored these domains in adolescent offenders (AOs) but have not used tasks that replicate everyday situations. In this study we included ecological measures with different levels of contextual dependence to evaluate emotion recognition and empathy in AOs relative to non-offenders, controlling for the effect of demographic variables. We also explored the influence of fluid intelligence (FI) and executive functions (EFs) in the prediction of relevant deficits in these domains. Our results showed that AOs exhibit deficits in context-sensitive measures of emotion recognition and cognitive empathy. Difficulties in these tasks were neither explained by demographic variables nor predicted by FI or EFs. However, performance on measures that included simpler stimuli or could be solved by explicit knowledge was either only partially affected by demographic variables or preserved in AOs. These findings indicate that AOs show contextual social-cognition impairments which are relatively independent of basic cognitive functioning and demographic variables.
Affiliation(s)
- Maria Luz Gonzalez-Gadea
- Laboratory of Experimental Psychology and Neuroscience, Institute of Cognitive Neurology, Buenos Aires, Argentina; National Scientific and Technical Research Council, Buenos Aires, Argentina; UDP-INECO Foundation Core on Neuroscience, Diego Portales University, Santiago, Chile
- Eduar Herrera
- Laboratory of Experimental Psychology and Neuroscience, Institute of Cognitive Neurology, Buenos Aires, Argentina; National Scientific and Technical Research Council, Buenos Aires, Argentina; Universidad Autonoma del Caribe, Barranquilla, Colombia
- Mario Parra
- UDP-INECO Foundation Core on Neuroscience, Diego Portales University, Santiago, Chile; Human Cognitive Neuroscience, Psychology Department, University of Edinburgh, Edinburgh, UK; Scottish Dementia Clinical Research Network, Perth, UK; Neuropsy and Biomedical Unit, Health School, University Surcolombiana, Neiva, Colombia
- Sandra Baez
- Laboratory of Experimental Psychology and Neuroscience, Institute of Cognitive Neurology, Buenos Aires, Argentina; National Scientific and Technical Research Council, Buenos Aires, Argentina; UDP-INECO Foundation Core on Neuroscience, Diego Portales University, Santiago, Chile
- Facundo Manes
- Laboratory of Experimental Psychology and Neuroscience, Institute of Cognitive Neurology, Buenos Aires, Argentina; National Scientific and Technical Research Council, Buenos Aires, Argentina; UDP-INECO Foundation Core on Neuroscience, Diego Portales University, Santiago, Chile; Centre of Excellence in Cognition and its Disorders, Australian Research Council, Sydney, NSW, Australia
- Agustin Ibanez
- Laboratory of Experimental Psychology and Neuroscience, Institute of Cognitive Neurology, Buenos Aires, Argentina; National Scientific and Technical Research Council, Buenos Aires, Argentina; UDP-INECO Foundation Core on Neuroscience, Diego Portales University, Santiago, Chile; Universidad Autonoma del Caribe, Barranquilla, Colombia; Centre of Excellence in Cognition and its Disorders, Australian Research Council, Sydney, NSW, Australia