1. Xu X, Wang Z, Zhang W, Guo J, Wei W, Zhang M, Ding X, Liu X, Yang Q, Wang K, Zhu Y, Sun J, Song H, Shen Z, Chen L, Shi F, Wang Q, Li Y, Zhang H, Li D. Behavioral observation and assessment protocol for language and social-emotional development study in children aged 0-6: the Chinese Baby Connectome Project. BMC Psychol 2024;12:533. PMID: 39367488; PMCID: PMC11451268; DOI: 10.1186/s40359-024-02031-x.
Abstract
BACKGROUND The global rise in developmental delays underscores the critical need for a thorough understanding and timely interventions during early childhood. Addressing this issue, the Chinese Baby Connectome Project (CBCP)'s behavior branch is dedicated to examining language acquisition, social-emotional development, and environmental factors affecting Chinese children. The research framework is built around three primary objectives: developing a 0-6 Child Development Assessment Toolkit, implementing an Intelligent Coding System, and investigating environmental influences. METHODS Utilizing an accelerated longitudinal design, the CBCP aims to enroll a minimum of 1000 typically developing Chinese children aged 0-6. The data collected in this branch comprise parental questionnaires, behavioral assessments, and observational experiments to capture developmental milestones and environmental influences holistically. The parental questionnaires will gauge children's developmental levels in language and social-emotional domains, alongside parental mental well-being, life events, parenting stress, parenting styles, and family relationships. Behavioral assessments will involve neurofunctional developmental evaluations using tools such as the Griffiths Development Scales and the Wechsler Preschool and Primary Scale of Intelligence. Additionally, the assessments will encompass measuring children's executive functions (e.g., Head-Toes-Knees-Shoulders), social cognitive abilities (e.g., theory of mind), and language development (e.g., Early Chinese Vocabulary Test). A series of behavioral observation experiments will be conducted targeting children of different age groups, focusing primarily on aspects such as behavioral inhibition, compliance, self-control, and social-emotional regulation.
To achieve these objectives, established international questionnaires will be adapted to local contexts and customized metrics will be devised for evaluating children's language and social-emotional development; deep learning algorithms will be developed in the observational experiments to enable automated behavioral analysis; and statistical models will be built to factor in various environmental variables and comprehensively outline developmental trajectories and relationships. DISCUSSION This study's integration of diverse assessments and AI technology will offer a detailed analysis of early childhood development in China, particularly in the realms of language acquisition and social-emotional skills. The development of a comprehensive assessment toolkit and coding system will enhance our ability to understand and support the development of Chinese children, contributing significantly to the field of early childhood development research. TRIAL REGISTRATION This study was registered with clinicaltrials.gov (NCT05040542) on September 10, 2021.
Affiliation(s)
- Xinpei Xu
- Shanghai Institute of Early Childhood Education, Shanghai Normal University, Shanghai, China
- Zhixin Wang
- School of Biomedical Engineering, ShanghaiTech University, Shanghai, China
- Weijia Zhang
- School of Biomedical Engineering, ShanghaiTech University, Shanghai, China
- Jiayang Guo
- School of Biomedical Engineering, ShanghaiTech University, Shanghai, China
- Wei Wei
- Shanghai Institute of Early Childhood Education, Shanghai Normal University, Shanghai, China
- Mingming Zhang
- School of Psychology, Shanghai Normal University, Shanghai, China
- Xuechen Ding
- School of Psychology, Shanghai Normal University, Shanghai, China
- Xiaohua Liu
- School of Biomedical Engineering, ShanghaiTech University, Shanghai, China
- Qing Yang
- School of Biomedical Engineering, ShanghaiTech University, Shanghai, China
- Kaidong Wang
- School of Biomedical Engineering, ShanghaiTech University, Shanghai, China
- Yitao Zhu
- School of Biomedical Engineering, ShanghaiTech University, Shanghai, China
- Jian Sun
- School of Biomedical Engineering, ShanghaiTech University, Shanghai, China
- Haoyang Song
- United Imaging Intelligence Co., Ltd, Shanghai, China
- Zhenhui Shen
- United Imaging Intelligence Co., Ltd, Shanghai, China
- Lei Chen
- United Imaging Intelligence Co., Ltd, Shanghai, China
- Feng Shi
- United Imaging Intelligence Co., Ltd, Shanghai, China
- Qian Wang
- School of Biomedical Engineering & State Key Laboratory of Advanced Medical Materials and Devices, ShanghaiTech University, Shanghai, China
- Shanghai Clinical Research and Trial Center, Shanghai, China
- Yan Li
- Shanghai Institute of Early Childhood Education, Shanghai Normal University, Shanghai, China
- Han Zhang
- School of Biomedical Engineering & State Key Laboratory of Advanced Medical Materials and Devices, ShanghaiTech University, Shanghai, China
- Shanghai Clinical Research and Trial Center, Shanghai, China
- Dan Li
- School of Psychology, Shanghai Normal University, Shanghai, China
2. Japee S, Jordan J, Licht J, Lokey S, Chen G, Snow J, Jabs EW, Webb BD, Engle EC, Manoli I, Baker C, Ungerleider LG. Inability to move one's face dampens facial expression perception. Cortex 2023;169:35-49. PMID: 37852041; PMCID: PMC10836030; DOI: 10.1016/j.cortex.2023.08.014.
Abstract
Humans rely heavily on facial expressions for social communication, to convey their thoughts and emotions and to understand them in others. One prominent but controversial view is that humans learn to recognize the significance of facial expressions by mimicking the expressions of others. This view predicts that an inability to make facial expressions (e.g., facial paralysis) would result in reduced perceptual sensitivity to others' facial expressions. To test this hypothesis, we developed a diverse battery of sensitive emotion recognition tasks to characterize expression perception in individuals with Moebius Syndrome (MBS), a congenital neurological disorder that causes facial palsy. Using computer-based detection tasks, we systematically assessed expression perception thresholds for static and dynamic face and body expressions. We found that while MBS individuals were able to perform challenging perceptual control tasks and body expression tasks, they were less efficient at extracting emotion from facial expressions, compared to matched controls. Exploratory analyses of fMRI data from a small group of MBS participants suggested potentially reduced engagement of the amygdala during expression processing relative to matched controls. Collectively, these results suggest a role for facial mimicry, and the consequent facial feedback and motor experience, in the perception of others' facial expressions.
Affiliation(s)
- Shruti Japee
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD, USA
- Jessica Jordan
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD, USA
- Judith Licht
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD, USA
- Savannah Lokey
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD, USA
- Gang Chen
- Scientific and Statistical Computing Core, NIMH, NIH, Bethesda, MD, USA
- Joseph Snow
- Office of the Clinical Director, NIMH, NIH, Bethesda, MD, USA
- Ethylin Wang Jabs
- Department of Genetics and Genomic Sciences, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Bryn D Webb
- Department of Genetics and Genomic Sciences, Icahn School of Medicine at Mount Sinai, New York, NY, USA; Department of Pediatrics, Division of Genetics and Metabolism, University of Wisconsin-Madison, Madison, WI, USA
- Elizabeth C Engle
- Departments of Neurology and Ophthalmology, Boston Children's Hospital and Harvard Medical School, Boston, MA, USA; Howard Hughes Medical Institute, Chevy Chase, MD, USA
- Irini Manoli
- Medical Genomics and Metabolic Genetics, NHGRI, NIH, Bethesda, MD, USA
- Chris Baker
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD, USA
3. Poncet F, Leleu A, Rekow D, Damon F, Dzhelyova MP, Schaal B, Durand K, Faivre L, Rossion B, Baudouin JY. A neural marker of rapid discrimination of facial expression in 3.5- and 7-month-old infants. Front Neurosci 2022;16:901013. PMID: 36061610; PMCID: PMC9434348; DOI: 10.3389/fnins.2022.901013.
Abstract
Infants’ ability to discriminate facial expressions has been widely explored, but little is known about the rapid and automatic ability to discriminate a given expression against many others in a single experiment. Here we investigated the development of facial expression discrimination in infancy with fast periodic visual stimulation coupled with scalp electroencephalography (EEG). EEG was recorded in eighteen 3.5- and eighteen 7-month-old infants presented with a female face expressing disgust, happiness, or a neutral emotion (in different stimulation sequences) at a base stimulation frequency of 6 Hz. Pictures of the same individual expressing other emotions (either anger, disgust, fear, happiness, sadness, or neutrality, randomly and excluding the expression presented at the base frequency) were introduced every six stimuli (at 1 Hz). Frequency-domain analysis revealed an objective (i.e., at the predefined 1-Hz frequency and harmonics) expression-change brain response in both 3.5- and 7-month-olds, indicating the visual discrimination of various expressions from disgust, happiness and neutrality from these early ages. At 3.5 months, the responses to the discrimination from disgust and happiness expressions were located mainly on medial occipital sites, whereas a more lateral topography was found for the response to the discrimination from neutrality, suggesting that expression discrimination from an emotionally neutral face relies on different visual cues than discrimination from a disgusted or happy face. Finally, expression discrimination from happiness was associated with reduced activity over posterior areas and an additional response over central frontal scalp regions at 7 months as compared to 3.5 months. This result suggests developmental changes in the processing of happiness expressions as compared to negative/neutral ones within this age range.
Affiliation(s)
- Fanny Poncet
- Development of Olfactory Communication and Cognition Laboratory, Centre des Sciences du Goût et de l’Alimentation, CNRS, Université Bourgogne Franche-Comté, INRAE, Institut Agro, Dijon, France
- Université Grenoble Alpes, Saint-Martin-d’Hères, France
- Arnaud Leleu
- Development of Olfactory Communication and Cognition Laboratory, Centre des Sciences du Goût et de l’Alimentation, CNRS, Université Bourgogne Franche-Comté, INRAE, Institut Agro, Dijon, France
- Diane Rekow
- Development of Olfactory Communication and Cognition Laboratory, Centre des Sciences du Goût et de l’Alimentation, CNRS, Université Bourgogne Franche-Comté, INRAE, Institut Agro, Dijon, France
- Fabrice Damon
- Development of Olfactory Communication and Cognition Laboratory, Centre des Sciences du Goût et de l’Alimentation, CNRS, Université Bourgogne Franche-Comté, INRAE, Institut Agro, Dijon, France
- Benoist Schaal
- Development of Olfactory Communication and Cognition Laboratory, Centre des Sciences du Goût et de l’Alimentation, CNRS, Université Bourgogne Franche-Comté, INRAE, Institut Agro, Dijon, France
- Karine Durand
- Development of Olfactory Communication and Cognition Laboratory, Centre des Sciences du Goût et de l’Alimentation, CNRS, Université Bourgogne Franche-Comté, INRAE, Institut Agro, Dijon, France
- Laurence Faivre
- Inserm UMR 1231 GAD, Genetics of Developmental Disorders, and Centre de Référence Maladies Rares “Anomalies du Développement et Syndromes Malformatifs,” FHU TRANSLAD, CHU Dijon and Université de Bourgogne-Franche Comté, Dijon, France
- Bruno Rossion
- Université de Lorraine, CNRS, CRAN–UMR 7039, Nancy, France
- Service de Neurologie, Université de Lorraine, CHRU-Nancy, Nancy, France
- Jean-Yves Baudouin
- Laboratoire “Développement, Individu, Processus, Handicap, Éducation”, Département Psychologie du Développement, de l’Éducation et des Vulnérabilités, Institut de Psychologie, Université de Lyon, Université Lumière Lyon 2, Bron, France
4. Noonan CF, Hunter BK, Markant J. Dynamic emotional messages differentially affect 6-month-old infants' attention to eyes and gaze cues. Infant Behav Dev 2021;64:101626. PMID: 34390965; DOI: 10.1016/j.infbeh.2021.101626.
Abstract
Infants often experience interactions in which caregivers use dynamic messages to convey their affective and communicative intent. These dynamic emotional messages may shape the development of emotion discrimination skills and shared attention by influencing infants' attention to internal facial features and their responses to eye gaze cues. However, past research examining infants' responses to emotional faces has predominantly focused on classic, stereotyped expressions (e.g., happy, sad, angry) that may not reflect the variability that infants experience in their daily interactions. The present study therefore examined forty-two 6-month-old infants' attention to eyes vs. mouth and gaze cueing responses across multiple dynamic emotional messages that are common to infant-directed interactions. Overall, infants looked more to the eyes during messages with negative affect, but this increased attention to the eyes during these message conditions did not directly facilitate gaze cueing. Infants instead showed reliable gaze cueing only after messages with positive and neutral affect. We additionally observed gender differences in infants' attention to internal face features and subsequent gaze cueing responses. Female infants spent more time looking at the eyes during the dynamic emotional messages and showed increased initial orienting and longer looking to gaze-cued objects following positive messages, whereas male infants showed these gaze cueing effects following neutral messages. These results suggest that variability in caregivers' communication can shape infants' attention to and processing of emotion and gaze information.
Affiliation(s)
- Claire F Noonan
- Department of Psychology, Tulane University, New Orleans, LA 70118, United States
- Brianna K Hunter
- Department of Psychology, Tulane University, New Orleans, LA 70118, United States
- Julie Markant
- Department of Psychology, Tulane University, New Orleans, LA 70118, United States; Tulane Brain Institute, Tulane University, New Orleans, LA 70118, United States
5. Prunty JE, Keemink JR, Kelly DJ. Infants scan static and dynamic facial expressions differently. Infancy 2021;26:831-856. PMID: 34288344; DOI: 10.1111/infa.12426.
Abstract
Despite being inherently dynamic phenomena, much of our understanding of how infants attend to and scan facial expressions is based on static face stimuli. Here we investigate how six-, nine-, and twelve-month-old infants allocate their visual attention toward dynamic-interactive videos of the six basic emotional expressions, and compare their responses with static images of the same stimuli. We find that infants show clear differences in how they attend to and scan dynamic and static expressions, looking longer toward the dynamic-face and lower-face regions. Infants across all age groups show differential interest in expressions, and show precise scanning of regions "diagnostic" for emotion recognition. These data also indicate that infants' attention toward dynamic expressions develops over the first year of life, including relative increases in interest and scanning precision toward some negative facial expressions (e.g., anger, fear, and disgust).
Affiliation(s)
- David J Kelly
- School of Psychology, University of Kent, Canterbury, UK
6. Salvadori EA, Colonnesi C, Vonk HS, Oort FJ, Aktar E. Infant Emotional Mimicry of Strangers: Associations with Parent Emotional Mimicry, Parent-Infant Mutual Attention, and Parent Dispositional Affective Empathy. Int J Environ Res Public Health 2021;18:654. PMID: 33466629; PMCID: PMC7828673; DOI: 10.3390/ijerph18020654.
Abstract
Emotional mimicry, the tendency to automatically and spontaneously reproduce others’ facial expressions, characterizes human social interactions from infancy onwards. Yet, little is known about the factors modulating its development in the first year of life. This study investigated infant emotional mimicry and its association with parent emotional mimicry, parent-infant mutual attention, and parent dispositional affective empathy. One hundred and seventeen parent-infant dyads (51 six-month-olds, 66 twelve-month-olds) were observed during video presentation of strangers’ happy, sad, angry, and fearful faces. Infant and parent emotional mimicry (i.e., facial expressions valence-congruent to the video) and their mutual attention (i.e., simultaneous gaze at one another) were systematically coded second-by-second. Parent empathy was assessed via self-report. Path models indicated that infant mimicry of happy stimuli was positively and independently associated with parent mimicry and affective empathy, while infant mimicry of sad stimuli was related to longer parent-infant mutual attention. Findings provide new insights into infants’ and parents’ coordination of mimicry and attention during triadic contexts of interactions, endorsing the social-affiliative function of mimicry already present in infancy: emotional mimicry occurs as an automatic parent-infant shared behavior and early manifestation of empathy only when strangers’ emotional displays are positive, and thus perceived as affiliative.
Affiliation(s)
- Eliala A. Salvadori
- Research Institute of Child Development and Education, University of Amsterdam, 1018 WS Amsterdam, The Netherlands
- Research Priority Area Yield, University of Amsterdam, 1018 WS Amsterdam, The Netherlands
- Cristina Colonnesi
- Research Institute of Child Development and Education, University of Amsterdam, 1018 WS Amsterdam, The Netherlands
- Research Priority Area Yield, University of Amsterdam, 1018 WS Amsterdam, The Netherlands
- Heleen S. Vonk
- Research Institute of Child Development and Education, University of Amsterdam, 1018 WS Amsterdam, The Netherlands
- Frans J. Oort
- Research Institute of Child Development and Education, University of Amsterdam, 1018 WS Amsterdam, The Netherlands
- Research Priority Area Yield, University of Amsterdam, 1018 WS Amsterdam, The Netherlands
- Evin Aktar
- Research Institute of Child Development and Education, University of Amsterdam, 1018 WS Amsterdam, The Netherlands
- Department of Clinical Psychology, Leiden University, 2333 AK Leiden, The Netherlands
7. Keemink JR, Jenner L, Prunty JE, Wood N, Kelly DJ. Eye Movements and Behavioural Responses to Gaze-Contingent Expressive Faces in Typically Developing Infants and Infant Siblings. Autism Res 2021;14:973-983. PMID: 33170549; DOI: 10.1002/aur.2432.
Abstract
Studies with infant siblings of children with Autism Spectrum Disorder have attempted to identify early markers for the disorder and suggest that autistic symptoms emerge between 12 and 24 months of age. Yet, a reliable first-year marker remains elusive. We propose that in order to establish first-year manifestations of this inherently social disorder, we need to develop research methods that are sufficiently socially demanding and realistically interactive. Building on Keemink et al. [2019, Developmental Psychology, 55, 1362-1371], we employed a gaze-contingent eye-tracking paradigm in which infants could interact with face stimuli. Infants could elicit emotional expressions (happiness, sadness, surprise, fear, disgust, anger) from on-screen faces by engaging in eye contact. We collected eye-tracking data and video-recorded behavioural response data from 122 (64 male, 58 female) typically developing infants and 31 infant siblings (17 male, 14 female) aged 6, 9 and 12 months. All infants demonstrated a significant Expression by AOI interaction (F(10, 1470) = 10.003, P < 0.001, ηp² = 0.064). Infants' eye movements were "expression-specific", with infants distributing their fixations to AOIs differently per expression. Whereas eye movements provide no evidence of deviancies, behavioural response data show significant aberrancies in reciprocity for infant siblings. Infant siblings show reduced social responsiveness at the group level (F(1, 147) = 4.10, P = 0.042, ηp² = 0.028) and the individual level (Fisher's exact test, P = 0.032). We conclude that the gaze-contingency paradigm provides a realistically interactive experience capable of detecting deviancies in social responsiveness early, and we discuss our results in relation to subsequent infant sibling development. LAY SUMMARY: We investigated how infant siblings of children with autism spectrum disorder respond to interactive faces presented on a computer screen. Our study demonstrates that infant siblings are less responsive when interacting with faces on a computer screen (e.g., they smile and imitate less) in comparison to infants without an older sibling with autism. Reduced responsiveness within social interaction could potentially have implications for how parents and carers interact with these infants. Autism Res 2021, 14: 973-983. © 2020 International Society for Autism Research and Wiley Periodicals LLC.
Affiliation(s)
- Jolie R Keemink
- University of Kent, School of Psychology, Keynes College, Canterbury, Kent, UK
- Lauren Jenner
- University of Kent, School of Psychology, Keynes College, Canterbury, Kent, UK
- Jonathan E Prunty
- University of Kent, School of Psychology, Keynes College, Canterbury, Kent, UK
- Nicky Wood
- East Kent Hospitals University NHS Foundation Trust, Canterbury, Kent, UK
- David J Kelly
- University of Kent, School of Psychology, Keynes College, Canterbury, Kent, UK
8. Bayet L, Perdue KL, Behrendt HF, Richards JE, Westerlund A, Cataldo JK, Nelson CA. Neural responses to happy, fearful and angry faces of varying identities in 5- and 7-month-old infants. Dev Cogn Neurosci 2020;47:100882. PMID: 33246304; PMCID: PMC7695867; DOI: 10.1016/j.dcn.2020.100882.
Abstract
- fNIRS and looking responses to emotional faces were measured in 5- and 7-month-olds.
- Emotional faces had varying identities within happy, angry, and fearful blocks.
- Temporo-parietal and frontal activations were observed, particularly to happy faces.
- Infants looked longer to the mouth region of angry faces.
- No difference in behavior or neural activity was observed between 5- and 7-month-olds.
The processing of facial emotion is an important social skill that develops throughout infancy and early childhood. Here we investigate the neural underpinnings of the ability to process facial emotion across changes in facial identity in cross-sectional groups of 5- and 7-month-old infants. We simultaneously measured neural metabolic, behavioral, and autonomic responses to happy, fearful, and angry faces of different female models using functional near-infrared spectroscopy (fNIRS), eye-tracking, and heart rate measures. We observed significant neural activation to these facial emotions in a distributed set of frontal and temporal brain regions, and longer looking to the mouth region of angry faces compared to happy and fearful faces. No differences in looking behavior or neural activations were observed between 5- and 7-month-olds, although several exploratory, age-independent associations between neural activations and looking behavior were noted. Overall, these findings suggest more developmental stability than previously thought in responses to emotional facial expressions of varying identities between 5 and 7 months of age.
Affiliation(s)
- Laurie Bayet
- Boston Children's Hospital, Boston, MA, USA; Harvard Medical School, Boston, MA, USA
- Katherine L Perdue
- Boston Children's Hospital, Boston, MA, USA; Harvard Medical School, Boston, MA, USA
- Hannah F Behrendt
- Boston Children's Hospital, Boston, MA, USA; Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital RWTH Aachen, Aachen, Germany
- Charles A Nelson
- Boston Children's Hospital, Boston, MA, USA; Harvard Medical School, Boston, MA, USA; Harvard Graduate School of Education, Cambridge, MA, USA
9. Segal SC, Moulson MC. Dynamic Advances in Emotion Processing: Differential Attention towards the Critical Features of Dynamic Emotional Expressions in 7-Month-Old Infants. Brain Sci 2020;10:585. PMID: 32847037; PMCID: PMC7564740; DOI: 10.3390/brainsci10090585.
Abstract
Infants' visual processing of emotion undergoes significant development across the first year of life, yet our knowledge regarding the mechanisms underlying these advances is limited. Additionally, infant emotion processing is commonly examined using static faces, which do not accurately depict real-world emotional displays. The goal of this study was to characterize 7-month-olds' visual scanning strategies when passively viewing dynamic emotional expressions to examine whether infants modify their scanning patterns depending on the emotion. Eye-tracking measures revealed differential attention towards the critical features (eyes, mouth) of expressions. The eyes captured the greatest attention for angry and neutral faces, and the mouth captured the greatest attention for happy faces. A time-course analysis further elucidated at what point during the trial differential scanning patterns emerged. The current results suggest that 7-month-olds are sensitive to the critical features of emotional expressions and scan them differently depending on the emotion. The scanning patterns presented in this study may serve as a link to understanding how infants begin to differentiate between expressions in the context of emotion recognition.
10. Measuring the evolution of facial ‘expression’ using multi-species FACS. Neurosci Biobehav Rev 2020;113:1-11. DOI: 10.1016/j.neubiorev.2020.02.031.
11. Minio-Paluello I, Porciello G, Gandolfo M, Boukarras S, Aglioti SM. The enfacement illusion boosts facial mimicry. Cortex 2020;123:113-123. DOI: 10.1016/j.cortex.2019.10.001.
12. Barabanschikov V, Korolkova O. Perception of “Live” Facial Expressions. Experimental Psychology (Russia) 2020. DOI: 10.17759/exppsy.2020130305.
Abstract
The article provides a review of experimental studies of interpersonal perception based on static and dynamic facial expressions as a unique source of information about a person’s inner world. The focus is on the patterns of perception of a moving face embedded in the processes of communication and joint activity (an alternative to the most commonly studied perception of static images of a person outside of a behavioral context). The review covers four interrelated topics: facial statics and dynamics in the recognition of emotional expressions; the specificity of perceiving moving facial expressions; multimodal integration of emotional cues; and the generation and perception of facial expressions in communication. The analysis identifies the most promising areas of research on the face in motion. We show that the static and dynamic modes of facial perception complement each other, and describe the role of qualitative features of facial expression dynamics in assessing a person’s emotional state. Facial expression is considered part of a holistic multimodal manifestation of emotion. The importance of facial movements as an instrument of social interaction is emphasized.
13. Watching happy faces potentiates incentive salience but not hedonic reactions to palatable food cues in overweight/obese adults. Appetite 2018;133:83-92. PMID: 30367892; DOI: 10.1016/j.appet.2018.10.024.
Abstract
'Wanting' and 'liking' are mediated by distinct brain reward systems but their dissociation in human appetite and overeating remains debated. Further, the influence of socioemotional cues on food reward is little explored. We examined these issues in overweight/obese (OW/OB) and normal-weight (NW) participants who watched food images varying in palatability at the same time as video clips of avatars looking at the food images while displaying facial expressions (happy, disgust or neutral) with their gaze directed only toward the food or consecutively toward the food and participants. We measured heart rate (HR) deceleration as an index of attentional/incentive salience, facial EMG activity as an index of hedonic or disgust reactions, and self-reports of wanting and liking. OW/OB participants exhibited a larger HR deceleration to palatable food pictures than NW participants, suggesting that they attributed greater incentive salience to food cues. However, in contrast to NW participants, they did not display increased hedonic facial reactions to the liked food cues. Subjective ratings of wanting and liking did not differentiate the two groups. Further, OW/OB participants had more pronounced HR deceleration than NW participants to palatable food cues when they watched avatars' happy faces gazing at the food. In line with the "incentive-sensitization" hypothesis, our data suggest that incentive salience attribution, and not hedonic reactivity, is increased in OW/OB individuals and that happy faces, as social reward cues, potentiate implicit wanting in OW/OB people.