1. Morgenstern A. Children's multimodal language development from an interactional, usage-based, and cognitive perspective. Wiley Interdisciplinary Reviews: Cognitive Science 2023; 14:e1631. PMID: 36377962; DOI: 10.1002/wcs.1631.
Abstract
Through daily exposure to surrounding input structured in conversations, children's language gradually develops into rich linguistic constructions that contain multiple cross-modal elements subtly used together for rich communicative functions. Children demonstrate their skill in drawing on multiple semiotic resources in their daily interactions and expertly use them according to their expressive needs and communicative intents. Usage-based (Tomasello, 2003) and cognitive linguistics (Langacker, 1988) as well as construction grammar (Goldberg, 2006) have enriched our comprehension of the processes at work. These approaches need to be combined with gesture studies (Kendon, 1988; McNeill, 1992) and multimodal approaches (Andren, 2010; Morgenstern, 2014) to fully capture the orchestration of the semiotic resources at play (Cienki, 2012; Müller, 2009). But child language development cannot be understood outside its interactional, dialogic context (Bakhtin, 1981) and without taking into account the role of expert languagers (Vygotsky, 1934) in routines or formats (Bruner, 1975). The first section thus focuses extensively on a productive combination of theoretical approaches and methods that have been essential to understanding child language development; analyzing child language is in turn also necessary to ground socio-cognitive and interactional approaches to language. The salient features of the child's variably multimodal development are presented in the second section. The third section illustrates longitudinal pathways into multimodal languaging through detailed analyses of adult-child interactive sequences. This article is categorized under: Cognitive Biology > Cognitive Development; Computer Science and Robotics > Natural Language Processing; Linguistics > Language Acquisition; Linguistics > Cognitive Linguistics.
2
|
McDonald DQ, Zampella CJ, Sariyanidi E, Manakiwala A, DeJardin E, Herrington JD, Schultz RT, Tunç B. Head Movement Patterns during Face-to-Face Conversations Vary with Age. ICMI'22 COMPANION : COMPANION PUBLICATION OF THE 2022 INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION : NOVEMBER 7-11, 2022, BANGALORE, INDIA. ICMI (CONFERENCE) (2022 : BANGALORE, INDIA) 2022; 2022:185-195. [PMID: 37975062 PMCID: PMC10652276 DOI: 10.1145/3536220.3563366] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/19/2023]
Abstract
Advances in computational behavior analysis have the potential to increase our understanding of behavioral patterns and developmental trajectories in neurotypical individuals, as well as in individuals with mental health conditions marked by motor, social, and emotional difficulties. This study investigates how head movement patterns during face-to-face conversations vary with age from childhood through adulthood. We rely on computer vision techniques due to their suitability for analyzing social behaviors in naturalistic settings, since video data capture can be unobtrusively embedded within conversations between two social partners. The methods in this work include unsupervised learning for movement pattern clustering, and supervised classification and regression as a function of age. The results demonstrate that 3-minute video recordings of head movements during conversations show patterns that distinguish between participants who are younger vs. older than 12 years with 78% accuracy. Additionally, we extract the relevant patterns of head movement on which our models based the age distinction.
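The two-stage approach this abstract describes (unsupervised clustering of movement patterns, then supervised classification by age group) can be sketched roughly as below. The window length, head-pose features, cluster count, and model choices are illustrative assumptions on toy data, not the authors' implementation:

```python
# Sketch: cluster short head-pose windows, then classify age group from
# each participant's cluster-usage histogram. All data here are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_series(scale, n_frames=900):
    """Toy per-participant head-pose series (yaw, pitch, roll) at ~30 fps."""
    return rng.normal(0.0, scale, size=(n_frames, 3)).cumsum(axis=0)

participants = [make_series(0.5) for _ in range(20)] + [make_series(0.2) for _ in range(20)]
ages = np.array([8] * 20 + [20] * 20)  # toy ages; label = older than 12

def windows(series, w=30):
    """Non-overlapping 1-second windows, flattened into feature vectors."""
    return np.array([series[i:i + w].ravel() for i in range(0, len(series) - w, w)])

# Stage 1: unsupervised movement-pattern clustering over all windows.
all_windows = np.vstack([windows(s) for s in participants])
km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(all_windows)

# Stage 2: represent each participant as a normalized histogram of cluster
# usage, then classify younger (<=12) vs older (>12).
def histogram(series):
    labels = km.predict(windows(series))
    return np.bincount(labels, minlength=8) / len(labels)

X = np.array([histogram(s) for s in participants])
y = (ages > 12).astype(int)
clf = LogisticRegression(max_iter=1000).fit(X, y)
train_acc = clf.score(X, y)
```

In a real pipeline the windows would come from tracked head-pose angles and accuracy would be estimated with held-out participants, not training data.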
Affiliation(s)
- Aashvi Manakiwala: University of Pennsylvania, Philadelphia, PA, USA; Children's Hospital of Philadelphia, Philadelphia, PA, USA
- Ellis DeJardin: Children's Hospital of Philadelphia, Philadelphia, PA, USA
- John D Herrington: Children's Hospital of Philadelphia, Philadelphia, PA, USA; University of Pennsylvania, Philadelphia, PA, USA
- Robert T Schultz: Children's Hospital of Philadelphia, Philadelphia, PA, USA; University of Pennsylvania, Philadelphia, PA, USA
- Birkan Tunç: Children's Hospital of Philadelphia, Philadelphia, PA, USA; University of Pennsylvania, Philadelphia, PA, USA
3. Chester M, Plate RC, Powell T, Rodriguez Y, Wagner NJ, Waller R. The COVID-19 pandemic, mask-wearing, and emotion recognition during late childhood. Social Development 2022; 32:e12631. PMID: 36246541; PMCID: PMC9538546; DOI: 10.1111/sode.12631.
Abstract
Face masks are an effective and important tool to prevent the spread of COVID-19, including among children. However, occluding parts of the face can impact emotion recognition, which is fundamental to effective social interactions. Social distancing, stress, and changes to routines because of the pandemic have also altered the social landscape of children, with implications for social development. To better understand how social input and context impact emotion recognition, the current study investigated emotion recognition in children (7-12 years old, N = 131) using images of both masked and unmasked emotional faces. We also assessed a subsample of participants ("pre-pandemic subsample," n = 35) who had completed the same emotion recognition task with unmasked faces before and during the pandemic. Masking of faces was related to worse emotion recognition, with more pronounced effects for happy, sad, and fearful faces than angry and neutral faces. Masking was more strongly related to emotion recognition among children whose families reported greater social disruption in response to the pandemic. Finally, in the pre-pandemic subsample, emotion recognition of sad faces was lower during versus before the pandemic relative to other emotions. Together, findings show that occluding face parts and the broader social context (i.e., global pandemic) both impact emotion-relevant judgments in school-aged children.
Affiliation(s)
- Maia Chester: Department of Psychology, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Rista C. Plate: Department of Psychology, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Tralucia Powell: Department of Psychological and Brain Sciences, Boston University, Boston, Massachusetts, USA
- Yuheiry Rodriguez: Department of Psychology, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Nicholas J. Wagner: Department of Psychological and Brain Sciences, Boston University, Boston, Massachusetts, USA
- Rebecca Waller: Department of Psychology, University of Pennsylvania, Philadelphia, Pennsylvania, USA
4. Luo W, Berson IR, Berson MJ. Bi-directional emotional contagion: An analysis of Chinese parents' social media data. Computers and Education Open 2022. DOI: 10.1016/j.caeo.2022.100092.
5. Jones AC, Gutierrez R, Ludlow AK. Emotion production of facial expressions: A comparison of deaf and hearing children. Journal of Communication Disorders 2021; 92:106113. PMID: 34098333; DOI: 10.1016/j.jcomdis.2021.106113.
Abstract
The production of facial expressions is an important skill that allows children to share and adapt emotions during social interactions. While deaf children are reported to show delays in their social and emotion understanding, the way in which they produce facial expressions of emotions has been relatively unexplored. The present study investigated the production of facial expressions of emotions by young congenitally deaf children. Six facial expressions of emotions produced by 5 congenitally deaf children and 5 hearing children (control group) were filmed across three tasks: 1) voluntarily posing expressions of emotion, 2) responding to social stories, and 3) intentionally mimicking expressions of emotion. The recorded videos were analysed using software based on the Facial Action Coding System (FACS), and then judged by adult raters using two different scales: the emotion elicited (i.e. accuracy) and the intensity of the emotion produced. The results of both measurement scales showed that all children (deaf and hearing) were able to produce socially recognisable prototypical configurations of facial expressions. However, the deaf children were rated by adults as expressing their emotions with greater intensity compared to the hearing children. The results suggest deaf children may show more exaggerated facial expressions of emotion, possibly to avoid any ambiguity in communication.
Affiliation(s)
- A C Jones: Department of Psychology, Sport and Geography, University of Hertfordshire, School of Life and Medical Sciences, Hatfield, AL10 9AB, UK
- R Gutierrez: Department of Psychology, Sport and Geography, University of Hertfordshire, School of Life and Medical Sciences, Hatfield, AL10 9AB, UK
- A K Ludlow: Department of Psychology, Sport and Geography, University of Hertfordshire, School of Life and Medical Sciences, Hatfield, AL10 9AB, UK
6. Cordoni G, Favilli E, Palagi E. Earlier than previously thought: Yawn contagion in preschool children. Developmental Psychobiology 2021; 63:931-944. PMID: 33506489; DOI: 10.1002/dev.22094.
Abstract
Yawning is a primitive and stereotyped motor action involving orofacial, laryngeal, pharyngeal, thoracic and abdominal muscles. Contagious yawning, an involuntary action induced by viewing or listening to others' yawns, has been demonstrated in humans and several non-human species. Previous studies with humans showed that infants and preschool children, socially separated during video experiments, were not infected by others' yawns. Here, we tested the occurrence of yawn contagion in 129 preschool children (ranging from 2.5 to 5.5 years) belonging to five different classes by video recording them in their classrooms during ordinary school activities. As occurs in adult humans, children of all ages were infected by others' yawns within 2 min after perceiving the stimulus. Yawn contagion thus occurred earlier in development than previously thought. For children, it appears that a natural social setting is more conducive to yawn contagion than the inherently artificial experimental approach. Moreover, children's gender did not affect the level of contagious yawning. The neural, emotional and behavioural traits of preschool children are probably not sufficiently mature to express variability between boys and girls; nevertheless, children appeared to be already well equipped with the 'neural toolkit' necessary for expressing yawn contagion.
Affiliation(s)
- Giada Cordoni: Natural History Museum, University of Pisa, Calci, Pisa, Italy
- Elisabetta Palagi: Natural History Museum, University of Pisa, Calci, Pisa, Italy; Unit of Ethology, Department of Biology, University of Pisa, Pisa, Italy
7. Chen J, Zhang Y, Zhao G. The Qingdao Preschooler Facial Expression Set: Acquisition and validation of Chinese children's facial emotion stimuli. Frontiers in Psychology 2021; 11:554821. PMID: 33551893; PMCID: PMC7858654; DOI: 10.3389/fpsyg.2020.554821.
Abstract
Traditional research on emotion-face processing has primarily focused on the expression of basic emotions using adult emotional face stimuli. Stimulus sets featuring child faces or emotions other than basic emotions are rare. The current study describes the acquisition and evaluation of the Qingdao Preschooler Facial Expression (QPFE) set, a facial stimulus set with images featuring 54 Chinese preschoolers' emotion expressions. The set includes 712 standardized color photographs of six basic emotions (joy, fear, anger, sadness, surprise, and disgust), five discrete positive emotions (interest, contentment, relief, pride, and amusement), and a neutral expression. The validity of the pictures was examined based on 43 adult raters' online evaluation, including agreement between designated emotions and raters' labels, as well as intensity and representativeness scores. Overall, these data should contribute to the developmental and cross-cultural research on children's emotion expressions and provide insights for future research on positive emotions.
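The kind of validation summary this abstract describes (per-image agreement between the designated emotion and raters' labels, plus mean intensity) can be sketched as below. The records are invented for illustration and are not QPFE ratings:

```python
# Sketch of a per-image validation summary for a facial-stimulus set:
# agreement = share of raters whose label matches the designated emotion,
# mean_intensity = average of raters' intensity scores. Toy data only.

# Each record: (image_id, designated_emotion, rater_label, intensity)
ratings = [
    ("img01", "joy", "joy", 8), ("img01", "joy", "joy", 7), ("img01", "joy", "surprise", 6),
    ("img02", "fear", "fear", 5), ("img02", "fear", "surprise", 4), ("img02", "fear", "fear", 6),
]

def summarize(records):
    by_image = {}
    for img, target, label, intensity in records:
        entry = by_image.setdefault(img, {"target": target, "labels": [], "intensity": []})
        entry["labels"].append(label)
        entry["intensity"].append(intensity)
    out = {}
    for img, d in by_image.items():
        hits = sum(1 for label in d["labels"] if label == d["target"])
        out[img] = {
            "agreement": hits / len(d["labels"]),
            "mean_intensity": sum(d["intensity"]) / len(d["intensity"]),
        }
    return out

summary = summarize(ratings)
```

Images falling below an agreement or representativeness threshold would typically be excluded from the final set.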
Affiliation(s)
- Jie Chen: CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Yulin Zhang: CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Guozhen Zhao: CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
8. Yang Y. A preliminary evaluation of still face images by deep learning: A potential screening test for childhood developmental disabilities. Medical Hypotheses 2020; 144:109978. PMID: 32540607; DOI: 10.1016/j.mehy.2020.109978.
Abstract
Most developmental disorders are defined by their clinical symptoms, and many disorders share common features. The main objective of this research is to evaluate still facial images as a potential screening test for childhood developmental disabilities, free of any biases from the subjective judgments of human observers. Via supervised machine learning, a convolutional neural network (CNN) classifier was built using 908 facial images, half of which were photos of children labeled with "autism", a label that may include some developmental disorders with autism-like features. Face images were then generated for the two categories of photos. Above all, the most important discovery of this research is that face images labeled "autism" and normal controls populate two quite distinctive manifolds. Different patterns were found around the eyes and mouth in the photos generated by deep learning for the two categories of faces. This shows that supervised machine learning can extract facial features that could be applicable to improving early screening for childhood developmental disabilities from facial expression. A simple computer-based screening test of still face images may prove to be a useful adjunct in many clinical settings.
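A minimal sketch of the kind of CNN classifier this abstract describes, written as a plain NumPy forward pass with random weights so the architecture is visible end to end. The layer sizes, filter counts, and input resolution are illustrative assumptions, not the author's network, and no training is shown:

```python
# Architecture-only sketch of a small CNN for a binary face-image screen:
# one conv layer + ReLU + max-pooling + a sigmoid output unit.
import numpy as np

rng = np.random.default_rng(1)

def conv2d(x, kernels):
    """Valid 2-D convolution: x is (H, W), kernels is (n, k, k)."""
    n, k, _ = kernels.shape
    H, W = x.shape
    out = np.empty((n, H - k + 1, W - k + 1))
    for i in range(H - k + 1):
        for j in range(W - k + 1):
            patch = x[i:i + k, j:j + k]
            out[:, i, j] = (kernels * patch).sum(axis=(1, 2))
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, p=2):
    n, H, W = x.shape
    return x[:, :H - H % p, :W - W % p].reshape(n, H // p, p, W // p, p).max(axis=(2, 4))

def forward(image, kernels, w, b):
    feat = max_pool(relu(conv2d(image, kernels)))
    logits = feat.ravel() @ w + b
    return 1.0 / (1.0 + np.exp(-logits))  # sigmoid -> probability of class 1

image = rng.random((32, 32))          # toy grayscale face crop
kernels = rng.normal(size=(4, 3, 3))  # 4 filters (random here, learned in practice)
feat_dim = 4 * 15 * 15                # 32x32 -> 30x30 after conv -> 15x15 after pool
w = rng.normal(size=feat_dim) * 0.01
b = 0.0
p = forward(image, kernels, w, b)
```

A practical version would train the weights by backpropagation on labeled images (e.g. with a deep-learning framework) rather than use random filters.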
Affiliation(s)
- You Yang: Department of Developmental and Behavioral Pediatrics, Shanghai Children's Medical Center, Shanghai Jiaotong University School of Medicine, Shanghai 200127, PR China
9. Famelart N, Diene G, Çabal-Berthoumieu S, Glattard M, Molinas C, Guidetti M, Tauber M. Equivocal expression of emotions in children with Prader-Willi syndrome: what are the consequences for emotional abilities and social adjustment? Orphanet Journal of Rare Diseases 2020; 15:55. PMID: 32085791; PMCID: PMC7035757; DOI: 10.1186/s13023-020-1333-9.
Abstract
Background: People with Prader-Willi syndrome (PWS) experience great difficulties in social adaptation that could be explained by disturbances in emotional competencies. However, current knowledge about the emotional functioning of people with PWS is incomplete. In particular, despite being the foundation of social adaptation, their emotional expression abilities have never been investigated. In addition, motor and cognitive difficulties, characteristic of PWS, could further impair these abilities.
Method: To explore the expression abilities of children with PWS, twenty-five children with PWS aged 5 to 10 years were assessed for 1) their emotional facial reactions to a funny video-clip and 2) their ability to produce on demand the facial and bodily expressions of joy, anger, fear and sadness. Their productions were compared to those of two groups of children with typical development, matched to the PWS children by chronological age and by developmental age. The analyses focused on the proportion of expressive patterns relating to the target emotion and to untargeted emotions in the children's productions.
Results: The facial and bodily emotional expressions of children with PWS were particularly difficult to interpret, involving a pronounced mixture of different emotional patterns. In addition, the emotions produced on demand by the PWS children were particularly poor and equivocal.
Conclusions: As far as we know, this study is the first to highlight particularities in the expression of emotions in children with PWS. These results shed new light on emotional dysfunction in PWS and consequently on the adaptive abilities of those affected in daily life.
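The pattern-proportion analysis described in the Method can be sketched as follows; the coded labels are invented for illustration and are not the study's coding scheme:

```python
# Sketch: for one production, the share of coded expressive patterns that
# belong to the target emotion versus untargeted emotions. Toy labels only.
def pattern_proportions(coded_patterns, target):
    """coded_patterns: emotion labels assigned to the expressive patterns
    observed in one production (e.g. FACS-style codes mapped to emotions);
    target: the emotion the child was asked to produce."""
    total = len(coded_patterns)
    on_target = sum(1 for p in coded_patterns if p == target)
    return {
        "target": on_target / total,
        "untargeted": (total - on_target) / total,
    }

# A hypothetical "joy" production mixing joy, surprise, and fear patterns:
mix = pattern_proportions(["joy", "joy", "surprise", "fear"], target="joy")
```

A high "untargeted" share is what makes a production equivocal in the sense the abstract describes.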
Affiliation(s)
- Nawelle Famelart: CLLE, University of Toulouse, CNRS, Toulouse, France; Université Toulouse Jean Jaurès, Maison de la Recherche, Laboratoire CLLE, 5 allée Antonio Machado, 31058 Toulouse Cedex 9, France
- Gwenaelle Diene: Centre de Référence du Syndrome de Prader-Willi, CHU Toulouse, Toulouse, France
- Mélanie Glattard: Centre de Référence du Syndrome de Prader-Willi, CHU Toulouse, Toulouse, France
- Catherine Molinas: Centre de Référence du Syndrome de Prader-Willi, CHU Toulouse, Toulouse, France
- Maithe Tauber: Centre de Référence du Syndrome de Prader-Willi, CHU Toulouse, Toulouse, France; CPTP, University of Toulouse, CNRS, INSERM, Toulouse, France
10. Grossard C, Dapogny A, Cohen D, Bernheim S, Juillet E, Hamel F, Hun S, Bourgeois J, Pellerin H, Serret S, Bailly K, Chaby L. Children with autism spectrum disorder produce more ambiguous and less socially meaningful facial expressions: an experimental study using random forest classifiers. Molecular Autism 2020; 11:5. PMID: 31956394; PMCID: PMC6958757; DOI: 10.1186/s13229-020-0312-2.
Abstract
Background: Computer vision combined with human annotation could offer a novel method for exploring facial expression (FE) dynamics in children with autism spectrum disorder (ASD).
Methods: We recruited 157 children with typical development (TD) and 36 children with ASD in Paris and Nice to perform two experimental tasks producing FEs with emotional valence. FEs were explored through judges' ratings and through random forest (RF) classifiers. To do so, we located a set of 49 facial landmarks in the task videos, generated a set of geometric and appearance features, and used RF classifiers to explore how children with ASD differed from TD children when producing FEs.
Results: Using multivariate models including other factors known to predict FEs (age, gender, intellectual quotient, emotion subtype, cultural background), ratings from expert raters showed that children with ASD had more difficulty producing FEs than TD children. In addition, when we explored how the RF classifiers performed, we found that the classification tasks, except for sadness, were highly accurate, and that the RF classifiers needed more facial landmarks to achieve the best classification for children with ASD. Confusion matrices showed that when the RF classifiers were tested on children with ASD, anger was often confounded with happiness.
Limitations: The sample size of the group of children with ASD was lower than that of the group of TD children. By using several control calculations, we tried to compensate for this limitation.
Conclusion: Children with ASD have more difficulty producing socially meaningful FEs. The computer vision methods we used to explore FE dynamics also highlight that the production of FEs in children with ASD carries more ambiguity.
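The landmark-based RF classification step can be sketched as below, with simulated geometric features standing in for real landmark data; the feature construction, class set, and forest size are illustrative assumptions, not the study's pipeline:

```python
# Sketch: geometric features derived from 49 (x, y) facial landmarks fed to
# a random forest, then a confusion matrix over emotions. Simulated data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(42)
emotions = ["happiness", "anger", "sadness"]

def make_samples(center, n=60):
    """Toy feature vectors with the dimensionality of 49 (x, y) landmarks."""
    return rng.normal(center, 1.0, size=(n, 49 * 2))

# Three well-separated toy classes, one per emotion.
X = np.vstack([make_samples(c) for c in (0.0, 3.0, 6.0)])
y = np.repeat(np.arange(3), 60)

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
pred = rf.predict(X)
cm = confusion_matrix(y, pred)  # rows: true emotion, cols: predicted emotion
```

Off-diagonal cells of `cm` are where confusions such as the anger/happiness mix-up reported above would show up; a real evaluation would use held-out children, not training data.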
Affiliation(s)
- Charline Grossard: Service de Psychiatrie de l'Enfant et de l'Adolescent, GH Pitié-Salpêtrière Charles Foix, APHP.6, Paris, France; Institut des Systèmes Intelligents et de Robotique, Sorbonne Université, ISIR CNRS UMR 7222, Paris, France
- Arnaud Dapogny: Institut des Systèmes Intelligents et de Robotique, Sorbonne Université, ISIR CNRS UMR 7222, Paris, France
- David Cohen: Service de Psychiatrie de l'Enfant et de l'Adolescent, GH Pitié-Salpêtrière Charles Foix, APHP.6, Paris, France; Institut des Systèmes Intelligents et de Robotique, Sorbonne Université, ISIR CNRS UMR 7222, Paris, France
- Sacha Bernheim: Institut des Systèmes Intelligents et de Robotique, Sorbonne Université, ISIR CNRS UMR 7222, Paris, France
- Estelle Juillet: Service de Psychiatrie de l'Enfant et de l'Adolescent, GH Pitié-Salpêtrière Charles Foix, APHP.6, Paris, France
- Fanny Hamel: Service de Psychiatrie de l'Enfant et de l'Adolescent, GH Pitié-Salpêtrière Charles Foix, APHP.6, Paris, France
- Hugues Pellerin: Service de Psychiatrie de l'Enfant et de l'Adolescent, GH Pitié-Salpêtrière Charles Foix, APHP.6, Paris, France
- Kevin Bailly: Institut des Systèmes Intelligents et de Robotique, Sorbonne Université, ISIR CNRS UMR 7222, Paris, France
- Laurence Chaby: Service de Psychiatrie de l'Enfant et de l'Adolescent, GH Pitié-Salpêtrière Charles Foix, APHP.6, Paris, France; Institut des Systèmes Intelligents et de Robotique, Sorbonne Université, ISIR CNRS UMR 7222, Paris, France; Institut de Psychologie, Université de Paris, 92100 Boulogne-Billancourt, France
11.
Abstract
Social cognition refers to a complex set of mental abilities underlying social stimulus perception, processing, interpretation, and response. Together, these abilities support the development of adequate social competence and adaptation. Social cognition has a protracted development through infancy to adulthood. Given the preponderance of social dysfunctions across neurologic conditions, social cognition is now recognized as a core domain of functioning that warrants clinical attention. This chapter provides an overview of the construct of social cognition, defines some of the most clinically significant sociocognitive abilities (face processing, facial expression processing, joint attention, theory of mind, empathy, and moral processing), and introduces the neural networks and frameworks associated with these abilities. Broad principles for understanding the development of social cognition are presented, and a summary of normative developmental milestones of clinically relevant sociocognitive abilities is proposed. General guidelines for sound social cognition assessment in children and adolescents are summarized.
Affiliation(s)
- Cindy Beaudoin: Research Centre, Centre Hospitalier Universitaire Sainte-Justine, Department of Psychology, Université de Montréal, Montréal, QC, Canada
- Miriam H Beauchamp: Research Centre, Centre Hospitalier Universitaire Sainte-Justine, Department of Psychology, Université de Montréal, Montréal, QC, Canada
12. Mimicking others' nonverbal signals is associated with increased attitude contagion. Journal of Nonverbal Behavior 2019. DOI: 10.1007/s10919-019-00322-1.
13. Emotional expressions with minimal facial muscle actions. Report 1: Cues and targets. Current Psychology 2019. DOI: 10.1007/s12144-019-0151-5.