1
Troncoso A, Soto V, Gomila A, Martínez-Pernía D. Moving beyond the lab: investigating empathy through the Empirical 5E approach. Front Psychol 2023; 14:1119469. PMID: 37519389; PMCID: PMC10374225; DOI: 10.3389/fpsyg.2023.1119469.
Abstract
Empathy is a complex and multifaceted phenomenon that plays a crucial role in human social interactions. Recent developments in social neuroscience have provided valuable insights into the neural underpinnings and bodily mechanisms of empathy. These methods typically prioritize precision, replicability, internal validity, and confound control. However, fully understanding the complexity of empathy seems unattainable when relying solely on artificial, tightly controlled laboratory settings and overlooking a comprehensive, ecological view of empathy. In this article, we propose an integrative theoretical and methodological framework based on the 5E approach (the "E"s stand for the embodied, embedded, enacted, emotional, and extended perspectives on empathy), highlighting the relevance of studying empathy as an active interaction between embodied agents embedded in a shared real-world environment. In addition, we illustrate how a novel multimodal approach, combining mobile brain and body imaging (MoBI) with phenomenological methods and the implementation of interactive paradigms in natural contexts, is an adequate procedure for studying empathy from the 5E perspective. In doing so, we present the Empirical 5E approach (E5E) as an integrative scientific framework to bridge brain/body and phenomenological attributes in an interbody interactive setting. Progressing toward an E5E approach can be crucial to understanding empathy in accordance with the complexity of how it is experienced in the real world.
Affiliation(s)
- Alejandro Troncoso
- Center for Social and Cognitive Neuroscience, School of Psychology, Adolfo Ibáñez University, Santiago, Chile
- Vicente Soto
- Center for Social and Cognitive Neuroscience, School of Psychology, Adolfo Ibáñez University, Santiago, Chile
- Antoni Gomila
- Department of Psychology, University of the Balearic Islands, Palma de Mallorca, Spain
- David Martínez-Pernía
- Center for Social and Cognitive Neuroscience, School of Psychology, Adolfo Ibáñez University, Santiago, Chile

2
Hepach R, Gerdemann SC. How "peer-fear" of others' evaluations can regulate young children's cooperation. Behav Brain Sci 2023; 46:e64. PMID: 37154366; DOI: 10.1017/s0140525x22001893.
Abstract
Children's cooperation with peers undergoes substantial developmental changes between 3 and 10 years of age. Here we stipulate that young children's initial fearfulness of peers' behaviour develops into older children's fearfulness of peers' evaluations of their own behaviour. Cooperation may constitute an adaptive environment in which the expressions of fear and self-conscious emotions regulate the quality of children's peer relationships.
Affiliation(s)
- Robert Hepach
- Department of Experimental Psychology, University of Oxford, Oxford OX2 6GG, UK; https://www.psy.ox.ac.uk/people/robert-hepach
- Stella Claire Gerdemann
- Department of Experimental Psychology, University of Oxford, Oxford OX2 6GG, UK
- Department of Early Child Development, Leipzig University, 04109 Leipzig, Germany; https://www.lfe.uni-leipzig.de/en/employee/stella-gerdemann-2/

3
Grossmann T. Extending and refining the fearful ape hypothesis. Behav Brain Sci 2023; 46:e81. PMID: 37154374; DOI: 10.1017/s0140525x22002837.
Abstract
The fearful ape hypothesis (FAH) presents an evolutionary-developmental framework stipulating that heightened fearfulness was adaptive in the context of cooperative caregiving, which is unique to human great ape group life. This is because, from early in human ontogeny, fearfulness, as expressed and perceived, enhanced care-based responding and cooperation with mothers and others. This response extends and refines the FAH by incorporating the commentaries' suggestions and additional lines of empirical work, providing a more comprehensive and nuanced version of the FAH. Specifically, it encourages, and hopes to inspire, cross-species, cross-cultural, and longitudinal work elucidating the evolutionary and developmental functions of fear in context. Beyond fear, it can be seen as a call for an evolutionary-developmental approach to affective science.
Affiliation(s)
- Tobias Grossmann
- Department of Psychology, University of Virginia, Charlottesville, VA 22904, USA

4
Ward IL, Raven EP, de la Rosa S, Jones DK, Teufel C, von dem Hagen E. White matter microstructure in face and body networks predicts facial expression and body posture perception across development. Hum Brain Mapp 2023; 44:2307-2322. PMID: 36661194; PMCID: PMC10028674; DOI: 10.1002/hbm.26211.
Abstract
Facial expression and body posture recognition have protracted developmental trajectories. Interactions between face and body perception, such as the influence of body posture on facial expression perception, also change with development. While the brain regions underpinning face and body processing are well-defined, little is known about how white-matter tracts linking these regions relate to perceptual development. Here, we obtained complementary diffusion magnetic resonance imaging (MRI) measures (fractional anisotropy [FA], spherical mean Ṧμ), and a quantitative MRI myelin-proxy measure (R1), within white-matter tracts of face- and body-selective networks in children and adolescents and related these to perceptual development. In tracts linking occipital and fusiform face areas, facial expression perception was predicted by age-related maturation, as measured by Ṧμ and R1, as well as age-independent individual differences in microstructure, captured by FA and R1. Tract microstructure measures linking posterior superior temporal sulcus body region with anterior temporal lobe (ATL) were related to the influence of body on facial expression perception, supporting ATL as a site of face and body network convergence. Overall, our results highlight age-dependent and age-independent constraints that white-matter microstructure poses on perceptual abilities during development and the importance of complementary microstructural measures in linking brain structure and behaviour.
Affiliation(s)
- Isobel L. Ward
- Cardiff University Brain Research Imaging Centre, School of Psychology, Cardiff University, Cardiff, UK
- Erika P. Raven
- Cardiff University Brain Research Imaging Centre, School of Psychology, Cardiff University, Cardiff, UK
- Center for Biomedical Imaging, Department of Radiology, New York University Grossman School of Medicine, New York, New York, USA
- Derek K. Jones
- Cardiff University Brain Research Imaging Centre, School of Psychology, Cardiff University, Cardiff, UK
- Christoph Teufel
- Cardiff University Brain Research Imaging Centre, School of Psychology, Cardiff University, Cardiff, UK
- Elisabeth von dem Hagen
- Cardiff University Brain Research Imaging Centre, School of Psychology, Cardiff University, Cardiff, UK

5
Geangu E, Vuong QC. Seven-months-old infants show increased arousal to static emotion body expressions: Evidence from pupil dilation. Infancy 2023. PMID: 36917082; DOI: 10.1111/infa.12535.
Abstract
Human body postures provide perceptual cues that can be used to discriminate and recognize emotions. It was previously found that 7-month-olds' fixation patterns discriminated fear from other emotion body expressions, but it is not clear whether infants also process the emotional content of those expressions. The emotional content of visual stimuli can increase arousal level, resulting in pupil dilation. To provide evidence that infants also process the emotional content of expressions, we analyzed variations in pupil size in response to emotion stimuli. Forty-eight 7-month-old infants viewed adult body postures expressing anger, fear, happiness, and neutral expressions while their pupil size was measured. There was a significant emotion effect between 1040 and 1640 ms after image onset, when fear elicited larger pupil dilations than neutral expressions. A similar trend was found for anger expressions. Our results suggest that infants show increased arousal to negative-valence body expressions. Thus, in combination with previous fixation results, the pupil data show that infants as young as 7 months can perceptually discriminate static body expressions and process their emotional content. The results extend information about infant processing of emotion expressions conveyed through other means (e.g., faces).
Affiliation(s)
- Elena Geangu
- Department of Psychology, University of York, York, UK
- Quoc C Vuong
- Biosciences Institute and School of Psychology, Newcastle University, Newcastle upon Tyne, UK

6
Aran Ö, Garcia SE, Hankin BL, Hyde DC, Davis EP. Signatures of emotional face processing measured by event-related potentials in 7-month-old infants. Dev Psychobiol 2023; 65:e22361. PMID: 36811377; PMCID: PMC9978929; DOI: 10.1002/dev.22361.
Abstract
The ability to distinguish facial emotions emerges in infancy. Although this ability has been shown to emerge between 5 and 7 months of age, the literature is less clear regarding the extent to which neural correlates of perception and attention play a role in the processing of specific emotions. This study's main goal was to examine this question among infants. To this end, we presented angry, fearful, and happy faces to 7-month-old infants (N = 107, 51% female) while recording event-related brain potentials. The perceptual N290 component showed a heightened response for fearful and happy relative to angry faces. Attentional processing, indexed by the P400, showed some evidence of a heightened response for fearful relative to happy and angry faces. We did not observe robust differences by emotion in the negative central (Nc) component, although trends were consistent with previous work suggesting a heightened response to negatively valenced expressions. Results suggest that perceptual (N290) and attentional (P400) processing is sensitive to emotions in faces, but these processes do not provide evidence for a fear-specific bias across components.
Affiliation(s)
- Özlü Aran
- Department of Psychology, University of Denver
- Sarah E. Garcia
- Department of Psychiatry, Columbia University Irving Medical Center
- Daniel C. Hyde
- Department of Psychology, University of Illinois at Urbana-Champaign
- Elysia Poggi Davis
- Department of Psychology, University of Denver
- Department of Pediatrics, University of California, Irvine

7
Calbi M, Montalti M, Pederzani C, Arcuri E, Umiltà MA, Gallese V, Mirabella G. Emotional body postures affect inhibitory control only when task-relevant. Front Psychol 2022; 13:1035328. PMID: 36405118; PMCID: PMC9669573; DOI: 10.3389/fpsyg.2022.1035328.
Abstract
A classical theoretical frame for interpreting motor reactions to emotional stimuli is that such stimuli, particularly threat-related ones, are processed preferentially, i.e., they capture attention automatically. Research has recently challenged this view, showing that the task relevance of emotional stimuli is crucial for obtaining a reliable behavioral effect. Such evidence indicated that emotional facial expressions do not automatically influence motor responses in healthy young adults; they do so only when intrinsically pertinent to the subject's ongoing goals. Given the theoretical relevance of these findings, it is essential to assess their generalizability to different, socially relevant emotional stimuli such as emotional body postures. To address this issue, we compared the performance of 36 right-handed participants in two versions of a Go/No-go task. In the Emotional Discrimination task, participants were required to withhold their responses at the display of emotional body postures (fearful or happy) and to move at the presentation of neutral postures. By contrast, in the control task, the same images were shown, but participants had to respond according to the color of the actor's or actress's t-shirt, disregarding the emotional content. Results showed that participants made more commission errors (instances in which they moved even though the No-go signal was presented) for happy than for fearful body postures in the Emotional Discrimination task. This difference disappeared in the control task. Such evidence indicates that, like emotional facial expressions, emotional body postures do not influence motor control automatically, but only when they are task-relevant.
Affiliation(s)
- Marta Calbi
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Lab Neuroscience & Humanities, University of Parma, Parma, Italy
- Department of Philosophy, State University of Milan, Milan, Italy
- Martina Montalti
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Lab Neuroscience & Humanities, University of Parma, Parma, Italy
- Department of Clinical and Experimental Sciences, University of Brescia, Brescia, Italy
- Carlotta Pederzani
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Edoardo Arcuri
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Lab Neuroscience & Humanities, University of Parma, Parma, Italy
- Maria Alessandra Umiltà
- Lab Neuroscience & Humanities, University of Parma, Parma, Italy
- Department of Food and Drug Sciences, University of Parma, Parma, Italy
- Vittorio Gallese
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Lab Neuroscience & Humanities, University of Parma, Parma, Italy
- Giovanni Mirabella
- Department of Clinical and Experimental Sciences, University of Brescia, Brescia, Italy
- IRCCS Neuromed, Pozzilli, Italy

8
Song S, Wu M, Feng C. Early Influence of Emotional Scenes on the Encoding of Fearful Expressions With Different Intensities: An Event-Related Potential Study. Front Hum Neurosci 2022; 16:866253. PMID: 35652009; PMCID: PMC9150066; DOI: 10.3389/fnhum.2022.866253.
Abstract
Contextual affective information influences the processing of facial expressions at relatively early stages of face processing, but the effect of context on the processing of facial expressions of varying intensity remains unclear. In this study, we investigated the influence of emotional scenes (fearful, happy, and neutral) on the processing of fear expressions at different levels of intensity (high, medium, and low) during the early stages of facial recognition, using event-related potential (ERP) technology. EEG data were collected while participants performed a fearful facial expression recognition task. The results showed that (1) recognition of high-intensity fear expressions was better than that of medium- and low-intensity fear expressions, and facial expression recognition was best when faces appeared in fearful scenes; and (2) emotional scenes modulated the N170 amplitudes for fear expressions of different intensities. Specifically, the N170 amplitude induced by high-intensity fear expressions was significantly higher than that induced by low-intensity fear expressions when faces appeared in neutral and fearful scenes. No significant differences were found between the N170 amplitudes induced by high-, medium-, and low-intensity fear expressions when faces appeared in happy scenes. These results suggest that individuals may tend to allocate attentional resources to the processing of face information when the valence of the emotional context and the expression do not conflict, i.e., when the conflict is absent (fearful scene and fearful faces) or low (neutral scene and fearful faces).
Affiliation(s)
- Sutao Song
- School of Information Science and Engineering, Shandong Normal University, Jinan, China
- School of Education and Psychology, University of Jinan, Jinan, China
- Meiyun Wu
- State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China
- Chunliang Feng
- Key Laboratory of Brain, Cognition and Education Sciences, Ministry of Education, School of Psychology, Center for Studies of Psychological Application, Guangdong Key Laboratory of Mental Health and Cognitive Science, South China Normal University, Guangzhou, China

10
Pollux PM. Age-of-actor effects in body expression recognition of children. Acta Psychol (Amst) 2021; 220:103421. PMID: 34564027; DOI: 10.1016/j.actpsy.2021.103421.
Abstract
Investigations of developmental trajectories for emotion recognition suggest that both face and body expression recognition increase rapidly in early childhood and reach adult levels of performance near the age of ten. So far, little is known about whether children's ability to recognise body expressions is influenced by the age of the person they are observing. This question is investigated here by presenting 119 children and 42 young adults with videos of children, young adults, and older adults expressing emotions with their whole body. The results revealed an own-age advantage for children, reflected in adult-level accuracy for videos of children for most expressions but reduced accuracy for videos of older adults. Children's recognition of older adults' expressions was not correlated with children's estimated amount of contact with older adults, and support for potential influences of social biases on performance measures was minimal. The own-age advantage was explained in terms of children's reduced familiarity with the body expressions of older adults, due to aging-related changes in the kinematic characteristics of movements and potentially to stronger embodiment of other children's bodily movements.
11
Recognizing emotions in bodies: Vagus nerve stimulation enhances recognition of anger while impairing sadness. Cogn Affect Behav Neurosci 2021; 21:1246-1261. PMID: 34268714; PMCID: PMC8563521; DOI: 10.3758/s13415-021-00928-3.
Abstract
According to the Polyvagal theory, the vagus nerve is the key phylogenetic substrate that supports efficient emotion recognition for promoting safety and survival. Previous studies showed that the vagus nerve affects people's ability to recognize emotions based on eye regions and whole facial images, but not static bodies. The purpose of this study was to verify whether the previously suggested causal link between vagal activity and emotion recognition can be generalized to situations in which emotions must be inferred from images of whole moving bodies. We employed transcutaneous vagus nerve stimulation (tVNS), a noninvasive brain stimulation technique that stimulates the vagus nerve by a mild electrical stimulation to the auricular branch of the vagus, located in the anterior protuberance of the outer ear. In two sessions, participants received active or sham tVNS before and while performing three emotion recognition tasks, aimed at indexing their ability to recognize emotions from static or moving bodily expressions by actors. Active tVNS, compared to sham stimulation, enhanced the recognition of anger but reduced the ability to recognize sadness, regardless of the type of stimulus (static vs. moving). Convergent with the idea of hierarchical involvement of the vagus in establishing safety, as put forward by the Polyvagal theory, we argue that our findings may be explained by vagus-evoked differential adjustment strategies to emotional expressions. Taken together, our findings fit with an evolutionary perspective on the vagus nerve and its involvement in emotion recognition for the benefit of survival.
12
Ryan NP, Greenham M, Gordon AL, Ditchfield M, Coleman L, Cooper A, Crowe L, Hunt RW, Monagle P, Mackay MT, Anderson V. Social Cognitive Dysfunction Following Pediatric Arterial Ischemic Stroke: Evidence From a Prospective Cohort Study. Stroke 2021; 52:1609-1617. PMID: 33827249; DOI: 10.1161/strokeaha.120.032955.
Abstract
[Figure: see text].
Affiliation(s)
- Nicholas P Ryan
- Clinical Sciences, Murdoch Children's Research Institute, Melbourne, Australia
- School of Psychology, Deakin University, Geelong, Australia
- Department of Paediatrics, The University of Melbourne, Australia
- Mardee Greenham
- Clinical Sciences, Murdoch Children's Research Institute, Melbourne, Australia
- Anne L Gordon
- Paediatric Neuroscience Department, Evelina London Children's Hospital, Guy's & St Thomas' NHS Foundation Trust, London, United Kingdom
- Department of Population Health Sciences, Kings College London, United Kingdom
- Michael Ditchfield
- Paediatric Imaging, Monash Children's Hospital, Melbourne, Australia
- Department of Radiology and Paediatrics, Monash University, Melbourne, Australia
- Lee Coleman
- Clinical Sciences, Murdoch Children's Research Institute, Melbourne, Australia
- Department of Medical Imaging, The Royal Children's Hospital, Melbourne, Australia
- Anna Cooper
- Clinical Sciences, Murdoch Children's Research Institute, Melbourne, Australia
- Department of Paediatrics, The University of Melbourne, Australia
- Louise Crowe
- Clinical Sciences, Murdoch Children's Research Institute, Melbourne, Australia
- Rod W Hunt
- Clinical Sciences, Murdoch Children's Research Institute, Melbourne, Australia
- Department of Paediatrics, The University of Melbourne, Australia
- Department of Neonatal Medicine, The Royal Children's Hospital, Melbourne, Australia
- Paul Monagle
- Clinical Sciences, Murdoch Children's Research Institute, Melbourne, Australia
- Department of Paediatrics, The University of Melbourne, Australia
- Department of Haematology, The Royal Children's Hospital, Melbourne, Australia
- Mark T Mackay
- Clinical Sciences, Murdoch Children's Research Institute, Melbourne, Australia
- Department of Paediatrics, The University of Melbourne, Australia
- Department of Neurology, The Royal Children's Hospital, Melbourne, Australia
- Vicki Anderson
- Clinical Sciences, Murdoch Children's Research Institute, Melbourne, Australia
- Department of Paediatrics, The University of Melbourne, Australia
- Department of Psychology, The Royal Children's Hospital, Melbourne, Australia

13
Della-Torre ME, Zavagno D, Actis-Grosso R. The Interpretation of E-Motions in Faces and Bodies Derived from Static Artworks by Individuals with High Functioning Autistic Spectrum. Vision (Basel) 2021; 5:17. PMID: 33805957; PMCID: PMC8103258; DOI: 10.3390/vision5020017.
Abstract
E-motions are defined as those affective states whose expressions, conveyed either by static faces or by body postures, embody a dynamic component and, consequently, convey a higher sense of dynamicity than other emotional expressions. An experiment is presented that tested whether e-motions are perceived as such by individuals with autism spectrum disorders (ASDs), a condition associated with impairments in emotion recognition and in motion perception. To this aim, we replicated with ASD individuals a study originally conducted with typically developed individuals (TDs), showing both ASD and TD participants 14 bodiless heads and 14 headless bodies taken from eleven static artworks and four drawings. The experiment was divided into two sessions. In Session 1, participants were asked to freely associate each stimulus with an emotion or an affective state (Task 1, option A); if they were unable to find a specific emotion, the experimenter showed them a list of eight possible emotions (words) and asked them to choose the one that best described the affective state portrayed in the image (Task 1, option B). After their choice, they were asked to rate the intensity of the perceived emotion on a seven-point Likert scale (Task 2). In Session 2, participants were requested to evaluate the degree of dynamicity conveyed by each stimulus on a seven-point Likert scale. Results showed that ASDs and TDs shared a similar range of verbal expressions defining emotions; however, ASDs (i) showed an impairment in the ability to spontaneously assign an emotion to a headless body, and (ii) more frequently used terms denoting negative emotions (for both faces and bodies) compared with neutral emotions, which in turn were more frequently used by TDs. No difference emerged between the two groups for positive emotions, with happiness being the emotion best recognized in both faces and bodies. Although overall there were no significant differences between the two groups in the emotions assigned to the images or in the degree of perceived dynamicity, the Artwork × Group interaction showed that for some images ASDs assigned a different dynamicity value than TDs. Moreover, two images were interpreted by ASDs as conveying completely different emotions from those perceived by TDs. Results are discussed in light of the ability of ASDs to resolve ambiguity, and of the possibly different cognitive styles characterizing their aesthetic/emotional experience.
Affiliation(s)
- Rossana Actis-Grosso
- Department of Psychology, Università di Milano-Bicocca, 20126 Milano, Italy (M.E.D.-T., D.Z.)

14
Meaning before grammar: A review of ERP experiments on the neurodevelopmental origins of semantic processing. Psychon Bull Rev 2020; 27:441-464. PMID: 31950458; DOI: 10.3758/s13423-019-01677-8.
Abstract
According to traditional linguistic theories, the construction of complex meanings relies firmly on syntactic structure-building operations. Recently, however, new models have been proposed in which semantics is viewed as being partly autonomous from syntax. In this paper, we discuss some of the developmental implications of syntax-based and autonomous models of semantics. We review event-related brain potential (ERP) studies on semantic processing in infants and toddlers, focusing on experiments reporting modulations of N400 amplitudes using visual or auditory stimuli and different temporal structures of trials. Our review suggests that infants can relate or integrate semantic information from temporally overlapping stimuli across modalities by 6 months of age. The ability to relate or integrate semantic information over time, within and across modalities, emerges by 9 months. The capacity to relate or integrate information from spoken words in sequences and sentences appears by 18 months. We also review behavioral and ERP studies showing that grammatical and syntactic processing skills develop only later, between 18 and 32 months. These results provide preliminary evidence for the availability of some semantic processes prior to the full developmental emergence of syntax: non-syntactic meaning-building operations are available to infants, albeit in restricted ways, months before the abstract machinery of grammar is in place. We discuss this hypothesis in light of research on early language acquisition and human brain development.
15
Krol KM, Grossmann T. Impression Formation in the Human Infant Brain. Cereb Cortex Commun 2020; 1:tgaa070. PMID: 33134930; PMCID: PMC7592636; DOI: 10.1093/texcom/tgaa070.
Abstract
Forming an impression of another person is an essential aspect of human social cognition linked to medial prefrontal cortex (mPFC) function in adults. The current study examined the neurodevelopmental origins of impression formation by testing the hypothesis that infants rely on processes localized in mPFC when forming impressions about individuals who appear friendly or threatening. Infants’ brain responses were measured using functional near-infrared spectroscopy while watching 4 different face identities displaying either smiles or frowns directed toward or away from them (N = 77). This was followed by a looking preference test for these face identities (now displaying a neutral expression) using eye-tracking. Our results show that infants’ mPFC responses distinguish between smiling and frowning faces when directed at them and that these responses predicted their subsequent person preferences. This suggests that the mPFC is involved in impression formation in human infants, attesting to the early ontogenetic emergence of brain systems supporting person perception and adaptive behavior.
Affiliation(s)
- Kathleen M Krol: Department of Psychology, University of Virginia, Charlottesville, VA 22903, USA
- Tobias Grossmann: Department of Psychology, University of Virginia, Charlottesville, VA 22903, USA
16. Visual exploration of emotional body language: a behavioural and eye-tracking study. Psychol Res 2020; 85:2326-2339. [PMID: 32920675] [DOI: 10.1007/s00426-020-01416-y]
Abstract
Bodily postures are essential for correctly comprehending others' emotions and intentions. Nonetheless, few studies have focused on the pattern of eye movements involved in the recognition of emotional body language (EBL), and those that have report significant differences between emotions. A still-unanswered question concerns the presence of the "left-gaze bias" (i.e. the tendency to look first, to make more fixations, and to spend more looking time on the left side of centrally presented stimuli) while scanning bodies. Hence, the present study aims to explore both the presence of a left-gaze bias and the modulation of EBL visual exploration mechanisms, by investigating participants' fixation patterns (number of fixations and latency of the first fixation) while they judged the emotional intensity of static bodily postures (Angry, Happy and Neutral, without the head). While results on the latency of first fixations demonstrate for the first time the presence of the left-gaze bias while scanning bodies, suggesting that it could be related to the stronger expressiveness of the left hand (from the observer's point of view), results on the number of fixations only partially support our hypothesis. Moreover, Angry and Happy bodily postures showed opposite viewing patterns. In sum, by integrating the spatial and temporal dimensions of gaze exploration patterns, the present results shed new light on EBL visual exploration mechanisms.
17. Geangu E, Vuong QC. Look up to the body: An eye-tracking investigation of 7-months-old infants' visual exploration of emotional body expressions. Infant Behav Dev 2020; 60:101473. [PMID: 32739668] [DOI: 10.1016/j.infbeh.2020.101473]
Abstract
The human body is an important source of information to infer a person's emotional state. Research with adult observers indicates that the posture of the torso, arms and hands provides important perceptual cues for recognising anger, fear and happy expressions. Much less is known about whether infants process body regions differently for different body expressions. To address this issue, we used eye tracking to investigate whether infants' visual exploration patterns differed when viewing body expressions. Forty-eight 7-month-old infants were randomly presented with static images of adult female bodies expressing anger, fear and happiness, as well as an emotionally neutral posture. Facial cues to emotional state were removed by masking the faces. We measured the proportion of looking time, proportion and number of fixations, and duration of fixations on the head, upper body and lower body regions for the different expressions. We showed that infants explored the upper body more than the lower body. Importantly, infants at this age fixated differently on different body regions depending on the expression of the body posture. In particular, infants spent a larger proportion of their looking time and had longer fixation durations on the upper body for fear relative to the other expressions. These results replicate and extend previous findings on infant processing of emotional expressions displayed by human bodies, and they support the hypothesis that infants' visual exploration of human bodies is driven by the upper body.
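The per-region measures reported here (proportion of looking time, number of fixations, fixation duration) can be derived straightforwardly from fixation records. The sketch below is illustrative, not the authors' analysis code; the region names and sample fixations are assumptions.

```python
# Minimal sketch: per-region looking measures from a list of fixations.
# Each fixation is (region_label, duration_ms); regions are assumptions.
from collections import defaultdict

def region_measures(fixations):
    """Return proportion of looking time, fixation count, and mean
    fixation duration for each region in one trial."""
    total = sum(d for _, d in fixations)
    by_region = defaultdict(list)
    for region, dur in fixations:
        by_region[region].append(dur)
    return {
        r: {
            "prop_looking_time": sum(durs) / total,
            "n_fixations": len(durs),
            "mean_fix_duration": sum(durs) / len(durs),
        }
        for r, durs in by_region.items()
    }

# Hypothetical trial: 1100 ms of looking, 700 ms of it on the upper body.
trial = [("upper_body", 400), ("head", 250), ("upper_body", 300),
         ("lower_body", 150)]
stats = region_measures(trial)
print(stats["upper_body"]["n_fixations"])  # 2
```

Averaging such per-trial dictionaries across trials and infants would then yield condition-level comparisons like the fear-versus-other-expressions contrast described above.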
18. Jessen S, Grossmann T. The developmental origins of subliminal face processing. Neurosci Biobehav Rev 2020; 116:454-460. [PMID: 32659286] [DOI: 10.1016/j.neubiorev.2020.07.003]
Abstract
Sensitive responding to facial information is of key importance during human social interactions. Research shows that adults glean much information from another person's face without conscious perception, attesting to the robustness of face processing in the service of adaptive social functioning. Until recently, it was unclear whether such subliminal face processing is an outcome of extensive learning, resulting in adult face processing skills, or an early defining feature of human face processing. Here, we review recent research examining the early ontogeny and brain correlates of subliminal face processing, demonstrating that subliminal face processing: (1) emerges during the first year of life; (2) is multifaceted in response to transient (gaze, emotion) and stable (trustworthiness) facial cues; (3) systematically elicits frontal brain responses linked to attention allocation. The synthesized research suggests that subliminal face processing emerges early in human development and thus may play a foundational role during human social interactions. This offers a fresh look at the ontogenetic origins of unconscious face processing and informs theoretical accounts of human sociality.
Affiliation(s)
- Sarah Jessen: Department of Neurology, University of Lübeck, Germany
- Tobias Grossmann: Department of Psychology, University of Virginia, 310 Gilmer Hall, 485 McCormick Rd, Charlottesville, VA 22903, USA; Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
19. Ross P, Atkinson AP. Expanding Simulation Models of Emotional Understanding: The Case for Different Modalities, Body-State Simulation Prominence, and Developmental Trajectories. Front Psychol 2020; 11:309. [PMID: 32194476] [PMCID: PMC7063097] [DOI: 10.3389/fpsyg.2020.00309]
Abstract
Recent models of emotion recognition suggest that when people perceive an emotional expression, they partially activate the respective emotion in themselves, providing a basis for the recognition of that emotion. Much of the focus of these models and of their evidential basis has been on sensorimotor simulation as a basis for facial expression recognition - the idea, in short, that coming to know what another feels involves simulating in your brain the motor plans and associated sensory representations engaged by the other person's brain in producing the facial expression that you see. In this review article, we argue that simulation accounts of emotion recognition would benefit from three key extensions. First, that fuller consideration be given to simulation of bodily and vocal expressions, given that the body and voice are also important expressive channels for providing cues to another's emotional state. Second, that simulation of other aspects of the perceived emotional state, such as changes in the autonomic nervous system and viscera, might have a more prominent role in underpinning emotion recognition than is typically proposed. Sensorimotor simulation models tend to relegate such body-state simulation to a subsidiary role, despite the plausibility of body-state simulation being able to underpin emotion recognition in the absence of typical sensorimotor simulation. Third, that simulation models of emotion recognition be extended to address how embodied processes and emotion recognition abilities develop through the lifespan. It is not currently clear how this system of sensorimotor and body-state simulation develops and in particular how this affects the development of emotion recognition ability. We review recent findings from the emotional body recognition literature and integrate recent evidence regarding the development of mimicry and interoception to significantly expand simulation models of emotion recognition.
Affiliation(s)
- Paddy Ross: Department of Psychology, Durham University, Durham, United Kingdom
20. Ogren M, Kaplan B, Peng Y, Johnson KL, Johnson SP. Motion or emotion: Infants discriminate emotional biological motion based on low-level visual information. Infant Behav Dev 2019; 57:101324. [PMID: 31112859] [PMCID: PMC6859203] [DOI: 10.1016/j.infbeh.2019.04.006]
Abstract
Infants' ability to discriminate emotional facial expressions and tones of voice is well-established, yet little is known about infant discrimination of emotional body movements. Here, we asked if 10-20-month-old infants rely on high-level emotional cues or low-level motion related cues when discriminating between emotional point-light displays (PLDs). In Study 1, infants viewed 18 pairs of angry, happy, sad, or neutral PLDs. Infants looked more at angry vs. neutral, happy vs. neutral, and neutral vs. sad. Motion analyses revealed that infants preferred the PLD with more total body movement in each pairing. Study 2, in which infants viewed inverted versions of the same pairings, yielded similar findings except for sad-neutral. Study 3 directly paired all three emotional stimuli in both orientations. The angry and happy stimuli did not significantly differ in terms of total motion, but both had more motion than the sad stimuli. Infants looked more at angry vs. sad, more at happy vs. sad, and about equally to angry vs. happy in both orientations. Again, therefore, infants preferred PLDs with more total body movement. Overall, the results indicate that a low-level motion preference may drive infants' discrimination of emotional human walking motions.
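One common way to quantify the "total body movement" of a point-light display, as compared across stimuli in the motion analyses above, is to sum the frame-to-frame displacement of every point. This is an assumed operationalization for illustration, not the study's code.

```python
# Hedged sketch: total motion of a point-light display (PLD) as the
# summed Euclidean displacement of all points across consecutive frames.
import math

def total_motion(frames):
    """frames: list of frames; each frame is a list of (x, y) point
    coordinates, with the same point order in every frame."""
    total = 0.0
    for prev, cur in zip(frames, frames[1:]):
        for (x0, y0), (x1, y1) in zip(prev, cur):
            total += math.hypot(x1 - x0, y1 - y0)
    return total

# Toy display: point 0 moves 1 unit per step, point 1 stays still,
# so the total motion over three frames is 2.0.
walk = [[(0, 0), (5, 5)], [(1, 0), (5, 5)], [(2, 0), (5, 5)]]
print(total_motion(walk))  # 2.0
```

Ranking stimuli by such a scalar makes it possible to test, as the authors did, whether looking preferences track low-level motion rather than emotion category.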
Affiliation(s)
- Marissa Ogren: Department of Psychology, University of California, Los Angeles, United States
- Brianna Kaplan: Department of Psychology, New York University, United States
- Yujia Peng: Department of Psychology, University of California, Los Angeles, United States
- Kerri L Johnson: Department of Psychology, University of California, Los Angeles, United States
- Scott P Johnson: Department of Psychology, University of California, Los Angeles, United States
21. Zhang M, Liu T, Jin Y, He W, Huang Y, Luo W. The asynchronous influence of facial expressions on bodily expressions. Acta Psychol (Amst) 2019; 200:102941. [PMID: 31677428] [DOI: 10.1016/j.actpsy.2019.102941]
Abstract
The ability to extract correct emotional information from facial and bodily expressions is fundamental for the development of social skills. Previous studies have shown that bodily expressions affect the recognition of basic facial expressions dramatically. However, few studies have considered the view that facial expressions may influence the recognition of bodily expressions. Further, previous studies have failed to consider a comprehensive set of emotional categories. The present study sought to examine whether facial expressions would impact the recognition of bodily expressions asynchronously, using four basic emotions. Participants performed an affective priming task, in which the priming stimuli included four facial expressions (happy, sad, fearful, and angry), and the target stimuli were bodily expressions matching the same emotions. The results indicated that the perception of affective facial expressions significantly influenced the accuracy and reaction time for body-based emotion categorization, particularly for bodily expression of happiness. The recognition accuracy of congruent expressions was higher, relative to that of incongruent expressions. The findings show that facial expressions influence the recognition of bodily expressions, despite the asynchrony.
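In an affective priming design like this one, the key statistic is the congruency effect: the difference in mean accuracy (or reaction time) between trials where prime and target share an emotion and trials where they do not. A minimal sketch, with field names chosen for illustration rather than taken from the study:

```python
# Illustrative congruency-effect computation for a priming task.
# Each trial records the prime emotion, target emotion, and a measure
# such as accuracy ('correct': 0/1) or reaction time ('rt', ms).
def congruency_effect(trials, key):
    """Return mean(congruent) - mean(incongruent) for the given measure."""
    cong = [t[key] for t in trials if t["prime"] == t["target"]]
    incong = [t[key] for t in trials if t["prime"] != t["target"]]
    return sum(cong) / len(cong) - sum(incong) / len(incong)

trials = [
    {"prime": "happy", "target": "happy", "correct": 1},
    {"prime": "happy", "target": "happy", "correct": 1},
    {"prime": "angry", "target": "happy", "correct": 0},
    {"prime": "sad", "target": "happy", "correct": 1},
]
print(congruency_effect(trials, "correct"))  # 0.5
```

A positive accuracy effect (or a negative RT effect) corresponds to the congruency advantage the abstract reports.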
22. Shamay-Tsoory SG, Mendelsohn A. Real-Life Neuroscience: An Ecological Approach to Brain and Behavior Research. Perspect Psychol Sci 2019; 14:841-859. [DOI: 10.1177/1745691619856350]
Abstract
Owing to advances in neuroimaging technology, the past couple of decades have witnessed a surge of research on brain mechanisms that underlie human cognition. Despite the immense development in cognitive neuroscience, the vast majority of neuroimaging experiments examine isolated agents carrying out artificial tasks in sensory and socially deprived environments. Thus, the understanding of the mechanisms of various domains in cognitive neuroscience, including social cognition and episodic memory, is sorely lacking. Here we focus on social and memory research as representatives of cognitive functions and propose that mainstream, lab-based experimental designs in these fields suffer from two fundamental limitations, pertaining to person-dependent and situation-dependent factors. The person-dependent factor addresses the issue of limiting the active role of the participants in lab-based paradigms that may interfere with their sense of agency and embodiment. The situation-dependent factor addresses the issue of the artificial decontextualized environment in most available paradigms. Building on recent findings showing that real-life as opposed to controlled experimental paradigms involve different mechanisms, we argue that adopting a real-life approach may radically change our understanding of brain and behavior. Therefore, we advocate in favor of a paradigm shift toward a nonreductionist approach, exploiting portable technology in semicontrolled environments, to explore behavior in real life.
Affiliation(s)
- Simone G. Shamay-Tsoory: Department of Psychology, University of Haifa; The Integrated Brain and Behavior Research Center (IBBR), University of Haifa
- Avi Mendelsohn: The Integrated Brain and Behavior Research Center (IBBR), University of Haifa; Department of Neurobiology, University of Haifa; Institute of Information Processing and Decision Making, University of Haifa
23. Oldershaw A, Startup H, Lavender T. Anorexia Nervosa and a Lost Emotional Self: A Psychological Formulation of the Development, Maintenance, and Treatment of Anorexia Nervosa. Front Psychol 2019; 10:219. [PMID: 30886593] [PMCID: PMC6410927] [DOI: 10.3389/fpsyg.2019.00219]
Abstract
In this paper, we argue that Anorexia Nervosa (AN) can be explained as arising from a 'lost sense of emotional self.' We begin by briefly reviewing evidence accumulated to date supporting the consensus that a complex range of genetic, biological, psychological, and socio-environmental risk and maintenance factors contribute to the development and maintenance of AN. We consider how current interventions seek to tackle these factors in psychotherapy and potential limitations. We then propose our theory that many risk and maintenance factors may be unified by an underpinning explanation of emotional processing difficulties leading to a lost sense of 'emotional self.' Further, we discuss how, once established, AN becomes 'self-perpetuating' and the 'lost sense of emotional self' relentlessly deepens. We outline these arguments in detail, drawing on empirical and neuroscientific data, before discussing the implications of this model for understanding AN and informing clinical intervention. We argue that experiential models of therapy (e.g., emotion-focused therapy; schema therapy) be employed to achieve emergence and integration of an 'emotional self' which can be flexibly and adaptively used to direct an individual's needs and relationships. Furthermore, we assert that this should be a primary goal of therapy for adults with established AN.
Affiliation(s)
- Anna Oldershaw: Salmons Centre for Applied Psychology, Canterbury Christ Church University, Canterbury, United Kingdom; Kent and Medway All Age Eating Disorder Service, North East London NHS Foundation Trust, London, United Kingdom
- Helen Startup: Sussex Eating Disorders Service and Research and Development Department, Sussex Partnership NHS Foundation Trust, Sussex, United Kingdom
- Tony Lavender: Salmons Centre for Applied Psychology, Canterbury Christ Church University, Canterbury, United Kingdom
24. How to build a helpful baby: a look at the roots of prosociality in infancy. Curr Opin Psychol 2018; 20:21-24. [DOI: 10.1016/j.copsyc.2017.08.007]
25. Heck A, Chroust A, White H, Jubran R, Bhatt RS. Development of body emotion perception in infancy: From discrimination to recognition. Infant Behav Dev 2017; 50:42-51. [PMID: 29131968] [DOI: 10.1016/j.infbeh.2017.10.007]
Abstract
Research suggests that infants progress from discrimination to recognition of emotions in faces during the first half year of life. It is unknown whether the perception of emotions from bodies develops in a similar manner. In the current study, when presented with happy and angry body videos and voices, 5-month-olds looked longer at the matching video when they were presented upright but not when they were inverted. In contrast, 3.5-month-olds failed to match even with upright videos. Thus, 5-month-olds but not 3.5-month-olds exhibited evidence of recognition of emotions from bodies by demonstrating intermodal matching. In a subsequent experiment, younger infants did discriminate between body emotion videos but failed to exhibit an inversion effect, suggesting that discrimination may be based on low-level stimulus features. These results document a developmental change from discrimination based on non-emotional information at 3.5 months to recognition of body emotions at 5 months. This pattern of development is similar to face emotion knowledge development and suggests that both the face and body emotion perception systems develop rapidly during the first half year of life.
26. The inherently contextualized nature of facial emotion perception. Curr Opin Psychol 2017; 17:47-54. [DOI: 10.1016/j.copsyc.2017.06.006]
27. Hock A, Oberst L, Jubran R, White H, Heck A, Bhatt RS. Integrated Emotion Processing in Infancy: Matching of Faces and Bodies. Infancy 2017; 22:608-625. [PMID: 29623007] [DOI: 10.1111/infa.12177]
Abstract
Accurate assessment of emotion requires the coordination of information from different sources such as faces, bodies, and voices. Adults readily integrate facial and bodily emotions. However, not much is known about the developmental origin of this capacity. Using a familiarization paired-comparison procedure, 6.5-month-olds in the current experiments were familiarized to happy, angry, or sad emotions in faces or bodies and tested with the opposite image type portraying the familiar emotion paired with a novel emotion. Infants looked longer at the familiar emotion across faces and bodies (except when familiarized to angry images and tested on the happy/angry contrast). This matching occurred not only for emotions from different affective categories (happy, angry) but also within the negative affective category (angry, sad). Thus, 6.5-month-olds, like adults, integrate emotions from bodies and faces in a fairly sophisticated manner, suggesting rapid development of emotion processing early in life.
28. Grossmann T, Jessen S. When in infancy does the “fear bias” develop? J Exp Child Psychol 2017; 153:149-154. [DOI: 10.1016/j.jecp.2016.06.018]