1. Becker C, Conduit R, Chouinard PA, Laycock R. Can deepfakes be used to study emotion perception? A comparison of dynamic face stimuli. Behav Res Methods 2024. PMID: 38834812; DOI: 10.3758/s13428-024-02443-y.
Abstract
Video recordings accurately capture facial expression movements; however, they are difficult for face perception researchers to standardise and manipulate. For this reason, dynamic morphs of photographs are often used, despite their lack of naturalistic facial motion. This study aimed to investigate how humans perceive emotions from faces using real videos and two different approaches to artificially generating dynamic expressions: dynamic morphs and AI-synthesised deepfakes. Our participants perceived dynamic morphed expressions as less intense when compared with videos (all emotions) and deepfakes (fearful, happy, sad). Videos and deepfakes were perceived similarly. Additionally, they perceived morphed happiness and sadness, but not morphed anger or fear, as less genuine than other formats. Our findings support previous research indicating that social responses to morphed emotions are not representative of those to video recordings. The findings also suggest that deepfakes may offer a more suitable standardised stimulus type compared with morphs. Additionally, qualitative data were collected from participants and analysed using ChatGPT, a large language model. ChatGPT successfully identified themes in the data consistent with those identified by an independent human researcher. According to this analysis, our participants perceived dynamic morphs as less natural compared with videos and deepfakes. That participants perceived deepfakes and videos similarly suggests that deepfakes effectively replicate natural facial movements, making them a promising alternative for face perception research. The study contributes to the growing body of research exploring the usefulness of generative artificial intelligence for advancing the study of human perception.
2.
Abstract
Social cognition (SC) comprises an array of cognitive and affective abilities such as social perception, theory of mind, empathy, and social behavior. Previous studies have suggested the existence of deficits in several SC abilities in Parkinson disease (PD), although not unanimously. OBJECTIVE The aim of this study is to assess the SC construct and to explore its relationship with cognitive state in PD patients. METHOD We compare 19 PD patients with cognitive decline, 27 cognitively preserved PD patients, and 29 healthy control (HC) individuals in social perception (static and dynamic emotional facial recognition), theory of mind, empathy, and social behavior tasks. We also assess processing speed, executive functions, memory, language, and visuospatial ability. RESULTS PD patients with cognitive decline perform worse than the other groups in both facial expression recognition tasks and theory of mind. Cognitively preserved PD patients only score worse than HCs in the static facial expression recognition task. We find several significant correlations between each of the SC deficits and diverse cognitive processes. CONCLUSIONS The results indicate that some components of SC are impaired in PD patients. These problems seem to be related to a global cognitive decline rather than to specific deficits. Considering the importance of these abilities for social interaction, we suggest that SC be included in the assessment protocols in PD.
3. Valence-Dependent Coupling of Prefrontal-Amygdala Effective Connectivity during Facial Affect Processing. eNeuro 2019;6:ENEURO.0079-19.2019. PMID: 31289107; PMCID: PMC6658918; DOI: 10.1523/eneuro.0079-19.2019.
Abstract
Despite the importance of the prefrontal-amygdala (AMY) network for emotion processing, valence-dependent coupling within this network remains elusive. In this study, we assessed the effect of emotional valence on brain activity and effective connectivity. We tested which functional pathways within the prefrontal-AMY network are specifically engaged during the processing of emotional valence. Thirty-three healthy adults were examined with functional magnetic resonance imaging while performing a dynamic faces and dynamic shapes matching task. The valence of the facial expressions varied systematically between positive, negative, and neutral across the task. Functional contrasts determined core areas of the emotion processing circuitry, comprising the medial prefrontal cortex (MPFC), the right lateral prefrontal cortex (LPFC), the AMY, and the right fusiform face area (FFA). Dynamic causal modelling demonstrated that the bidirectional coupling within the prefrontal-AMY circuitry is modulated by emotional valence. Additionally, Bayesian model averaging showed significant bottom-up connectivity from the AMY to the MPFC during negative and neutral, but not positive, valence. Thus, our study provides strong evidence for alterations of bottom-up coupling within the prefrontal-AMY network as a function of emotional valence. Thereby our results not only advance the understanding of the human prefrontal-AMY circuitry in varying valence context, but, moreover, provide a model to examine mechanisms of valence-sensitive emotional dysregulation in neuropsychiatric disorders.
4. Children with facial paralysis due to Moebius syndrome exhibit reduced autonomic modulation during emotion processing. J Neurodev Disord 2019;11:12. PMID: 31291910; PMCID: PMC6617955; DOI: 10.1186/s11689-019-9272-2.
Abstract
BACKGROUND Facial mimicry is crucial in the recognition of others' emotional state. Thus, the observation of others' facial expressions activates the same neural representation of that affective state in the observer, along with related autonomic and somatic responses. What happens, therefore, when someone cannot mimic others' facial expressions? METHODS We investigated whether psychophysiological emotional responses to others' facial expressions were impaired in 13 children (9 years) with Moebius syndrome (MBS), an extremely rare neurological disorder (1/250,000 live births) characterized by congenital facial paralysis. We inspected autonomic responses and vagal regulation through facial cutaneous thermal variations and by the computation of respiratory sinus arrhythmia (RSA). These parameters provide measures of emotional arousal and show the autonomic adaptation to others' social cues. Physiological responses in children with MBS were recorded during dynamic facial expression observation and were compared to those of a control group (16 non-affected children, 9 years). RESULTS There were significant group effects on thermal patterns and RSA, with lower values in children with MBS. We also observed a mild deficit in emotion recognition in these patients. CONCLUSION Results support "embodied" theory, whereby the congenital inability to produce facial expressions induces alterations in the processing of facial expression of emotions. Such alterations may constitute a risk for emotion dysregulation.
5. Sato W, Kochiyama T, Uono S, Sawada R, Kubota Y, Yoshimura S, Toichi M. Widespread and lateralized social brain activity for processing dynamic facial expressions. Hum Brain Mapp 2019;40:3753-3768. PMID: 31090126; DOI: 10.1002/hbm.24629.
Abstract
Dynamic facial expressions of emotions constitute natural and powerful means of social communication in daily life. A number of previous neuroimaging studies have explored the neural mechanisms underlying the processing of dynamic facial expressions, and indicated the activation of certain social brain regions (e.g., the amygdala) during such tasks. However, the activated brain regions were inconsistent across studies, and their laterality was rarely evaluated. To investigate these issues, we measured brain activity using functional magnetic resonance imaging in a relatively large sample (n = 51) during the observation of dynamic facial expressions of anger and happiness and their corresponding dynamic mosaic images. The observation of dynamic facial expressions, compared with dynamic mosaics, elicited stronger activity in the bilateral posterior cortices, including the inferior occipital gyri, fusiform gyri, and superior temporal sulci. The dynamic facial expressions also activated bilateral limbic regions, including the amygdalae and ventromedial prefrontal cortices, more strongly versus mosaics. In the same manner, activation was found in the right inferior frontal gyrus (IFG) and left cerebellum. Laterality analyses comparing original and flipped images revealed right hemispheric dominance in the superior temporal sulcus and IFG and left hemispheric dominance in the cerebellum. These results indicated that the neural mechanisms underlying processing of dynamic facial expressions include widespread social brain regions associated with perceptual, emotional, and motor functions, and include a clearly lateralized (right cortical and left cerebellar) network like that involved in language processing.
Affiliation(s)
- Wataru Sato: Kokoro Research Center, Kyoto University, Kyoto, Japan
- Shota Uono: Department of Neurodevelopmental Psychiatry, Habilitation and Rehabilitation, Kyoto University, Kyoto, Japan
- Reiko Sawada: Department of Neurodevelopmental Psychiatry, Habilitation and Rehabilitation, Kyoto University, Kyoto, Japan
- Yasutaka Kubota: Health and Medical Services Center, Shiga University, Hikone, Shiga, Japan
- Sayaka Yoshimura: Department of Neurodevelopmental Psychiatry, Habilitation and Rehabilitation, Kyoto University, Kyoto, Japan
- Motomi Toichi: Faculty of Human Health Science, Kyoto University, Kyoto, Japan; The Organization for Promoting Neurodevelopmental Disorder Research, Kyoto, Japan
6. Darke H, Cropper SJ, Carter O. A Novel Dynamic Morphed Stimuli Set to Assess Sensitivity to Identity and Emotion Attributes in Faces. Front Psychol 2019;10:757. PMID: 31024397; PMCID: PMC6465610; DOI: 10.3389/fpsyg.2019.00757.
Abstract
Face-based tasks are used ubiquitously in the study of human perception and cognition. Video-based (dynamic) face stimuli are increasingly utilized by researchers because they have higher ecological validity than static images. However, there are few ready-to-use dynamic stimulus sets currently available to researchers that include non-emotional and non-face control stimuli. This paper outlines the development of three original dynamic stimulus sets: a set of emotional faces (fear and disgust), a set of non-emotional faces, and a set of car animations. Morphing software was employed to vary the intensity of the expression shown and to vary the similarity between actors. Manipulating these dimensions permits us to create tasks of varying difficulty that can be optimized to detect more subtle differences in face-processing ability. Using these new stimuli, two preliminary experiments were conducted to evaluate different aspects of facial identity recognition, emotion recognition, and non-face object discrimination. Results suggest that these five different tasks successfully avoided floor and ceiling effects in a healthy sample. The second experiment found that dynamic versions of the emotional stimuli were recognized more accurately than static versions, for both labeling and discrimination paradigms. This indicates that, like previous emotion-only stimulus sets, the use of dynamic stimuli confers an advantage over image-based stimuli. These stimuli therefore provide a useful resource for researchers looking to investigate both emotional and non-emotional face-processing using dynamic stimuli. Moreover, these stimuli vary across crucial dimensions (i.e., face similarity and intensity of emotion), which allows researchers to modify task difficulty as required.
Affiliation(s)
- Hayley Darke: Melbourne School of Psychological Sciences, The University of Melbourne, Melbourne, VIC, Australia
- Simon J Cropper: Melbourne School of Psychological Sciences, The University of Melbourne, Melbourne, VIC, Australia
- Olivia Carter: Melbourne School of Psychological Sciences, The University of Melbourne, Melbourne, VIC, Australia
7. Rymarczyk K, Żurawski Ł, Jankowiak-Siuda K, Szatkowska I. Empathy in Facial Mimicry of Fear and Disgust: Simultaneous EMG-fMRI Recordings During Observation of Static and Dynamic Facial Expressions. Front Psychol 2019;10:701. PMID: 30971997; PMCID: PMC6445885; DOI: 10.3389/fpsyg.2019.00701.
Abstract
Real-life faces are dynamic by nature, particularly when expressing emotion. Increasing evidence suggests that the perception of dynamic displays enhances facial mimicry and induces activation in widespread brain structures considered to be part of the mirror neuron system, a neuronal network linked to empathy. The present study is the first to investigate the relations among facial muscle responses, brain activity, and empathy traits while participants observed static and dynamic (video) facial expressions of fear and disgust. During display presentation, the blood-oxygen-level-dependent (BOLD) signal as well as muscle reactions of the corrugator supercilii and levator labii were recorded simultaneously from 46 healthy individuals (21 females). Both fear and disgust faces elicited activity in the corrugator supercilii muscle, while perception of disgust additionally produced activity in the levator labii muscle, supporting a specific pattern of facial mimicry for these emotions. Moreover, individuals with higher empathy traits showed greater activity in the corrugator supercilii and levator labii muscles than individuals with lower empathy traits; however, these responses did not differ between static and dynamic modes. Conversely, neuroimaging data revealed activation of motion- and emotion-related brain structures in response to dynamic rather than static stimuli among high-empathy individuals. In line with this, there was a correlation between electromyography (EMG) responses and brain activity, suggesting that the mirror neuron system, the anterior insula, and the amygdala might constitute the neural correlates of automatic facial mimicry for fear and disgust. These results reveal that the dynamic property of (emotional) stimuli facilitates the emotion-related processing of facial expressions, especially among those with high trait empathy.
Affiliation(s)
- Krystyna Rymarczyk: Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, SWPS University of Social Sciences and Humanities, Warsaw, Poland
- Łukasz Żurawski: Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology, Polish Academy of Sciences (PAS), Warsaw, Poland
- Kamila Jankowiak-Siuda: Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, SWPS University of Social Sciences and Humanities, Warsaw, Poland
- Iwona Szatkowska: Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology, Polish Academy of Sciences (PAS), Warsaw, Poland
8. Horta M, Ziaei M, Lin T, Porges EC, Fischer H, Feifel D, Spreng RN, Ebner NC. Oxytocin alters patterns of brain activity and amygdalar connectivity by age during dynamic facial emotion identification. Neurobiol Aging 2019;78:42-51. PMID: 30870779; DOI: 10.1016/j.neurobiolaging.2019.01.016.
Abstract
Aging is associated with increased difficulty in facial emotion identification, possibly due to age-related network change. The neuropeptide oxytocin (OT) facilitates emotion identification, but this is understudied in aging. To determine the effects of OT on dynamic facial emotion identification across adulthood, 46 young and 48 older participants self-administered intranasal OT or a placebo in a randomized, double-blind procedure. Older participants were slower and less accurate in identifying emotions. Although there was no behavioral treatment effect, partial least squares analysis supported treatment effects on brain patterns during emotion identification that varied by age and emotion. For young participants, OT altered the processing of sadness and happiness, whereas for older participants, OT only affected the processing of sadness (15.3% covariance, p = 0.004). Furthermore, seed partial least squares analysis showed that older participants in the OT group recruited a large-scale amygdalar network that was positively correlated for anger, fear, and happiness, whereas older participants in the placebo group recruited a smaller, negatively correlated network (7% covariance, p = 0.002). Advancing the literature, these findings show that OT alters brain activity and amygdalar connectivity by age and emotion.
Affiliation(s)
- Marilyn Horta: Department of Psychology, University of Florida, Gainesville, FL, USA
- Maryam Ziaei: Centre for Advanced Imaging, University of Queensland, Brisbane, Australia
- Tian Lin: Department of Psychology, University of Florida, Gainesville, FL, USA
- Eric C Porges: Department of Clinical and Health Psychology, Center for Cognitive Aging and Memory, University of Florida, Gainesville, FL, USA
- Håkan Fischer: Department of Psychology, Stockholm University, Stockholm, Sweden
- David Feifel: Department of Psychiatry, University of California, San Diego, CA, USA
- R Nathan Spreng: Department of Neurology and Neurosurgery, Montreal Neurological Institute, McGill University, Montreal, Quebec, Canada; Departments of Psychology and Psychiatry, McGill University, Montreal, Quebec, Canada
- Natalie C Ebner: Department of Psychology, University of Florida, Gainesville, FL, USA; Department of Clinical and Health Psychology, Center for Cognitive Aging and Memory, University of Florida, Gainesville, FL, USA; Department of Aging and Geriatric Research, Institute on Aging, University of Florida, Gainesville, FL, USA
9. Shared facial emotion processing functional network findings in medication-naïve major depressive disorder and healthy individuals: detection by sICA. BMC Psychiatry 2018;18:96. PMID: 29636031; PMCID: PMC5891939; DOI: 10.1186/s12888-018-1631-0.
Abstract
BACKGROUND The fundamental mechanism underlying emotional processing in major depressive disorder (MDD) remains unclear. To better understand the neural correlates of emotional processing in MDD, we investigated the role of multiple functional networks (FNs) during emotional stimuli processing. METHODS Thirty-two medication-naïve subjects with MDD and 36 healthy controls (HCs) underwent an emotional faces fMRI task that included neutral, happy and fearful expressions. Spatial independent component analysis (sICA) and general linear model (GLM) analyses were conducted to examine the main effects of task condition and group, and the two-way interactions of group and task condition. RESULTS In the sICA analysis, MDD patients and HCs together showed significant differences in task-related modulations in five FNs across task conditions. One FN, mainly involving the ventral medial prefrontal cortex, showed lower activation during the fearful relative to the happy condition. Two FNs, mainly involving the bilateral inferior frontal gyrus and temporal cortex, showed the opposite modulation relative to the ventral medial prefrontal cortex FN, i.e., greater activation during the fearful relative to the happy condition. The two remaining FNs, involving the fronto-parietal and occipital cortices, showed reduced activation during both fearful and happy conditions relative to the neutral condition. However, MDD and HCs did not show significant differences in expression-related modulations in any FNs in this sample. CONCLUSIONS sICA revealed functional activation patterns that differed from those of typical GLM-based analyses. The sICA findings demonstrated unique FNs involved in processing happy and fearful facial expressions. Potential differences between MDD and HCs in expression-related FN modulation should be investigated further.
10. Argaud S, Vérin M, Sauleau P, Grandjean D. Facial emotion recognition in Parkinson's disease: A review and new hypotheses. Mov Disord 2018;33:554-567. PMID: 29473661; PMCID: PMC5900878; DOI: 10.1002/mds.27305.
Abstract
Parkinson's disease is a neurodegenerative disorder classically characterized by motor symptoms. Among them, hypomimia affects facial expressiveness and social communication and has a highly negative impact on patients' and relatives' quality of life. Patients also frequently experience nonmotor symptoms, including emotional-processing impairments, leading to difficulty in recognizing emotions from faces. Aside from its theoretical importance, understanding the disruption of facial emotion recognition in PD is crucial for improving quality of life for both patients and caregivers, as this impairment is associated with heightened interpersonal difficulties. However, studies assessing abilities in recognizing facial emotions in PD still report contradictory outcomes. The origins of this inconsistency are unclear, and several questions (regarding the role of dopamine replacement therapy or the possible consequences of hypomimia) remain unanswered. We therefore undertook a fresh review of relevant articles focusing on facial emotion recognition in PD to deepen current understanding of this nonmotor feature, exploring multiple significant potential confounding factors, both clinical and methodological, and discussing probable pathophysiological mechanisms. This led us to examine recent proposals about the role of basal ganglia-based circuits in emotion and to consider the involvement of facial mimicry in this deficit from the perspective of embodied simulation theory. We believe our findings will inform clinical practice and increase fundamental knowledge, particularly in relation to potential embodied emotion impairment in PD. © 2018 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of International Parkinson and Movement Disorder Society.
Affiliation(s)
- Soizic Argaud: Behavior and Basal Ganglia Research Unit (EA4712), University of Rennes 1, Rennes, France; Neuroscience of Emotion and Affective Dynamics Laboratory, Department of Psychology and Educational Sciences, University of Geneva, Geneva, Switzerland
- Marc Vérin: Behavior and Basal Ganglia Research Unit (EA4712), University of Rennes 1, Rennes, France; Department of Neurology, Rennes University Hospital, Rennes, France
- Paul Sauleau: Behavior and Basal Ganglia Research Unit (EA4712), University of Rennes 1, Rennes, France; Department of Neurophysiology, Rennes University Hospital, Rennes, France
- Didier Grandjean: Neuroscience of Emotion and Affective Dynamics Laboratory, Department of Psychology and Educational Sciences, University of Geneva, Geneva, Switzerland; Swiss Center for Affective Sciences, Campus Biotech, Geneva, Switzerland
11. Rymarczyk K, Żurawski Ł, Jankowiak-Siuda K, Szatkowska I. Neural Correlates of Facial Mimicry: Simultaneous Measurements of EMG and BOLD Responses during Perception of Dynamic Compared to Static Facial Expressions. Front Psychol 2018;9:52. PMID: 29467691; PMCID: PMC5807922; DOI: 10.3389/fpsyg.2018.00052.
Abstract
Facial mimicry (FM) is an automatic response to imitate the facial expressions of others. However, the neural correlates of the phenomenon are not yet well established. We investigated this issue using simultaneously recorded EMG and BOLD signals during perception of dynamic and static emotional facial expressions of happiness and anger. During display presentations, BOLD signals and zygomaticus major (ZM), corrugator supercilii (CS), and orbicularis oculi (OO) EMG responses were recorded simultaneously from 46 healthy individuals. Subjects reacted spontaneously to happy facial expressions with increased EMG activity in ZM and OO muscles and decreased CS activity, which was interpreted as FM. Facial muscle responses correlated with BOLD activity in regions associated with motor simulation of facial expressions [i.e., the inferior frontal gyrus, part of the classical mirror neuron system (MNS)]. We also found correlations for regions associated with emotional processing (i.e., the insula, part of the extended MNS). It is concluded that FM involves both motor and emotional brain structures, especially during perception of natural emotional expressions.
Affiliation(s)
- Krystyna Rymarczyk: Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, SWPS University of Social Sciences and Humanities, Warsaw, Poland
- Łukasz Żurawski: Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland
- Kamila Jankowiak-Siuda: Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, SWPS University of Social Sciences and Humanities, Warsaw, Poland
- Iwona Szatkowska: Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland
12. Sinko K, Jagsch R, Drog C, Mosgoeller W, Wutzl A, Millesi G, Klug C. Facial esthetics and the assignment of personality traits before and after orthognathic surgery rated on video clips. PLoS One 2018;13:e0191718. PMID: 29390018; PMCID: PMC5794088; DOI: 10.1371/journal.pone.0191718.
Abstract
Typically, faces are assessed before and after surgical correction on still images by surgeons, orthodontists, the patients, and family members. We hypothesized that judgment of faces in motion, and by naïve raters, may more closely reflect the impact on patients' real lives and the treatment impact on, for example, career chances. We therefore assessed faces of dysgnathic patients (Class II, III, and Laterognathia) on video clips. Class I faces served as anchor and controls. Each patient's face was assessed twice, before and after treatment, in changing sequence by 155 naïve raters of similar age to the patients. The raters provided independent estimates on aesthetic trait pairs like ugly/beautiful and personality trait pairs like dominant/flexible. Furthermore, the perception of attractiveness, intelligence, health, the person's erotic aura, faithfulness, and five additional items were rated. We estimated the significance of the perceived treatment-related differences and the respective effect sizes with general linear models for repeated measures. The obtained results were comparable to our previous ratings on still images. There was an overall trend for faces in video clips to be rated along common stereotypes to a lesser extent than photographs. We observed significant class differences and treatment-related changes in most aesthetic traits (e.g., beauty, attractiveness); these were comparable for intelligence, erotic aura, and to some extent healthy appearance. While some personality traits (e.g., faithfulness) did not differ between the classes or between baseline and after treatment, the intervention significantly and effectively altered the perception of the personality trait self-confidence. The effect size was highest in Class III patients, smallest in Class II patients, and intermediate for patients with Laterognathia. All dysgnathic patients benefited from orthognathic surgery. We conclude that motion can mitigate marked stereotypes but does not entirely offset the mostly negative perception of dysgnathic faces.
Affiliation(s)
- Klaus Sinko: Clinic for Cranio-Maxillofacial and Oral Surgery, Medical University Vienna, Vienna, Austria
- Reinhold Jagsch: Clinical Psychology and Health Psychology, Department of Psychology, University of Vienna, Vienna, Austria
- Claudio Drog: University Clinic of Dentistry, Medical University Vienna, Vienna, Austria
- Wilhelm Mosgoeller: Institute of Cancer Research, Medical University Vienna, Vienna, Austria
- Arno Wutzl: Clinic for Cranio-Maxillofacial and Oral Surgery, Medical University Vienna, Vienna, Austria
- Gabriele Millesi: Clinic for Cranio-Maxillofacial and Oral Surgery, Medical University Vienna, Vienna, Austria
- Clemens Klug: Clinic for Cranio-Maxillofacial and Oral Surgery, Medical University Vienna, Vienna, Austria
13. Heller J, Mirzazade S, Romanzetti S, Habel U, Derntl B, Freitag NM, Schulz JB, Dogan I, Reetz K. Impact of gender and genetics on emotion processing in Parkinson's disease - A multimodal study. Neuroimage Clin 2018;18:305-314. PMID: 29876251; PMCID: PMC5987844; DOI: 10.1016/j.nicl.2018.01.034.
Abstract
A better understanding of the phenotypic heterogeneity of Parkinson's disease is needed. Gender and genetics shape the manifestation and progression of Parkinson's disease. Altered emotion processing in Parkinson's disease is specific to male patients and is influenced by endocrine and genetic factors in both genders. These findings may affect the diagnosis and treatment of emerging clinical features.
Key Words
- BAI, Beck anxiety inventory
- BDI-II, Beck depression inventory version II
- BFRT, Benton facial recognition test
- BOLD, blood‑oxygen-level dependent
- COMT, catechol-O-methyltransferase
- EPI, echo planar imaging
- Emotion
- Functional magnetic resonance imaging (fMRI)
- GM, gray matter
- Gender
- Genetics
- H&Y, Hoehn and Yahr rating scale
- HC, healthy controls
- LEDD, levodopa equivalence daily dose
- MCI, mild cognitive impairment
- MMSE, Mini-Mental State Examination
- MRI, magnetic resonance imaging
- MoCA, Montreal Cognitive Assessment
- NMS, non-motor symptoms
- PD, Parkinson's disease
- Parkinson's disease (PD)
- UPDRS, Unified Parkinson's disease rating scale
- VBM, voxel-based morphometry
- fMRI, functional magnetic resonance imaging
Affiliation(s)
- Julia Heller
- Department of Neurology, RWTH Aachen University, Pauwelsstraße 30, Aachen, Germany; JARA-BRAIN Institute Molecular Neuroscience and Neuroimaging, Forschungszentrum Jülich GmbH and RWTH Aachen University, Germany
- Shahram Mirzazade
- Department of Neurology, RWTH Aachen University, Pauwelsstraße 30, Aachen, Germany; JARA-BRAIN Institute Molecular Neuroscience and Neuroimaging, Forschungszentrum Jülich GmbH and RWTH Aachen University, Germany
- Sandro Romanzetti
- Department of Neurology, RWTH Aachen University, Pauwelsstraße 30, Aachen, Germany; JARA-BRAIN Institute Molecular Neuroscience and Neuroimaging, Forschungszentrum Jülich GmbH and RWTH Aachen University, Germany
- Ute Habel
- Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen University, Pauwelsstraße 30, Aachen, Germany; JARA-BRAIN Institute Brain Structure-Function Relationships: Decoding the Human Brain at Systemic Levels, Forschungszentrum Jülich GmbH and RWTH Aachen University, Germany
- Birgit Derntl
- Department of Psychiatry and Psychotherapy, University of Tübingen, Osianderstraße 24, Tübingen, Germany
- Nils M Freitag
- II. Institute of Physics B and JARA-FIT, RWTH Aachen University, Otto-Blumenthal-Straße, Aachen, Germany
- Jörg B Schulz
- Department of Neurology, RWTH Aachen University, Pauwelsstraße 30, Aachen, Germany; JARA-BRAIN Institute Molecular Neuroscience and Neuroimaging, Forschungszentrum Jülich GmbH and RWTH Aachen University, Germany
- Imis Dogan
- Department of Neurology, RWTH Aachen University, Pauwelsstraße 30, Aachen, Germany; JARA-BRAIN Institute Molecular Neuroscience and Neuroimaging, Forschungszentrum Jülich GmbH and RWTH Aachen University, Germany
- Kathrin Reetz
- Department of Neurology, RWTH Aachen University, Pauwelsstraße 30, Aachen, Germany; JARA-BRAIN Institute Molecular Neuroscience and Neuroimaging, Forschungszentrum Jülich GmbH and RWTH Aachen University, Germany.
14
Rymarczyk K, Żurawski Ł, Jankowiak-Siuda K, Szatkowska I. Emotional Empathy and Facial Mimicry for Static and Dynamic Facial Expressions of Fear and Disgust. Front Psychol 2016; 7:1853. [PMID: 27933022 PMCID: PMC5120108 DOI: 10.3389/fpsyg.2016.01853] [Citation(s) in RCA: 42] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/03/2016] [Accepted: 11/09/2016] [Indexed: 11/13/2022] Open
Abstract
Facial mimicry is the tendency to imitate the emotional facial expressions of others. Increasing evidence suggests that the perception of dynamic displays leads to enhanced facial mimicry, especially for happiness and anger. However, little is known about the impact of dynamic stimuli on facial mimicry for fear and disgust. To investigate this issue, facial EMG responses were recorded from the corrugator supercilii, levator labii, and lateral frontalis muscles while participants viewed static (photographs) and dynamic (videos) emotional facial expressions. Moreover, we tested whether emotional empathy modulated facial mimicry of emotional facial expressions. In accordance with our predictions, the high-empathy group responded with larger activity in the corrugator supercilii and levator labii muscles. Moreover, dynamic compared with static facial expressions of fear revealed enhanced mimicry in the high-empathy group in the frontalis and corrugator supercilii muscles. In the low-empathy group, facial reactions did not differentiate between fear and disgust for either dynamic or static facial expressions. We conclude that highly empathic subjects are more sensitive in their facial reactions to expressions of fear and disgust than their low-empathy counterparts. Our data confirm that personal characteristics (i.e., trait empathy), as well as the modality of the presented stimuli, modulate the strength of facial mimicry. In addition, EMG activity of the levator labii and frontalis muscles may be a useful index of empathic responses to fear and disgust.
Affiliation(s)
- Krystyna Rymarczyk
- Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland; Department of Experimental Psychology, Faculty of Psychology, Institute of Cognitive and Behavioural Neuroscience, University of Social Sciences and Humanities, Warsaw, Poland
- Łukasz Żurawski
- Department of Experimental Psychology, Faculty of Psychology, Institute of Cognitive and Behavioural Neuroscience, University of Social Sciences and Humanities, Warsaw, Poland
- Kamila Jankowiak-Siuda
- Department of Experimental Psychology, Faculty of Psychology, Institute of Cognitive and Behavioural Neuroscience, University of Social Sciences and Humanities, Warsaw, Poland
- Iwona Szatkowska
- Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland
15
Rymarczyk K, Żurawski Ł, Jankowiak-Siuda K, Szatkowska I. Do Dynamic Compared to Static Facial Expressions of Happiness and Anger Reveal Enhanced Facial Mimicry? PLoS One 2016; 11:e0158534. [PMID: 27390867 PMCID: PMC4938565 DOI: 10.1371/journal.pone.0158534] [Citation(s) in RCA: 42] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/25/2015] [Accepted: 06/17/2016] [Indexed: 11/18/2022] Open
Abstract
Facial mimicry is the spontaneous mirroring or matching of an interaction partner's emotional facial expressions. Recent evidence suggests that mimicry may not be a purely automatic reaction but may depend on many factors, including social context, the type of task in which the participant is engaged, and stimulus properties (dynamic vs. static presentation). In the present study, we investigated the impact of dynamic facial expressions and sex differences on facial mimicry and judgments of emotional intensity. Electromyographic activity was recorded from the corrugator supercilii, zygomaticus major, and orbicularis oculi muscles during passive observation of static and dynamic displays of happiness and anger. Ratings of the emotional intensity of the facial expressions were also analysed. As predicted, dynamic expressions were rated as more intense than static ones. Compared with static images, dynamic displays of happiness also evoked stronger activity in the zygomaticus major and orbicularis oculi, suggesting that subjects experienced positive emotion. No muscles showed mimicry activity in response to angry faces. Moreover, we found that women exhibited greater zygomaticus major activity in response to dynamic than to static happiness stimuli. Our data support the hypothesis that people mimic positive emotions and confirm the importance of dynamic stimuli in some emotional processing.
Affiliation(s)
- Krystyna Rymarczyk
- Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland; Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, University of Social Sciences and Humanities, Warsaw, Poland
- Łukasz Żurawski
- Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland
- Kamila Jankowiak-Siuda
- Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, University of Social Sciences and Humanities, Warsaw, Poland
- Iwona Szatkowska
- Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland
16
Reinl M, Bartels A. Perception of temporal asymmetries in dynamic facial expressions. Front Psychol 2015; 6:1107. [PMID: 26300807 PMCID: PMC4523710 DOI: 10.3389/fpsyg.2015.01107] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/25/2015] [Accepted: 07/20/2015] [Indexed: 11/13/2022] Open
Abstract
In the current study we examined whether timeline reversals and the emotional direction of dynamic facial expressions affect the subjective experience of human observers. We recorded natural movies of faces that increased or decreased their expressions of fear and played them either in the natural frame order or reversed from last to first frame (reversed timeline). This led to four conditions of increasing or decreasing fear, following either the natural or the reversed temporal trajectory of facial dynamics. This 2-by-2 factorial design controlled for low-level visual properties, static visual content, and motion energy across the factors. It allowed us to examine the perceptual consequences that arise if the timeline trajectory of facial muscle movements during the increase of an emotion is not the exact mirror of the timeline during its decrease. It additionally allowed us to study perceptual differences between increasing and decreasing emotional expressions. The perception of these time-dependent asymmetries had not previously been quantified. We found that three emotional measures (emotional intensity, artificialness of facial movement, and convincingness or plausibility of the emotion portrayal) were affected by timeline reversals as well as by the emotional direction of the facial expressions. Our results imply that natural dynamic facial expressions contain temporal asymmetries, and show that deviations from the natural timeline reduce perceived emotional intensity and convincingness and increase the perceived artificialness of the dynamic facial expression. In addition, they show that decreasing facial expressions are judged as less plausible than increasing ones. Our findings are relevant for both behavioral and neuroimaging studies, as processing and perception are influenced by temporal asymmetries.
Affiliation(s)
- Andreas Bartels
- Vision and Cognition Lab, Centre for Integrative Neuroscience, University of Tübingen, Tübingen, Germany
17
Missana M, Rajhans P, Atkinson AP, Grossmann T. Discrimination of fearful and happy body postures in 8-month-old infants: an event-related potential study. Front Hum Neurosci 2014; 8:531. [PMID: 25104929 PMCID: PMC4109437 DOI: 10.3389/fnhum.2014.00531] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/04/2014] [Accepted: 06/30/2014] [Indexed: 01/07/2023] Open
Abstract
Responding to others' emotional body expressions is an essential social skill in humans. Adults readily detect emotions from body postures, but it is unclear whether infants are sensitive to emotional body postures. We examined 8-month-old infants' brain responses to emotional body postures by measuring event-related potentials (ERPs) to happy and fearful bodies. Our results revealed two emotion-sensitive ERP components: body postures evoked an early N290 at occipital electrodes and a later Nc at fronto-central electrodes that were enhanced in response to fearful (relative to happy) expressions. These findings demonstrate that: (a) 8-month-old infants discriminate between static emotional body postures; and (b) similar to infant emotional face perception, the sensitivity to emotional body postures is reflected in early perceptual (N290) and later attentional (Nc) neural processes. This provides evidence for an early developmental emergence of the neural processes involved in the discrimination of emotional body postures.
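For readers unfamiliar with the ERP measures above: an event-related potential is obtained by averaging many stimulus-locked EEG epochs, so that activity not time-locked to the stimulus averages out, and a component such as the N290 is then summarised as the mean amplitude in a time window. The sketch below uses synthetic data with an assumed sampling rate and window; it illustrates the averaging logic only, not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 250                       # sampling rate in Hz (assumed)
n_trials, n_samples = 60, 200  # 200 samples at 250 Hz = 800 ms epochs

# synthetic single-trial epochs: noise plus a negative deflection near 290 ms
t = np.arange(n_samples) / fs
component = -2.0 * np.exp(-((t - 0.29) ** 2) / (2 * 0.03 ** 2))
epochs = rng.normal(0.0, 5.0, (n_trials, n_samples)) + component

# ERP = average across trials; component amplitude = mean in a 250-330 ms window
erp = epochs.mean(axis=0)
win = (t >= 0.25) & (t <= 0.33)
n290_amplitude = erp[win].mean()
print(n290_amplitude)
```

Averaging across 60 trials shrinks the noise by a factor of roughly the square root of the trial count, which is why the negative deflection becomes measurable in the ERP even though it is invisible in single trials.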
Affiliation(s)
- Manuela Missana
- Early Social Development Group, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Purva Rajhans
- Early Social Development Group, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Tobias Grossmann
- Early Social Development Group, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
18
Kaufman J, Johnston PJ. Facial motion engages predictive visual mechanisms. PLoS One 2014; 9:e91038. [PMID: 24632821 PMCID: PMC3954613 DOI: 10.1371/journal.pone.0091038] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2013] [Accepted: 02/10/2014] [Indexed: 11/18/2022] Open
Abstract
We employed a novel cuing paradigm to assess whether dynamically versus statically presented facial expressions differentially engage predictive visual mechanisms. Participants were presented with a cueing stimulus that was either a static depiction of a low-intensity expressed emotion or a dynamic sequence evolving from a neutral expression to the same low-intensity emotion. Following this cue and a backwards mask, participants were presented with a probe face that displayed either the same emotion as the cue (congruent) or a different emotion (incongruent), in both cases expressed at high intensity. The probe face had either the same identity as, or a different identity from, the cued face. The participants' task was to indicate whether or not the probe face showed the same emotion as the cue. Dynamic cues and same-identity cues both led to a greater tendency towards congruent responding, although these factors did not interact. Facial motion also led to faster responding when the probe face was emotionally congruent with the cue. We interpret these results as indicating that dynamic facial displays preferentially invoke predictive visual mechanisms, and suggest that motoric simulation may provide an important basis for the generation of predictions in the visual system.
Affiliation(s)
- Jordy Kaufman
- Swinburne University of Technology, Hawthorn, Victoria, Australia
19
Stoesz BM, Jakobson LS. Developmental changes in attention to faces and bodies in static and dynamic scenes. Front Psychol 2014; 5:193. [PMID: 24639664 PMCID: PMC3944146 DOI: 10.3389/fpsyg.2014.00193] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/20/2013] [Accepted: 02/18/2014] [Indexed: 11/13/2022] Open
Abstract
Typically developing individuals show a strong visual preference for faces and face-like stimuli; however, this may come at the expense of attending to bodies or to other aspects of a scene. The primary goal of the present study was to provide additional insight into the development of the attentional mechanisms that underlie the perception of real people in naturalistic scenes. We examined the looking behaviors of typical children, adolescents, and young adults as they viewed static and dynamic scenes depicting one or more people. Overall, participants showed a bias to attend to faces more than to other parts of the scenes. Adding motion cues led to a reduction in the number, but an increase in the average duration, of face fixations in single-character scenes. When multiple characters appeared in a scene, motion-related effects were attenuated and participants shifted their gaze from faces to bodies, or made off-screen glances. Children showed the largest effects of introducing motion cues or additional characters, suggesting that they find dynamic faces difficult to process and are especially prone to look away from faces when viewing complex social scenes, a strategy that could reduce the cognitive and affective load imposed by having to divide one's attention between multiple faces. Our findings provide new insights into the typical development of social attention during natural scene viewing, and lay the foundation for future work examining gaze behaviors in typical and atypical development.
Affiliation(s)
- Brenda M. Stoesz
- Department of Psychology, University of Manitoba, Winnipeg, MB, Canada
- Lorna S. Jakobson
- Department of Psychology, University of Manitoba, Winnipeg, MB, Canada
20
Metzger CD, van der Werf YD, Walter M. Functional mapping of thalamic nuclei and their integration into cortico-striatal-thalamo-cortical loops via ultra-high resolution imaging-from animal anatomy to in vivo imaging in humans. Front Neurosci 2013; 7:24. [PMID: 23658535 PMCID: PMC3647142 DOI: 10.3389/fnins.2013.00024] [Citation(s) in RCA: 53] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/01/2012] [Accepted: 03/15/2013] [Indexed: 02/05/2023] Open
Abstract
The thalamus, a crucial node in the well-described cortico-striatal-thalamo-cortical circuits, has been the focus of functional and structural imaging studies investigating human emotion, cognition, and memory. Invasive work in animals and post-mortem investigations have revealed the rich cytoarchitectonics and functional specificity of the thalamus. Given current restrictions in the spatial resolution of non-invasive imaging modalities, however, there is a translational gap between functional and structural information on these circuits in humans and animals, as well as between histological and cellular evidence and its relationship to psychological functioning. With the advance of higher field strengths for MR approaches, better spatial resolution is now available, promising to overcome this conceptual problem. Here we review these two levels, which exist for both neuroscientific and clinical investigations, and then focus on current attempts to overcome the conceptual boundaries of these observations with the help of ultra-high resolution imaging.
Affiliation(s)
- Coraline D Metzger
- Clinical Affective Neuroimaging Laboratory, Department of Psychiatry and Psychotherapy, Center for Behavioral Brain Sciences, Otto-von-Guericke University Magdeburg, Germany; Department of Behavioral Neurology, Leibniz Institute for Neurobiology, Magdeburg, Germany
21
Neural correlates of interindividual differences in children's audiovisual speech perception. J Neurosci 2011; 31:13963-71. [PMID: 21957257 DOI: 10.1523/jneurosci.2605-11.2011] [Citation(s) in RCA: 51] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/15/2023] Open
Abstract
Children use information from both the auditory and visual modalities to aid in understanding speech. A dramatic illustration of this multisensory integration is the McGurk effect, an illusion in which an auditory syllable is perceived differently when it is paired with an incongruent mouth movement. However, there are significant interindividual differences in McGurk perception: some children never perceive the illusion, while others always do. Because converging evidence suggests that the posterior superior temporal sulcus (STS) is a critical site for multisensory integration, we hypothesized that activity within the STS would predict susceptibility to the McGurk effect. To test this idea, we used BOLD fMRI in 17 children aged 6-12 years to measure brain responses to the following three audiovisual stimulus categories: McGurk incongruent, non-McGurk incongruent, and congruent syllables. Two separate analysis approaches, one using independent functional localizers and another using whole-brain voxel-based regression, showed differences in the left STS between perceivers and nonperceivers. The STS of McGurk perceivers responded significantly more than that of nonperceivers to McGurk syllables, but not to other stimuli, and perceivers' hemodynamic responses in the STS were significantly prolonged. In addition to the STS, weaker differences between perceivers and nonperceivers were observed in the fusiform face area and extrastriate visual cortex. These results suggest that the STS is an important source of interindividual variability in children's audiovisual speech perception.