1. Mohapatra AN, Jabarin R, Ray N, Netser S, Wagner S. Impaired emotion recognition in Cntnap2-deficient mice is associated with hyper-synchronous prefrontal cortex neuronal activity. Mol Psychiatry 2024. PMID: 39289476; DOI: 10.1038/s41380-024-02754-8.
Abstract
Individuals diagnosed with autism spectrum disorder (ASD) show difficulty in recognizing emotions in others, a process termed emotion recognition. While human fMRI studies linked multiple brain areas to emotion recognition, the specific mechanisms underlying impaired emotion recognition in ASD are not clear. Here, we employed an emotional state preference (ESP) task to show that Cntnap2-knockout (KO) mice, an established ASD model, do not distinguish between conspecifics according to their emotional state. We assessed brain-wide local-field potential (LFP) signals during various social behavior tasks and found that Cntnap2-KO mice exhibited higher LFP theta and gamma rhythmicity than did C57BL/6J mice, even at rest. Specifically, Cntnap2-KO mice showed increased theta coherence, especially between the prelimbic cortex (PrL) and the hypothalamic paraventricular nucleus, during social behavior. Moreover, we observed significantly increased Granger causality of theta rhythmicity between these two brain areas, across several types of social behavior tasks. Finally, optogenetic stimulation of PrL pyramidal neurons in C57BL/6J mice impaired their social discrimination abilities, including in ESP. Together, these results suggest that increased rhythmicity of PrL pyramidal neuronal activity and its hyper-synchronization with specific brain regions are involved in the impaired emotion recognition exhibited by Cntnap2-KO mice.
Affiliation(s)
- Alok Nath Mohapatra: Sagol Department of Neurobiology, Faculty of Natural Sciences, University of Haifa, Haifa, Israel
- Renad Jabarin: Sagol Department of Neurobiology, Faculty of Natural Sciences, University of Haifa, Haifa, Israel
- Natali Ray: Sagol Department of Neurobiology, Faculty of Natural Sciences, University of Haifa, Haifa, Israel
- Shai Netser: Sagol Department of Neurobiology, Faculty of Natural Sciences, University of Haifa, Haifa, Israel
- Shlomo Wagner: Sagol Department of Neurobiology, Faculty of Natural Sciences, University of Haifa, Haifa, Israel
2. Morningstar M, Hughes C, French RC, Grannis C, Mattson WI, Nelson EE. Functional connectivity during facial and vocal emotion recognition: Preliminary evidence for dissociations in developmental change by nonverbal modality. Neuropsychologia 2024; 202:108946. PMID: 38945440; DOI: 10.1016/j.neuropsychologia.2024.108946.
Abstract
The developmental trajectory of emotion recognition (ER) skills is thought to vary by nonverbal modality, with vocal ER maturing later than facial ER. To investigate potential neural mechanisms contributing to this dissociation at a behavioural level, the current study examined whether youth's neural functional connectivity during vocal and facial ER tasks showed differential developmental change across time. Youth ages 8-19 (n = 41) completed facial and vocal ER tasks while undergoing functional magnetic resonance imaging, at two timepoints (1 year apart; n = 36 for behavioural data, n = 28 for neural data). Partial least squares analyses revealed that functional connectivity during ER is distinguishable both by modality (with different patterns of connectivity for facial vs. vocal ER) and across time, with changes in connectivity being particularly pronounced for vocal ER. ER accuracy was greater for faces than voices, and positively associated with age; although task performance did not change appreciably across the 1-year period, changes in latent functional connectivity patterns across time predicted participants' ER accuracy at Time 2. Taken together, these results suggest that vocal and facial ER are supported by distinguishable neural correlates that may undergo different developmental trajectories. Our findings are also preliminary evidence that changes in network integration may support the development of ER skills in childhood and adolescence.
Affiliation(s)
- M Morningstar: Department of Psychology, Queen's University, Canada; Centre for Neuroscience Studies, Queen's University, Canada
- C Hughes: Department of Neurology and Neurosurgery, Montreal Neurological Institute, McGill University, Canada
- R C French: Center for Biobehavioral Health, Abigail Wexner Research Institute, Nationwide Children's Hospital, Columbus, OH, USA; Department of Psychological and Brain Sciences, Indiana University, Bloomington, USA
- C Grannis: Center for Biobehavioral Health, Abigail Wexner Research Institute, Nationwide Children's Hospital, Columbus, OH, USA
- W I Mattson: Center for Biobehavioral Health, Abigail Wexner Research Institute, Nationwide Children's Hospital, Columbus, OH, USA
- E E Nelson: Center for Biobehavioral Health, Abigail Wexner Research Institute, Nationwide Children's Hospital, Columbus, OH, USA; Department of Pediatrics, Ohio State University Wexner College of Medicine, Columbus, OH, USA
3. Del Vecchio M, Avanzini P, Gerbella M, Costa S, Zauli FM, d'Orio P, Focacci E, Sartori I, Caruana F. Anatomo-functional basis of emotional and motor resonance elicited by facial expressions. Brain 2024; 147:3018-3031. PMID: 38365267; DOI: 10.1093/brain/awae050.
Abstract
Simulation theories predict that the observation of others' expressions modulates neural activity in the same centres controlling their production. This hypothesis has been developed by two models, postulating that the visual input is directly projected either to the motor system for action recognition (motor resonance) or to emotional/interoceptive regions for emotional contagion and social synchronization (emotional resonance). Here we investigated the role of frontal/insular regions in the processing of observed emotional expressions by combining intracranial recording, electrical stimulation and effective connectivity. First, we intracranially recorded from prefrontal, premotor or anterior insular regions of 44 patients during the passive observation of emotional expressions, finding widespread modulations in prefrontal/insular regions (anterior cingulate cortex, anterior insula, orbitofrontal cortex and inferior frontal gyrus) and motor territories (Rolandic operculum and inferior frontal junction). Subsequently, we electrically stimulated the activated sites, finding that (i) in the anterior cingulate cortex and anterior insula, the stimulation elicited emotional/interoceptive responses, as predicted by the 'emotional resonance' model; (ii) in the Rolandic operculum it evoked face/mouth sensorimotor responses, in line with the 'motor resonance' model; and (iii) all other regions were unresponsive or revealed functions unrelated to the processing of facial expressions. Finally, we traced the effective connectivity to sketch a network-level description of these regions, finding that the anterior cingulate cortex and the anterior insula are reciprocally interconnected, while the Rolandic operculum is part of the parieto-frontal circuits and poorly connected with the former. These results support the hypothesis that the pathways hypothesized by the 'emotional resonance' and the 'motor resonance' models work in parallel, differing in terms of spatio-temporal fingerprints, reactivity to electrical stimulation and connectivity patterns.
Affiliation(s)
- Maria Del Vecchio: Institute of Neuroscience, National Research Council of Italy (CNR), 43125 Parma, Italy
- Pietro Avanzini: Institute of Neuroscience, National Research Council of Italy (CNR), 43125 Parma, Italy
- Marzio Gerbella: Department of Medicine and Surgery, University of Parma, 43125 Parma, Italy
- Sara Costa: Department of Medicine and Surgery, University of Parma, 43125 Parma, Italy
- Flavia Maria Zauli: 'Claudio Munari' Epilepsy Surgery Center, ASST GOM Niguarda, 20142 Milan, Italy
- Piergiorgio d'Orio: 'Claudio Munari' Epilepsy Surgery Center, ASST GOM Niguarda, 20142 Milan, Italy
- Elena Focacci: Department of Medicine and Surgery, University of Parma, 43125 Parma, Italy
- Ivana Sartori: 'Claudio Munari' Epilepsy Surgery Center, ASST GOM Niguarda, 20142 Milan, Italy
- Fausto Caruana: Institute of Neuroscience, National Research Council of Italy (CNR), 43125 Parma, Italy
4. Wu X, Xu K, Li T, Wang L, Fu Y, Ma Z, Wu X, Wang Y, Chen F, Song J, Song Y, Lv Y. Abnormal intrinsic functional hubs and connectivity in patients with post-stroke depression. Ann Clin Transl Neurol 2024; 11:1852-1867. PMID: 38775214; PMCID: PMC11251479; DOI: 10.1002/acn3.52091.
Abstract
OBJECTIVE The present study aimed to investigate the specific alterations of brain networks in patients with post-stroke depression (PSD) and to help elucidate the brain mechanisms underlying PSD, providing supporting evidence for early diagnosis of and intervention in the disease. METHODS Resting-state functional magnetic resonance imaging data were acquired from 82 nondepressed stroke patients (Stroke), 39 PSD patients, and 74 healthy controls (HC). Voxel-wise degree centrality (DC) combined with seed-based functional connectivity (FC) analyses were performed to investigate PSD-related connectivity alterations. The relationship between these alterations and depression severity was further examined in PSD patients. RESULTS Relative to both the Stroke and HC groups: (1) PSD patients showed increased centrality in regions within the default mode network (DMN), including the contralesional angular gyrus (ANG), posterior cingulate cortex (PCC), and hippocampus (HIP); DC values in the contralesional ANG positively correlated with Patient Health Questionnaire-9 (PHQ-9) scores in the PSD group. (2) PSD patients exhibited increased connectivity between these three seeds showing altered DC and regions within the DMN (bilateral medial prefrontal cortex and middle temporal gyrus, and ipsilesional superior parietal gyrus) as well as regions outside the DMN (bilateral calcarine cortex, ipsilesional inferior occipital gyrus, and contralesional lingual gyrus), along with decreased connectivity between the contralesional ANG and the contralesional supramarginal gyrus. Moreover, these FC alterations predicted PHQ-9 scores in the PSD group. INTERPRETATION These findings highlight that PSD is associated with increased functional connectivity strength in areas within the DMN, which might be attributable to specific alterations of connectivity between regions within and outside the DMN.
Affiliation(s)
- Xiumei Wu: Center for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, Zhejiang, China
- Kang Xu: Center for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, Zhejiang, China
- Tongyue Li: Center for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, Zhejiang, China
- Luoyu Wang: School of Biomedical Engineering, ShanghaiTech University, Shanghai, China
- Yanhui Fu: Department of Neurology, Anshan Changda Hospital, Anshan, Liaoning, China
- Zhenqiang Ma: Department of Neurology, Anshan Changda Hospital, Anshan, Liaoning, China
- Xiaoyan Wu: Department of Image, Anshan Changda Hospital, Anshan, Liaoning, China
- Yiying Wang: Department of Ultrasonics, Anshan Changda Hospital, Anshan, Liaoning, China
- Fenyang Chen: The Fourth Clinical Medical College, Zhejiang Chinese Medical University, Hangzhou, Zhejiang, China
- Jinyi Song: III Department of Clinic Medicine, Zhejiang University, Hangzhou, Zhejiang, China
- Yulin Song: Department of Neurology, Anshan Changda Hospital, Anshan, Liaoning, China
- Yating Lv: Center for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, Zhejiang, China
5. Laukka P, Månsson KNT, Cortes DS, Manzouri A, Frick A, Fredborg W, Fischer H. Neural correlates of individual differences in multimodal emotion recognition ability. Cortex 2024; 175:1-11. PMID: 38691922; DOI: 10.1016/j.cortex.2024.03.009.
Abstract
Studies have reported substantial variability in emotion recognition ability (ERA), an important social skill, but possible neural underpinnings for such individual differences are not well understood. This functional magnetic resonance imaging (fMRI) study investigated neural responses during emotion recognition in young adults (N = 49) who were selected for inclusion based on their performance (high or low) during previous testing of ERA. Participants were asked to judge brief video recordings in a forced-choice emotion recognition task, wherein stimuli were presented in visual, auditory and multimodal (audiovisual) blocks. Emotion recognition rates during brain scanning confirmed that individuals with high (vs low) ERA achieved higher accuracy in all presentation blocks. fMRI analyses focused on key regions of interest (ROIs) involved in the processing of multimodal emotion expressions, based on previous meta-analyses. In neural response to emotional stimuli contrasted with neutral stimuli, individuals with high (vs low) ERA showed higher activation in the following ROIs during the multimodal condition: right middle superior temporal gyrus (mSTG), right posterior superior temporal sulcus (PSTS), and right inferior frontal cortex (IFC). Overall, results suggest that individual variability in ERA may be reflected across several stages of decisional processing, including extraction (mSTG), integration (PSTS) and evaluation (IFC) of emotional information.
Affiliation(s)
- Petri Laukka: Department of Psychology, Stockholm University, Stockholm, Sweden; Department of Psychology, Uppsala University, Uppsala, Sweden
- Kristoffer N T Månsson: Centre for Psychiatry Research, Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden; Department of Clinical Psychology and Psychotherapy, Babeș-Bolyai University, Cluj-Napoca, Romania
- Diana S Cortes: Department of Psychology, Stockholm University, Stockholm, Sweden
- Amirhossein Manzouri: Department of Psychology, Stockholm University, Stockholm, Sweden; Centre for Psychiatry Research, Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Andreas Frick: Department of Medical Sciences, Psychiatry, Uppsala University, Uppsala, Sweden
- William Fredborg: Department of Psychology, Stockholm University, Stockholm, Sweden
- Håkan Fischer: Department of Psychology, Stockholm University, Stockholm, Sweden; Stockholm University Brain Imaging Centre (SUBIC), Stockholm University, Stockholm, Sweden; Aging Research Center, Department of Neurobiology, Care Sciences and Society, Karolinska Institutet and Stockholm University, Stockholm, Sweden
6. Ikeda S. Interoceptive sensitivity and perception of others' emotions: an investigation based on a two-stage model. Cogn Process 2024; 25:229-239. PMID: 38383909; DOI: 10.1007/s10339-024-01176-2.
Abstract
Recent research shows that interoceptive sensitivity is associated with a more granular experience of emotions. These studies suggest that individuals who are sensitive to their interoceptive signals can better perceive somatic physiological changes than their counterparts, and can therefore discriminate among a wide and subtle range of emotions. Further, the perception of others' emotions may be based on our own emotional experiences. However, whether interoceptive sensitivity is related to the perception of others' emotions remains unclear. This study therefore examined the relationship between interoceptive sensitivity and emotion perception. Based on a two-stage model in which emotion perception comprises the categorization of facial expressions and approach-avoidance responses, this study examined both processes. The results showed no relationship between interoceptive sensitivity and the perception of emotion, which suggests that interoceptive sensitivity is related to the experience of emotion but does not affect the granularity of emotion perception. Future studies should empirically examine, from multiple perspectives, the role of the body in emotion perception through the lens of interoceptive sensitivity.
Affiliation(s)
- Shinnosuke Ikeda: Human and Social Administration Department, Kanazawa University, Kakuma-machi, Kanazawa, Ishikawa Prefecture, 920-1192, Japan
7. Landin-Romero R, Kumfor F, Ys Lee A, Leyton C, Piguet O. Clinical and cortical trajectories in non-fluent primary progressive aphasia and Alzheimer's disease: A role for emotion processing. Brain Res 2024; 1829:148777. PMID: 38286395; DOI: 10.1016/j.brainres.2024.148777.
Abstract
OBJECTIVES To examine the clinical trajectories and neural correlates of cognitive and emotion processing changes in the non-fluent/agrammatic (nfvPPA) and the logopenic (lvPPA) variants of primary progressive aphasia (PPA). DESIGN Observational case-control longitudinal cohort study. SETTING Frontotemporal dementia research clinic. PARTICIPANTS This study recruited 29 non-semantic PPA patients (15 nfvPPA and 14 lvPPA) and compared them with 15 Alzheimer's disease (AD) patients and 14 healthy controls. MEASUREMENTS Participants completed an annual assessment (median = 2 years; range = 1-5 years) of general cognition, emotion processing and structural MRI. Linear mixed effects models investigated clinical and imaging trajectories between groups. RESULTS Over time, lvPPA showed the greatest cognitive deterioration. In contrast, nfvPPA showed significant decline in emotion recognition, whereas AD showed preserved emotion recognition, even with disease progression. Importantly, lvPPA also developed emotion processing impairments with disease progression. Both nfvPPA and lvPPA showed continuing cortical atrophy in the hallmark language-processing regions associated with these syndromes, together with progressive involvement of right hemisphere regions, mirroring left hemisphere atrophy patterns at presentation. Decline in emotion processing was associated with bilateral frontal atrophy in nfvPPA and right temporal atrophy in lvPPA. CONCLUSIONS Our results show divergent clinical courses in nfvPPA and lvPPA, with rapid cognitive and neural deterioration in lvPPA and emotion processing decline in both groups. These findings support the concurrent assessment of cognition and emotion processing in the clinic to inform diagnosis and monitoring in the non-semantic variants of PPA.
Affiliation(s)
- Ramon Landin-Romero: Sydney School of Health Sciences & Brain and Mind Centre, The University of Sydney, Australia; ARC Centre of Excellence in Cognition and its Disorders, Australia
- Fiona Kumfor: School of Psychology & Brain and Mind Centre, The University of Sydney, Australia; ARC Centre of Excellence in Cognition and its Disorders, Australia
- Austin Ys Lee: ARC Centre of Excellence in Cognition and its Disorders, Australia
- Cristian Leyton: School of Psychology & Brain and Mind Centre, The University of Sydney, Australia; ARC Centre of Excellence in Cognition and its Disorders, Australia
- Olivier Piguet: School of Psychology & Brain and Mind Centre, The University of Sydney, Australia; ARC Centre of Excellence in Cognition and its Disorders, Australia
8. Borgomaneri S, Vitale F, Battaglia S, de Vega M, Avenanti A. Task-related modulation of motor response to emotional bodies: A TMS motor-evoked potential study. Cortex 2024; 171:235-246. PMID: 38096756; DOI: 10.1016/j.cortex.2023.10.013.
Abstract
Exposure to emotional body postures during perceptual decision-making tasks has been linked to transient suppression of motor reactivity, supporting the monitoring of emotionally relevant information. However, it remains unclear whether this effect occurs implicitly, i.e., when emotional information is irrelevant to the task. To investigate this issue, we used single-pulse transcranial magnetic stimulation (TMS) to assess motor excitability while healthy participants were asked to categorize pictures of body expressions as emotional or neutral (emotion recognition task) or as belonging to a male or a female actor (gender recognition task) while receiving TMS over the motor cortex at 100 and 125 ms after picture onset. Results demonstrated that motor-evoked potentials (MEPs) were reduced for emotional body postures relative to neutral postures during the emotion recognition task. Conversely, MEPs increased for emotional body postures relative to neutral postures during the gender recognition task. These findings indicate that motor inhibition, contingent upon observing emotional body postures, is selectively associated with actively monitoring emotional features. In contrast, observing emotional body postures prompts motor facilitation when task-relevant features are non-emotional. These findings contribute to embodied cognition models that link emotion perception and action tendencies.
Affiliation(s)
- Sara Borgomaneri: Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Campus di Cesena, Cesena, Italy
- Francesca Vitale: Instituto Universitario de Neurociencia (IUNE), Universidad de La Laguna, Santa Cruz de Tenerife, Spain
- Simone Battaglia: Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Campus di Cesena, Cesena, Italy
- Manuel de Vega: Instituto Universitario de Neurociencia (IUNE), Universidad de La Laguna, Santa Cruz de Tenerife, Spain
- Alessio Avenanti: Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Campus di Cesena, Cesena, Italy; Centro de Investigación en Neuropsicología y Neurociencias Cognitivas, Universidad Católica del Maule, Talca, Chile
9. Mortillaro M, Schlegel K. Embracing the Emotion in Emotional Intelligence Measurement: Insights from Emotion Theory and Research. J Intell 2023; 11:210. PMID: 37998709; PMCID: PMC10672494; DOI: 10.3390/jintelligence11110210.
Abstract
Emotional intelligence (EI) has gained significant popularity as a scientific construct over the past three decades, yet its conceptualization and measurement still face limitations. Applied EI research often overlooks its components, treating it as a global characteristic, and there are few widely used performance-based tests for assessing ability EI. The present paper proposes avenues for advancing ability EI measurement by connecting the main EI components to models and theories from the emotion science literature and related fields. For emotion understanding and emotion recognition, we discuss the implications of basic emotion theory, dimensional models, and appraisal models of emotion for creating stimuli, scenarios, and response options. For the regulation and management of one's own and others' emotions, we discuss how the process model of emotion regulation and its extensions to interpersonal processes can inform the creation of situational judgment items. In addition, we emphasize the importance of incorporating context, cross-cultural variability, and attentional and motivational factors into future models and measures of ability EI. We hope this article will foster exchange among scholars in the fields of ability EI, basic emotion science, social cognition, and emotion regulation, leading to an enhanced understanding of the individual differences in successful emotional functioning and communication.
Affiliation(s)
- Marcello Mortillaro: Swiss Center for Affective Sciences, University of Geneva, 1202 Geneva, Switzerland
- Katja Schlegel: Institute of Psychology, University of Bern, 3012 Bern, Switzerland
10. Vaessen M, Van der Heijden K, de Gelder B. Modality-specific brain representations during automatic processing of face, voice and body expressions. Front Neurosci 2023; 17:1132088. PMID: 37869514; PMCID: PMC10587395; DOI: 10.3389/fnins.2023.1132088.
Abstract
A central question in affective science, and one that is relevant for its clinical applications, is how emotions conveyed by different stimuli are experienced and represented in the brain. According to the traditional view, emotional signals are recognized with the help of emotion concepts that are typically used in descriptions of mental states and emotional experiences, irrespective of the sensory modality. This perspective motivated the search for abstract representations of emotions in the brain, shared across variations in stimulus type (face, body, voice) and sensory origin (visual, auditory). On the other hand, emotion signals, such as an aggressive gesture, trigger rapid automatic behavioral responses, and this may take place before, or independently of, a full abstract representation of the emotion. This argues in favor of specific emotion signals that may trigger rapid adaptive behavior by mobilizing modality- and stimulus-specific brain representations, without relying on higher-order abstract emotion categories. To test this hypothesis, we presented participants with naturalistic dynamic emotion expressions of the face, the whole body, or the voice in a functional magnetic resonance imaging (fMRI) study. To focus on automatic emotion processing and sidestep explicit concept-based emotion recognition, participants performed an unrelated target-detection task presented in a different sensory modality than the stimulus. Using multivariate analyses to assess neural activity patterns in response to the different stimulus types, we reveal a stimulus-category- and modality-specific brain organization of affective signals. Our findings are consistent with the notion that, under ecological conditions, emotion expressions of the face, body and voice may have different functional roles in triggering rapid adaptive behavior, even if, when viewed from an abstract conceptual vantage point, they may all exemplify the same emotion. This has implications for a neuroethologically grounded emotion research program that should start from detailed behavioral observations of how face, body, and voice expressions function in naturalistic contexts.
11. Borgomaneri S, Zanon M, Di Luzio P, Cataneo A, Arcara G, Romei V, Tamietto M, Avenanti A. Increasing associative plasticity in temporo-occipital back-projections improves visual perception of emotions. Nat Commun 2023; 14:5720. PMID: 37737239; PMCID: PMC10517146; DOI: 10.1038/s41467-023-41058-3.
Abstract
The posterior superior temporal sulcus (pSTS) is a critical node in a network specialized for perceiving emotional facial expressions that is reciprocally connected with early visual cortices (V1/V2). Current models of perceptual decision-making increasingly assign relevance to recursive processing for visual recognition. However, it is unknown whether inducing plasticity into reentrant connections from pSTS to V1/V2 impacts emotion perception. Using a combination of electrophysiological and neurostimulation methods, we demonstrate that strengthening the connectivity from pSTS to V1/V2 selectively increases the ability to perceive facial expressions associated with emotions. This behavior is associated with increased electrophysiological activity in both these brain regions, particularly in V1/V2, and depends on specific temporal parameters of stimulation that follow Hebbian principles. Therefore, we provide evidence that pSTS-to-V1/V2 back-projections are instrumental to perception of emotion from facial stimuli and functionally malleable via manipulation of associative plasticity.
Affiliation(s)
- Sara Borgomaneri: Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Cesena Campus, Cesena, Italy
- Marco Zanon: Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Cesena Campus, Cesena, Italy; Neuroscience Area, International School for Advanced Studies (SISSA), Trieste, Italy
- Paolo Di Luzio: Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Cesena Campus, Cesena, Italy
- Antonio Cataneo: Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Cesena Campus, Cesena, Italy
- Vincenzo Romei: Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Cesena Campus, Cesena, Italy; Facultad de Lenguas y Educación, Universidad Antonio de Nebrija, Madrid, 28015, Spain
- Marco Tamietto: Dipartimento di Psicologia, Università degli Studi di Torino, Torino, Italy; Department of Medical and Clinical Psychology, Tilburg University, Tilburg, The Netherlands
- Alessio Avenanti: Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Cesena Campus, Cesena, Italy; Centro de Investigación en Neuropsicología y Neurociencias Cognitivas, Universidad Católica del Maule, Talca, Chile
Collapse
|
12
|
Rocklin ML, Garròn Torres AA, Reeves B, Robinson TN, Ram N. The Affective Dynamics of Everyday Digital Life: Opening Computational Possibility. AFFECTIVE SCIENCE 2023; 4:529-540. [PMID: 37744988 PMCID: PMC10514010 DOI: 10.1007/s42761-023-00202-4] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/17/2023] [Accepted: 07/12/2023] [Indexed: 09/26/2023]
Abstract
Up to now, there has been no way to observe and track the affective impacts of the massive amount of complex visual stimuli that people encounter "in the wild" during their many hours of digital life. In this paper, we propose and illustrate how recent advances in AI (trained ensembles of deep neural networks) can be deployed on new data streams consisting of long sequences of screenshots of study participants' smartphones, obtained unobtrusively during everyday life. We obtained affective valence and arousal ratings of hundreds of images drawn from existing picture repositories often used in psychological studies, and a new screenshot repository chronicling individuals' everyday digital life, from both N = 832 adults and an affect computation model (Parry & Vuong, 2021). Results and analysis suggest that (a) our sample rates images similarly to other samples used in psychological studies, (b) the affect computation model is able to assign valence and arousal ratings similarly to humans, and (c) the resulting computational pipeline can be deployed at scale to obtain detailed maps of the affective space individuals travel through on their smartphones. Leveraging innovative methods for tracking the emotional content individuals encounter on their smartphones, we open the possibility for large-scale studies of how the affective dynamics of everyday digital life shape individuals' moment-to-moment experiences and well-being. Supplementary Information The online version contains supplementary material available at 10.1007/s42761-023-00202-4.
Affiliation(s)
- Maia L. Rocklin
- Department of Psychology, Stanford University, Stanford, CA 94305 USA
- Byron Reeves
- Department of Communication, Stanford University, 300-A Building 120, 450 Jane Stanford Way, Stanford, CA 94305 USA
- Thomas N. Robinson
- Department of Pediatrics, Stanford University, Stanford, CA 94305 USA
- Department of Medicine, Stanford University, Stanford, CA 94305 USA
- Nilam Ram
- Department of Psychology, Stanford University, Stanford, CA 94305 USA
- Department of Communication, Stanford University, 300-A Building 120, 450 Jane Stanford Way, Stanford, CA 94305 USA
13
Straccia MA, Teed AR, Katzman PL, Tan KM, Parrish MH, Irwin MR, Eisenberger NI, Lieberman MD, Tabak BA. Null results of oxytocin and vasopressin administration on mentalizing in a large fMRI sample: evidence from a randomized controlled trial. Psychol Med 2023; 53:2285-2295. [PMID: 37310308 PMCID: PMC10123837 DOI: 10.1017/s0033291721004104] [Citation(s) in RCA: 6] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 12/25/2020] [Revised: 09/07/2021] [Accepted: 09/20/2021] [Indexed: 11/05/2022]
Abstract
BACKGROUND Although potential links between oxytocin (OT), vasopressin (AVP), and social cognition are well-grounded theoretically, most studies have included all-male samples, and few have demonstrated consistent effects of either neuropeptide on mentalizing (i.e. understanding the mental states of others). To understand the potential of either neuropeptide as a pharmacological treatment for individuals with impairments in social cognition, it is important to establish whether OT and AVP benefit mentalizing in healthy individuals. METHODS In the present randomized, double-blind, placebo-controlled study (n = 186) of healthy individuals, we examined the effects of OT and AVP administration on behavioral responses and neural activity in response to a mentalizing task. RESULTS Relative to placebo, neither drug showed an effect on task reaction time or accuracy, nor on whole-brain neural activation or functional connectivity observed within brain networks associated with mentalizing. Exploratory analyses included several variables previously shown to moderate OT's effects on social processes (e.g., self-reported empathy, alexithymia) but resulted in no significant interaction effects. CONCLUSIONS Results add to a growing literature demonstrating that intranasal administration of OT and AVP may have a more limited effect on social cognition, at both the behavioral and neural level, than initially assumed. Randomized controlled trial registrations: ClinicalTrials.gov; NCT02393443; NCT02393456; NCT02394054.
Affiliation(s)
- Mark A. Straccia
- Department of Psychology, University of California, Los Angeles, CA, USA
- Adam R. Teed
- Department of Psychology, Southern Methodist University, Dallas, TX, USA
- Perri L. Katzman
- Department of Psychology, New York University, New York, NY, USA
- Kevin M. Tan
- Department of Psychology, University of California, Los Angeles, CA, USA
- Michael H. Parrish
- Department of Psychology, University of California, Los Angeles, CA, USA
- Michael R. Irwin
- Department of Psychology, University of California, Los Angeles, CA, USA
- Department of Psychiatry and Biobehavioral Sciences, David Geffen School of Medicine, University of California, Los Angeles, CA, USA
- Cousins Center for Psychoneuroimmunology, Jane and Terry Semel Institute for Neuroscience, David Geffen School of Medicine, University of California, Los Angeles, CA, USA
- Matthew D. Lieberman
- Department of Psychology, University of California, Los Angeles, CA, USA
- Department of Psychiatry and Biobehavioral Sciences, David Geffen School of Medicine, University of California, Los Angeles, CA, USA
- Benjamin A. Tabak
- Department of Psychology, Southern Methodist University, Dallas, TX, USA
14
Lee M, Lori A, Langford NA, Rilling JK. The neural basis of smile authenticity judgments and the potential modulatory role of the oxytocin receptor gene (OXTR). Behav Brain Res 2023; 437:114144. [PMID: 36216140 DOI: 10.1016/j.bbr.2022.114144] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/12/2022] [Revised: 09/03/2022] [Accepted: 09/30/2022] [Indexed: 11/13/2022]
Abstract
Accurate perception of genuine vs. posed smiles is crucial for successful social navigation in humans. While people vary in their ability to assess the authenticity of smiles, little is known about the specific biological mechanisms underlying this variation. We investigated the neural substrates of smile authenticity judgments using functional magnetic resonance imaging (fMRI). We also tested a preliminary hypothesis that a common polymorphism in the oxytocin receptor gene (OXTR) rs53576 would modulate the behavioral and neural indices of accurate smile authenticity judgments. A total of 185 healthy adult participants (Neuroimaging arm: N = 44, Behavioral arm: N = 141) determined the authenticity of dynamic facial expressions of genuine and posed smiles either with or without fMRI scanning. Correctly identified genuine vs. posed smiles activated brain areas involved with reward processing, facial mimicry, and mentalizing. Activation within the inferior frontal gyrus and dorsomedial prefrontal cortex correlated with individual differences in sensitivity (d') and response criterion (C), respectively. Our exploratory genetic analysis revealed that rs53576 G homozygotes in the neuroimaging arm had a stronger tendency to judge posed smiles as genuine than did A allele carriers and showed decreased activation in the medial prefrontal cortex when viewing genuine vs. posed smiles. Yet, OXTR rs53576 did not modulate task performance in the behavioral arm, which calls for further studies to evaluate the legitimacy of this result. Our findings extend previous literature on the biological foundations of smile authenticity judgments, particularly emphasizing the involvement of brain regions implicated in reward, facial mimicry, and mentalizing.
Affiliation(s)
- Adriana Lori
- Department of Psychiatry and Behavioral Science, USA
- Nicole A Langford
- Department of Psychiatry and Behavioral Science, USA
- Nell Hodgson Woodruff School of Nursing, USA
- James K Rilling
- Department of Anthropology, USA
- Department of Psychiatry and Behavioral Science, USA
- Center for Behavioral Neuroscience, USA
- Emory National Primate Research Center, USA
- Center for Translational Social Neuroscience, USA
15
Ikeda S. Development of Emotion Recognition from Facial Expressions with Different Eye and Mouth Cues in Japanese People. J Genet Psychol 2023; 184:187-197. [PMID: 36661090 DOI: 10.1080/00221325.2023.2168174] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/21/2023]
Abstract
Research has reported that Japanese people are more likely to focus on, and look longer at, the eyes when reading emotions from facial expressions than their Western counterparts. However, how these tendencies develop and whether there is a relationship between the two tendencies (focusing on the eyes and looking longer at the eyes) is unclear. The present study examined emotion recognition and gaze patterns in Japanese preschool children (n = 51) and university students (n = 57), using facial expressions with different eye and mouth cues. The results showed developmental changes in emotion recognition, with adults being more sensitive to negative emotions, whereas gaze patterns showed no developmental changes. Furthermore, there was no relationship between emotion recognition and gaze patterns. This suggests that the implicit and explicit processing of emotion recognition develops at different times, and that there is no direct relationship between the two processes.
Affiliation(s)
- Shinnosuke Ikeda
- Faculty of Humanities, Kyoto University of Advanced Science, Kyoto, Japan
16
Thomasson M, Ceravolo L, Corradi-Dell’Acqua C, Mantelli A, Saj A, Assal F, Grandjean D, Péron J. Dysfunctional cerebello-cerebral network associated with vocal emotion recognition impairments. Cereb Cortex Commun 2023; 4:tgad002. [PMID: 36726795 PMCID: PMC9883615 DOI: 10.1093/texcom/tgad002] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/21/2022] [Revised: 12/23/2022] [Accepted: 12/29/2022] [Indexed: 01/13/2023] Open
Abstract
Vocal emotion recognition, a key determinant to analyzing a speaker's emotional state, is known to be impaired following cerebellar dysfunctions. Nevertheless, its possible functional integration in the large-scale brain network subtending emotional prosody recognition has yet to be explored. We administered an emotional prosody recognition task to patients with right versus left-hemispheric cerebellar lesions and a group of matched controls. We explored the lesional correlates of vocal emotion recognition in patients through a network-based analysis by combining a neuropsychological approach for lesion mapping with normative brain connectome data. Results revealed impaired recognition among patients for neutral or negative prosody, with poorer sadness recognition performances by patients with right cerebellar lesion. Network-based lesion-symptom mapping revealed that sadness recognition performances were linked to a network connecting the cerebellum with left frontal, temporal, and parietal cortices. Moreover, when focusing solely on a subgroup of patients with right cerebellar damage, sadness recognition performances were associated with a more restricted network connecting the cerebellum to the left parietal lobe. As the left hemisphere is known to be crucial for the processing of short segmental information, these results suggest that a corticocerebellar network operates on a fine temporal scale during vocal emotion decoding.
Affiliation(s)
- Marine Thomasson
- Clinical and Experimental Neuropsychology Laboratory, Department of Psychology, University of Geneva, 40 bd du Pont d’Arve, Geneva 1205, Switzerland
- Neuroscience of Emotion and Affective Dynamics Laboratory, Department of Psychology and Swiss Centre for Affective Sciences, University of Geneva, 40 bd du Pont d’Arve, Geneva 1205, Switzerland
- Cognitive Neurology Unit, Department of Neurology, University Hospitals of Geneva, Rue Gabrielle-Perret-Gentil 4, Geneva 1205, Switzerland
- Leonardo Ceravolo
- Neuroscience of Emotion and Affective Dynamics Laboratory, Department of Psychology and Swiss Centre for Affective Sciences, University of Geneva, 40 bd du Pont d’Arve, Geneva 1205, Switzerland
- Corrado Corradi-Dell’Acqua
- Theory of Pain Laboratory, Department of Psychology, Faculty of Psychology and Educational Sciences (FPSE), University of Geneva, 40 bd du Pont d’Arve, Geneva 1205, Switzerland
- Geneva Neuroscience Centre, University of Geneva, Rue Michel-Servet 1, Geneva 1206, Switzerland
- Amélie Mantelli
- Clinical and Experimental Neuropsychology Laboratory, Department of Psychology, University of Geneva, 40 bd du Pont d’Arve, Geneva 1205, Switzerland
- Arnaud Saj
- Department of Psychology, University of Montreal, 90 avenue Vincent d'Indy, Montréal, QC H2V 2S9, Canada
- Frédéric Assal
- Cognitive Neurology Unit, Department of Neurology, University Hospitals of Geneva, Rue Gabrielle-Perret-Gentil 4, Geneva 1205, Switzerland
- Faculty of Medicine, University of Geneva, Rue Michel-Servet 1, Geneva 1206, Switzerland
- Didier Grandjean
- Neuroscience of Emotion and Affective Dynamics Laboratory, Department of Psychology and Swiss Centre for Affective Sciences, University of Geneva, 40 bd du Pont d’Arve, Geneva 1205, Switzerland
- Julie Péron
- Clinical and Experimental Neuropsychology Laboratory, Faculté de Psychologie et des Sciences de l’Education, Université de Genève, 40 bd du Pont d’Arve, Geneva 1205, Switzerland (corresponding author)
17
Eicher M, Jokeit H. Toward social neuropsychology of epilepsy: a meta-analysis on social cognition in epilepsy phenotypes and a critical narrative review on assessment methods. ACTA EPILEPTOLOGICA 2022. [DOI: 10.1186/s42494-022-00093-1] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022] Open
Abstract
Background
The aim of this review is to (a) characterize social cognition impairments in the domains of emotion recognition (ER) and theory of mind (ToM) in patients with epilepsy and (b) to review assessment tools with a focus on their validity and usability in clinical practice.
Methods
An electronic search for clinical studies investigating social cognition in epilepsy populations vs healthy control subjects (HC) yielded 53 studies for the meta-analysis and descriptive review.
Results
Results suggest that (1) social cognition is significantly impaired in patients with temporal lobe epilepsy (TLE), frontal lobe epilepsy (FLE) and patients with epilepsy not originating within the temporal or frontal lobes including idiopathic generalized epilepsies (eTLE/eFLE); (2) there is no significant difference between eTLE/eFLE and TLE regarding ER, while TLE and FLE patients perform worse than those with eTLE/eFLE, without significant differences between FLE and TLE regarding ToM ability. A descriptive analysis of the most commonly used assessment tools and stimulus material in this field revealed a lack of ecological validity, usability, and economic viability for everyday clinical practice.
Conclusions
Our meta-analysis shows that patients with epilepsy are at a significantly increased risk of deficits in social cognition. However, the underlying multifactorial mechanisms remain unclear. Future research should therefore specifically address the impairment of processing and methodological problems of testing.
18
Morningstar M, Mattson WI, Nelson EE. Longitudinal Change in Neural Response to Vocal Emotion in Adolescence. Soc Cogn Affect Neurosci 2022; 17:890-903. [PMID: 35323933 PMCID: PMC9527472 DOI: 10.1093/scan/nsac021] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/03/2021] [Revised: 02/25/2022] [Accepted: 03/21/2022] [Indexed: 01/09/2023] Open
Abstract
Adolescence is associated with maturation of function within neural networks supporting the processing of social information. Previous longitudinal studies have established developmental influences on youth’s neural response to facial displays of emotion. Given the increasing recognition of the importance of non-facial cues to social communication, we build on existing work by examining longitudinal change in neural response to vocal expressions of emotion in 8- to 19-year-old youth. Participants completed a vocal emotion recognition task at two timepoints (1 year apart) while undergoing functional magnetic resonance imaging. The right inferior frontal gyrus, right dorsal striatum and right precentral gyrus showed decreases in activation to emotional voices across timepoints, which may reflect focalization of response in these areas. Activation in the dorsomedial prefrontal cortex was positively associated with age but was stable across timepoints. In addition, the slope of change across visits varied as a function of participants’ age in the right temporo-parietal junction (TPJ): this pattern of activation across timepoints and age may reflect ongoing specialization of function across childhood and adolescence. Decreased activation in the striatum and TPJ across timepoints was associated with better emotion recognition accuracy. Findings suggest that specialization of function in social cognitive networks may support the growth of vocal emotion recognition skills across adolescence.
Affiliation(s)
- Michele Morningstar
- Department of Psychology, Queen’s University, 62 Arch Street, Kingston, ON K7L 3L3, Canada (corresponding author)
- Whitney I Mattson
- Center for Biobehavioral Health, Nationwide Children’s Hospital, Columbus, OH 43205, USA
- Eric E Nelson
- Center for Biobehavioral Health, Nationwide Children’s Hospital, Columbus, OH 43205, USA
- Department of Pediatrics, The Ohio State University, Columbus, OH 43205, USA
19
The structural neural correlates of atypical facial expression recognition in autism spectrum disorder. Brain Imaging Behav 2022; 16:1428-1440. [PMID: 35048265 DOI: 10.1007/s11682-021-00626-1] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 12/18/2021] [Indexed: 11/02/2022]
Abstract
Previous studies have demonstrated that individuals with autism spectrum disorder (ASD) are worse at recognizing facial expressions than are typically developing (TD) individuals. The present study investigated the differences in structural neural correlates of emotion recognition between individuals with and without ASD using voxel-based morphometry (VBM). We acquired structural MRI data from 27 high-functioning adults with ASD and 27 age- and sex-matched TD individuals. The ability to recognize facial expressions was measured using a label-matching paradigm featuring six basic emotions (anger, disgust, fear, happiness, sadness, and surprise). The behavioural task did not find deficits of emotion recognition in ASD after controlling for intellectual ability. However, the VBM analysis for the region of interest showed a positive correlation between the averaged percent accuracy across six basic emotions and the grey matter volume of the right inferior frontal gyrus in TD individuals, but not in individuals with ASD. The VBM for the whole brain region under each emotion condition revealed a positive correlation between the percent accuracy for disgusted faces and the grey matter volume of the left dorsomedial prefrontal cortex in individuals with ASD, but not in TD individuals. The different pattern of correlations suggests that individuals with and without ASD use different processing mechanisms for recognizing others' facial expressions.
20
Shany O, Greental A, Gilam G, Perry D, Bleich-Cohen M, Ovadia M, Cohen A, Raz G. Somatic engagement alters subsequent neurobehavioral correlates of affective mentalizing. Hum Brain Mapp 2021; 42:5846-5861. [PMID: 34651382 PMCID: PMC8596949 DOI: 10.1002/hbm.25640] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/28/2021] [Revised: 07/04/2021] [Accepted: 08/14/2021] [Indexed: 01/10/2023] Open
Abstract
Socio‐emotional encounters involve a resonance of others' affective states, known as affect sharing (AS); and attribution of mental states to others, known as theory‐of‐mind (ToM). Empathy necessitates the integration of both processes, yet their interaction during emotional episodes and subsequent generation of inferences on others' affective states has rarely been tested. To address this, we developed a novel experimental design, wherein we manipulated AS by presenting nonverbal emotionally negative movies twice—each time accompanied by one of two soundtracks that accentuated either somatic cues or externally generated sounds. Movies were followed by questions addressing affective‐ToM (emotional inferences), cognitive‐ToM (inferences on beliefs and knowledge), and non‐ToM aspects. Results revealed a neural differentiation between AS, affective‐ToM, and cognitive‐ToM. AS movies activated regions that have been implicated in emotional (e.g., amygdala) and somatosensory processing, and synchronized brain activity between participants in the latter. Affective‐ToM activated the middle insula, limbic regions, and both ventral and dorsal portions of the medial prefrontal cortex (ventral medial prefrontal cortex [VMPFC] and dorsal medial prefrontal cortex [DMPFC], respectively), whereas cognitive‐ToM activated posteromedial and lateral–prefrontal and temporal cortices. Critically, AS movies specifically altered neural activation in AS and ToM‐related regions during subsequent affective‐ToM inferences, most notably in the DMPFC. Moreover, DMPFC–VMPFC connectivity correlated with affective‐ToM accuracy, when such questions followed AS movies. Our results associate empathic processes with designated neural activations and shed light on how neuro‐behavioral indices of affective ToM are shaped by preceding somatic engagement.
Affiliation(s)
- Ofir Shany
- School of Psychological Sciences, Tel-Aviv University, Tel-Aviv, Israel
- Sagol Brain Institute, Tel-Aviv Sourasky Medical Center, Tel-Aviv, Israel
- Ayam Greental
- School of Psychological Sciences, Tel-Aviv University, Tel-Aviv, Israel
- Sagol Brain Institute, Tel-Aviv Sourasky Medical Center, Tel-Aviv, Israel
- Sagol School of Neuroscience, Tel-Aviv University, Tel-Aviv, Israel
- Gadi Gilam
- Division of Pain Medicine, Department of Anesthesiology, Perioperative, and Pain Medicine, Stanford University, Stanford, California, USA
- Daniella Perry
- Sagol Brain Institute, Tel-Aviv Sourasky Medical Center, Tel-Aviv, Israel
- Maya Bleich-Cohen
- Sagol Brain Institute, Tel-Aviv Sourasky Medical Center, Tel-Aviv, Israel
- Moran Ovadia
- Sagol Brain Institute, Tel-Aviv Sourasky Medical Center, Tel-Aviv, Israel
- Steve Tisch School of Film and Television, Tel Aviv University, Tel-Aviv, Israel
- Avihay Cohen
- Sagol Brain Institute, Tel-Aviv Sourasky Medical Center, Tel-Aviv, Israel
- Gal Raz
- Sagol Brain Institute, Tel-Aviv Sourasky Medical Center, Tel-Aviv, Israel
- Sagol School of Neuroscience, Tel-Aviv University, Tel-Aviv, Israel
- Steve Tisch School of Film and Television, Tel Aviv University, Tel-Aviv, Israel
21
Human face and gaze perception is highly context specific and involves bottom-up and top-down neural processing. Neurosci Biobehav Rev 2021; 132:304-323. [PMID: 34861296 DOI: 10.1016/j.neubiorev.2021.11.042] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/07/2021] [Revised: 11/24/2021] [Accepted: 11/24/2021] [Indexed: 11/21/2022]
Abstract
This review summarizes human perception and processing of face and gaze signals. Face and gaze signals are important means of non-verbal social communication. The review highlights that: (1) some evidence is available suggesting that the perception and processing of facial information starts in the prenatal period; (2) the perception and processing of face identity, expression and gaze direction is highly context specific, the effect of race and culture being a case in point. Culture affects by means of experiential shaping and social categorization the way in which information on face and gaze is collected and perceived; (3) face and gaze processing occurs in the so-called 'social brain'. Accumulating evidence suggests that the processing of facial identity, facial emotional expression and gaze involves two parallel and interacting pathways: a fast and crude subcortical route and a slower cortical pathway. The flow of information is bi-directional and includes bottom-up and top-down processing. The cortical networks particularly include the fusiform gyrus, superior temporal sulcus (STS), intraparietal sulcus, temporoparietal junction and medial prefrontal cortex.
22
Establishing a role of the semantic control network in social cognitive processing: A meta-analysis of functional neuroimaging studies. Neuroimage 2021; 245:118702. [PMID: 34742940 DOI: 10.1016/j.neuroimage.2021.118702] [Citation(s) in RCA: 14] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/09/2021] [Revised: 10/01/2021] [Accepted: 10/30/2021] [Indexed: 11/24/2022] Open
Abstract
The contribution and neural basis of cognitive control is under-specified in many prominent models of socio-cognitive processing. Important outstanding questions include whether there are multiple, distinguishable systems underpinning control and whether control is ubiquitously or selectively engaged across different social behaviours and task demands. Recently, it has been proposed that the regulation of social behaviours could rely on brain regions specialised in the controlled retrieval of semantic information, namely the anterior inferior frontal gyrus (IFG) and posterior middle temporal gyrus. Accordingly, we investigated for the first time whether the neural activation commonly found in social functional neuroimaging studies extends to these 'semantic control' regions. We conducted five coordinate-based meta-analyses to combine results of 499 fMRI/PET experiments and identified the brain regions consistently involved in semantic control, as well as four social abilities: theory of mind, trait inference, empathy and moral reasoning. This allowed an unprecedented parallel review of the neural networks associated with each of these cognitive domains. The results confirmed that the anterior left IFG region involved in semantic control is reliably engaged in all four social domains. This supports the hypothesis that social cognition is partly regulated by the neurocognitive system underpinning semantic control.
23
Matt S, Dzhelyova M, Maillard L, Lighezzolo-Alnot J, Rossion B, Caharel S. The rapid and automatic categorization of facial expression changes in highly variable natural images. Cortex 2021; 144:168-184. [PMID: 34666300 DOI: 10.1016/j.cortex.2021.08.005] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2021] [Revised: 07/08/2021] [Accepted: 08/09/2021] [Indexed: 01/23/2023]
Abstract
Emotional expressions are quickly and automatically read from human faces under natural viewing conditions. Yet, categorization of facial expressions is typically measured in experimental contexts with homogenous sets of face stimuli. Here we evaluated how the 6 basic facial emotions (Fear, Disgust, Happiness, Anger, Surprise or Sadness) can be rapidly and automatically categorized with faces varying in head orientation, lighting condition, identity, gender, age, ethnic origin and background context. High-density electroencephalography was recorded in 17 participants viewing 50 s sequences with natural variable images of neutral-expression faces alternating at a 6 Hz rate. Every five stimuli (1.2 Hz), variable natural images of one of the six basic expressions were presented. Despite the wide physical variability across images, a significant F/5 = 1.2 Hz response and its harmonics (e.g., 2F/5 = 2.4 Hz, etc.) was observed for all expression changes at the group-level and in every individual participant. Facial categorization responses were found mainly over occipito-temporal sites, with distinct hemispheric lateralization and cortical topographies according to the different expressions. Specifically, a stronger response was found to Sadness categorization, especially over the left hemisphere, as compared to Fear and Happiness, together with a right hemispheric dominance for categorization of Fearful faces. Importantly, these differences were specific to upright faces, ruling out the contribution of low-level visual cues. Overall, these observations point to robust rapid and automatic facial expression categorization processes in the human brain.
Collapse
Affiliation(s)
- Stéphanie Matt: Université de Lorraine, 2LPN, Nancy, France; Université de Lorraine, Laboratoire INTERPSY, Nancy, France.
- Milena Dzhelyova: Université Catholique de Louvain, Institute of Research in Psychological Science, Louvain-la-Neuve, Belgium.
- Louis Maillard: Université de Lorraine, CNRS, CRAN, Nancy, France; Université de Lorraine, CHRU-Nancy, Service de Neurologie, Nancy, France.
- Bruno Rossion: Université Catholique de Louvain, Institute of Research in Psychological Science, Louvain-la-Neuve, Belgium; Université de Lorraine, CNRS, CRAN, Nancy, France; Université de Lorraine, CHRU-Nancy, Service de Neurologie, Nancy, France.
- Stéphanie Caharel: Université de Lorraine, 2LPN, Nancy, France; Institut Universitaire de France, Paris, France.
24
Liu M, Liu CH, Zheng S, Zhao K, Fu X. Reexamining the neural network involved in perception of facial expression: A meta-analysis. Neurosci Biobehav Rev 2021; 131:179-191. [PMID: 34536463 DOI: 10.1016/j.neubiorev.2021.09.024] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/01/2021] [Revised: 08/19/2021] [Accepted: 09/09/2021] [Indexed: 11/15/2022]
Abstract
Perception of facial expression is essential for social interactions. Although a few competing models have enjoyed some success in mapping brain regions, they also face difficult challenges. The current study used an updated activation likelihood estimation (ALE) method of meta-analysis to explore the involvement of brain regions in facial expression processing. The sample contained 96 functional magnetic resonance imaging (fMRI) and positron emission tomography (PET) studies of healthy adults that reported results of whole-brain analyses. The key findings revealed that the ventral pathway, especially the left fusiform face area (FFA) region, was more responsive to facial expression. The left posterior FFA showed strong involvement when participants passively viewed emotional faces without being asked to judge the type of expression or other attributes of the stimuli. Through meta-analytic connectivity modeling (MACM) of the main brain regions in the ventral pathway, we constructed a co-activating neural network as a revised model of facial expression processing that assigns prominent roles to the amygdala, the FFA, the occipital gyrus, and the inferior frontal gyrus.
Affiliation(s)
- Mingtong Liu: State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China.
- Chang Hong Liu: Department of Psychology, Bournemouth University, Dorset, United Kingdom.
- Shuang Zheng: State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China.
- Ke Zhao: State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China.
- Xiaolan Fu: State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China.
25
Ikeda S. Dual Development of Affective-Speech-Based Emotion Perception. The Journal of Genetic Psychology 2021; 182:462-470. [PMID: 34424134 DOI: 10.1080/00221325.2021.1967270] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 10/20/2022]
Abstract
Studies have shown that when interpreting emotions from speech, adults focus on prosody, while young children focus on lexical content. However, the kind of socio-emotional processing implemented in such emotion perception, as well as how it is developed, remains unclear. The present study examined the development of a dual process in affective-speech-induced emotion perception in 3- and 5-year-old children. Previous studies have suggested that unconscious emotion perception at the gaze level and conscious emotion judgment in response to speakers' emotions develop differently. Children were presented with affective speech, which included inconsistent lexical content and prosody (e.g., saying 'thank you' in an angry tone), and asked to report the speaker's emotions by pointing to the corresponding facial expressions (happy or angry). Additionally, the duration for which children gazed at each facial expression was examined. The results showed that 3-year-old children judged the speaker's emotions based on lexical content more than the 5-year-olds, who used prosody. However, at the gaze level, both the 3- and 5-year-olds focused longer on the facial expressions that matched the prosody. The results suggest that two processes can be observed: unconscious emotion perception, which matches prosody and expression, and assessment of the speaker's emotions by weighting the lexical content and prosody.
Affiliation(s)
- Shinnosuke Ikeda: Faculty of Humanities, Kyoto University of Advanced Science, Kyoto, Japan.
26
Ikeda S. Approach-avoidance responses and categorical perception of ambiguous facial expressions. Int J Psychol 2021; 57:227-239. [PMID: 34405403 DOI: 10.1002/ijop.12803] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/20/2020] [Accepted: 08/04/2021] [Indexed: 11/11/2022]
Abstract
Emotion perception of facial expressions involves two processes: quick approach-avoidance responses and subsequent sorting into emotional categories (e.g., happiness, anger), considering the context. Sorting of morphed ambiguous facial expressions is known to occur categorically, but the occurrence of approach-avoidance responses to morphed facial expressions had yet to be investigated. The present study used morphed angry and fearful facial expressions and measured approach-avoidance responses among Japanese university students (Experiment 1, n = 29). Similar experiments with linguistic load (Experiment 2, n = 28) and visual load (Experiment 3, n = 29) were conducted. The results indicated categorical perception in the sorting of facial expressions but no approach-avoidance response for morphed expressions. Furthermore, linguistic load affected the categorisation of facial expressions, but neither linguistic load nor visual load affected the approach-avoidance response. These results support the idea that the non-linguistic approach-avoidance response and the linguistic categorisation of facial expressions are two different processes. The nature of the emotional perception process is also discussed.
Affiliation(s)
- Shinnosuke Ikeda: Faculty of Humanities, Kyoto University of Advanced Science, Kyoto, Japan.
27
Facial expression recognition: A meta-analytic review of theoretical models and neuroimaging evidence. Neurosci Biobehav Rev 2021; 127:820-836. [PMID: 34052280 DOI: 10.1016/j.neubiorev.2021.05.023] [Citation(s) in RCA: 18] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/01/2020] [Revised: 04/03/2021] [Accepted: 05/24/2021] [Indexed: 11/23/2022]
Abstract
Discrimination of facial expressions is an elementary function of the human brain. While the way emotions are represented in the brain has long been debated, the common and specific neural representations involved in recognizing facial expressions are also complicated. To examine brain organization and asymmetry for discrete and dimensional facial emotions, we conducted an activation likelihood estimation meta-analysis and meta-analytic connectivity modelling on 141 studies with a total of 3138 participants. We found consistent engagement of the amygdala and a common set of brain networks across discrete and dimensional emotions. We also found left-hemisphere dominance of the amygdala and anterior insula (AI) across categories of facial expression, but category-specific lateralization of the vmPFC, suggesting flexibly asymmetrical neural representations of facial expression recognition. These results converge on characteristic activation and connectivity patterns across discrete and dimensional emotion categories in recognition of facial expressions. Our findings provide the first quantitative, meta-analytic, brain network-based evidence supporting the psychological constructionist hypothesis in facial expression recognition.
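The activation likelihood estimation (ALE) method used in this and the preceding meta-analysis models each study's reported focus as a Gaussian probability of true activation, then combines studies with a voxel-wise union. A minimal 1-D sketch of that combination step (the grid, foci, kernel widths, and peak probability below are illustrative assumptions; real ALE works in 3-D and derives kernel width from each study's sample size):

```python
import numpy as np

def modeled_activation(grid, focus, sigma, peak=0.5):
    """Per-study modeled-activation (MA) map: the probability that the
    study's true activation lies at each grid point, peaking at the
    reported coordinate. peak < 1 encodes spatial uncertainty."""
    return peak * np.exp(-0.5 * ((grid - focus) / sigma) ** 2)

grid = np.linspace(-50.0, 50.0, 101)             # 1-D stand-in for voxel space
foci = [(-10.0, 8.0), (-8.0, 8.0), (25.0, 8.0)]  # (coordinate, sigma) per study
ma = [modeled_activation(grid, f, s) for f, s in foci]

# ALE score: probability that at least one study truly activates each
# point, i.e. the union 1 - prod(1 - MA_i) across studies.
ale = 1.0 - np.prod([1.0 - m for m in ma], axis=0)
```

Points where studies converge (here around -9, between the first two foci) score higher than an isolated focus (at 25); the meta-analysis then thresholds such scores against a null distribution of spatially random foci.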
28
Kegel LC, Brugger P, Frühholz S, Grunwald T, Hilfiker P, Kohnen O, Loertscher ML, Mersch D, Rey A, Sollfrank T, Steiger BK, Sternagel J, Weber M, Jokeit H. Dynamic human and avatar facial expressions elicit differential brain responses. Soc Cogn Affect Neurosci 2021; 15:303-317. [PMID: 32232359 PMCID: PMC7235958 DOI: 10.1093/scan/nsaa039] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/13/2019] [Revised: 03/02/2020] [Accepted: 03/24/2020] [Indexed: 01/25/2023] Open
Abstract
Computer-generated characters, so-called avatars, are widely used in advertising, entertainment, human–computer interaction or as research tools to investigate human emotion perception. However, brain responses to avatar and human faces have scarcely been studied to date. As such, it remains unclear whether dynamic facial expressions of avatars evoke different brain responses than dynamic facial expressions of humans. In this study, we designed anthropomorphic avatars animated with motion tracking and tested whether the human brain processes fearful and neutral expressions in human and avatar faces differently. Our fMRI results showed that fearful human expressions evoked stronger responses than fearful avatar expressions in the ventral anterior and posterior cingulate gyrus, the anterior insula, the anterior and posterior superior temporal sulcus, and the inferior frontal gyrus. Fearful expressions in human and avatar faces evoked similar responses in the amygdala. We did not find different responses to neutral human and avatar expressions. Our results highlight differences, but also similarities in the processing of fearful human expressions and fearful avatar expressions even if they are designed to be highly anthropomorphic and animated with motion tracking. This has important consequences for research using dynamic avatars, especially when processes are investigated that involve cortical and subcortical regions.
Affiliation(s)
- Lorena C Kegel: Swiss Epilepsy Center, CH-8008 Zurich, Switzerland; Department of Psychology, University of Zurich, Zurich, Switzerland.
- Peter Brugger: Neuropsychology Unit, Valens Rehabilitation Centre, Valens, Switzerland; Department of Psychiatry, Psychotherapy, and Psychosomatics, University Hospital of Psychiatry Zurich, Zurich, Switzerland.
- Sascha Frühholz: Department of Psychology, University of Zurich, Zurich, Switzerland.
- Oona Kohnen: Swiss Epilepsy Center, CH-8008 Zurich, Switzerland.
- Miriam L Loertscher: Institute for the Performing Arts and Film, Zurich University of the Arts, Zurich, Switzerland; Department of Psychology, University of Bern, Bern, Switzerland.
- Dieter Mersch: Institute for Critical Theory, Zurich University of the Arts, Zurich, Switzerland.
- Anton Rey: Institute for the Performing Arts and Film, Zurich University of the Arts, Zurich, Switzerland.
- Joerg Sternagel: Institute for Critical Theory, Zurich University of the Arts, Zurich, Switzerland.
- Michel Weber: Institute for the Performing Arts and Film, Zurich University of the Arts, Zurich, Switzerland.
- Hennric Jokeit: Swiss Epilepsy Center, CH-8008 Zurich, Switzerland; Department of Psychology, University of Zurich, Zurich, Switzerland.
29
Daedelow LS, Beck A, Romund L, Mascarell-Maricic L, Dziobek I, Romanczuk-Seiferth N, Wüstenberg T, Heinz A. Neural correlates of RDoC-specific cognitive processes in a high-functional autistic patient: a statistically validated case report. J Neural Transm (Vienna) 2021; 128:845-859. [PMID: 34003357 PMCID: PMC8205905 DOI: 10.1007/s00702-021-02352-w] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/09/2020] [Accepted: 05/08/2021] [Indexed: 11/29/2022]
Abstract
The level of functioning of individuals with autism spectrum disorder (ASD) varies widely. To better understand the neurobiological mechanism associated with high-functioning ASD, we studied the rare case of a female patient with an exceptional professional career in the highly competitive academic field of Mathematics. According to the Research Domain Criteria (RDoC) approach, which proposes to describe the basic dimensions of functioning by integrating different levels of information, we conducted four fMRI experiments targeting the (1) social processes domain (theory of mind (ToM) and face matching), (2) positive valence domain (reward processing), and (3) cognitive domain (N-back). The patient's data were compared to data of 14 healthy controls (HC). Additionally, we assessed the subjective experience of our case during the experiments. The patient showed increased response times during face matching and achieved a higher total gain in the reward task, whereas her performance in N-back and ToM was similar to HC. Her brain function differed mainly in the positive valence and cognitive domains. During reward processing, she showed reduced activity in a left-hemispheric frontal network and cortical midline structures but increased connectivity within this network. During the working memory task, the patient's brain activity and connectivity in left-hemispheric temporo-frontal regions were elevated. In the ToM task, activity in the posterior cingulate cortex and temporo-parietal junction was reduced. We suggest that the high level of functioning in our patient is related to effects in brain connectivity rather than to local cortical information processing, and that subjective report provides a fruitful framework for interpretation.
Affiliation(s)
- Laura S Daedelow: Department of Psychiatry and Psychotherapy CCM, Charité-Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Berlin, Germany.
- Anne Beck: Health and Medical University Potsdam, Potsdam, Germany.
- Lydia Romund: Department of Psychiatry and Psychotherapy CCM, Charité-Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Berlin, Germany.
- Lea Mascarell-Maricic: Department of Psychiatry and Psychotherapy CCM, Charité-Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Berlin, Germany.
- Isabel Dziobek: Berlin School of Mind and Brain, Berlin, Germany; Department of Psychology, Humboldt-University of Berlin, Berlin, Germany.
- Nina Romanczuk-Seiferth: Department of Psychiatry and Psychotherapy CCM, Charité-Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Berlin, Germany.
- Torsten Wüstenberg: Department of Psychiatry and Psychotherapy CCM, Charité-Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Berlin, Germany; Department of Clinical Psychology and Psychotherapy, Psychological Institute, Ruprecht-Karls-University Heidelberg, Hauptstr. 47-51, 69117 Heidelberg, Germany.
- Andreas Heinz: Department of Psychiatry and Psychotherapy CCM, Charité-Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Berlin, Germany.
30
Li Y, Li W, Zhang T, Zhang J, Jin Z, Li L. Probing the role of the right inferior frontal gyrus during Pain-Related empathy processing: Evidence from fMRI and TMS. Hum Brain Mapp 2021; 42:1518-1531. [PMID: 33283946 PMCID: PMC7927301 DOI: 10.1002/hbm.25310] [Citation(s) in RCA: 20] [Impact Index Per Article: 6.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/12/2020] [Revised: 11/22/2020] [Accepted: 11/26/2020] [Indexed: 01/10/2023] Open
Abstract
Recent studies have suggested that the right inferior frontal gyrus (rIFG) may be involved in pain-related empathy. To verify the role of the rIFG, we performed a functional magnetic resonance imaging (fMRI) experiment to replicate previous research and further designed a noninvasive repetitive transcranial magnetic stimulation (rTMS) experiment to probe the causal role of the rIFG in pain-related empathy processing. We assigned 74 volunteers (37 females) to three groups. Group 1 (n = 26) performed a task in which participants were required to perceive pain in others (task of pain: TP), and we used fMRI to observe the activity of the rIFG during pain-related empathy processing. Then, we applied online rTMS to the rIFG and the vertex site (as reference site) to observe the performance of Group 2 (n = 24; performing TP) and Group 3 (n = 24; performing a control task of identifying body parts; task of body: TB). The fMRI experiment demonstrated stronger activation in the rIFG than in the vertex during the perception of pain in others (p < .0001, Bonferroni-corrected). The rTMS experiment indicated that when the rIFG was temporarily disrupted, participants perceived pain in others significantly more slowly (p < .0001, Bonferroni-corrected) than when the vertex was disrupted. Our results provide evidence that the rIFG is involved in pain-related empathy processing, which yields insights into how the brain perceives pain in others.
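The Bonferroni-corrected p-values reported above follow the standard family-wise correction: the significance level is divided by the number of comparisons, and each test is judged against that stricter threshold. A short sketch (the alpha and comparison count below are illustrative assumptions; the abstract does not state the paper's exact number of comparisons):

```python
def bonferroni_alpha(alpha, n_tests):
    """Per-comparison threshold that keeps the family-wise error rate
    at alpha across n_tests significance tests."""
    return alpha / n_tests

# e.g., three group contrasts at a family-wise alpha of .05:
threshold = bonferroni_alpha(0.05, 3)
# a result at p < .0001 survives this correction comfortably
significant = 0.0001 < threshold
```

Equivalently, one can multiply each raw p-value by the number of tests and compare it to the original alpha; the two formulations give the same decisions.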
Affiliation(s)
- Yun Li: MOE Key Lab for Neuroinformation, High-Field Magnetic Resonance Brain Imaging Key Laboratory of Sichuan Province, Center for Psychiatry and Psychology, School of Life Science and Technology, University of Electronic Science and Technology of China, Chengdu, China; School of Management, Chengdu University of Traditional Chinese Medicine, Chengdu, China.
- Wenjuan Li: MOE Key Lab for Neuroinformation, High-Field Magnetic Resonance Brain Imaging Key Laboratory of Sichuan Province, Center for Psychiatry and Psychology, School of Life Science and Technology, University of Electronic Science and Technology of China, Chengdu, China.
- Tingting Zhang: MOE Key Lab for Neuroinformation, High-Field Magnetic Resonance Brain Imaging Key Laboratory of Sichuan Province, Center for Psychiatry and Psychology, School of Life Science and Technology, University of Electronic Science and Technology of China, Chengdu, China.
- Junjun Zhang: MOE Key Lab for Neuroinformation, High-Field Magnetic Resonance Brain Imaging Key Laboratory of Sichuan Province, Center for Psychiatry and Psychology, School of Life Science and Technology, University of Electronic Science and Technology of China, Chengdu, China.
- Zhenlan Jin: MOE Key Lab for Neuroinformation, High-Field Magnetic Resonance Brain Imaging Key Laboratory of Sichuan Province, Center for Psychiatry and Psychology, School of Life Science and Technology, University of Electronic Science and Technology of China, Chengdu, China.
- Ling Li: MOE Key Lab for Neuroinformation, High-Field Magnetic Resonance Brain Imaging Key Laboratory of Sichuan Province, Center for Psychiatry and Psychology, School of Life Science and Technology, University of Electronic Science and Technology of China, Chengdu, China.
31
Del Re EC, Stone WS, Bouix S, Seitz J, Zeng V, Guliano A, Somes N, Zhang T, Reid B, Lyall A, Lyons M, Li H, Whitfield-Gabrieli S, Keshavan M, Seidman LJ, McCarley RW, Wang J, Tang Y, Shenton ME, Niznikiewicz MA. Baseline Cortical Thickness Reductions in Clinical High Risk for Psychosis: Brain Regions Associated with Conversion to Psychosis Versus Non-Conversion as Assessed at One-Year Follow-Up in the Shanghai-At-Risk-for-Psychosis (SHARP) Study. Schizophr Bull 2021; 47:562-574. [PMID: 32926141 PMCID: PMC8480195 DOI: 10.1093/schbul/sbaa127] [Citation(s) in RCA: 24] [Impact Index Per Article: 8.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 12/12/2022]
Abstract
OBJECTIVE To assess cortical thickness (CT) and surface area (SA) of frontal, temporal, and parietal brain regions in a large clinical high risk for psychosis (CHR) sample, and to identify cortical brain abnormalities in CHR who convert to psychosis and in the whole CHR sample, compared with the healthy controls (HC). METHODS Magnetic resonance imaging, clinical, and cognitive data were acquired at baseline in 92 HC, 130 non-converters, and 22 converters (conversion assessed at 1-year follow-up). CT and SA at baseline were calculated for frontal, temporal, and parietal subregions. Correlations between regions showing group differences and clinical scores and age were also obtained. RESULTS CT but not SA was significantly reduced in CHR compared with HC. Two patterns of findings emerged: (1) In converters, CT was significantly reduced relative to non-converters and controls in the banks of superior temporal sulcus, Heschl's gyrus, and pars triangularis and (2) CT in the inferior parietal and supramarginal gyrus, and at trend level in the pars opercularis, fusiform, and middle temporal gyri was significantly reduced in all high-risk individuals compared with HC. Additionally, reduced CT correlated significantly with older age in HC and in non-converters but not in converters. CONCLUSIONS These results show for the first time that fronto-temporo-parietal abnormalities characterized all CHR, that is, both converters and non-converters, relative to HC, while CT abnormalities in converters relative to CHR-NC and HC were found in core auditory and language processing regions.
Collapse
Affiliation(s)
- Elisabetta C Del Re: Laboratory of Neuroscience, Department of Psychiatry, VA Boston Healthcare System, Brockton Division, and Harvard Medical School, Boston, MA; Psychiatry Neuroimaging Laboratory, Department of Psychiatry, Brigham and Women’s Hospital, and Harvard Medical School, Boston, MA.
- William S Stone: Department of Psychiatry, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA.
- Sylvain Bouix: Psychiatry Neuroimaging Laboratory, Department of Psychiatry, Brigham and Women’s Hospital, and Harvard Medical School, Boston, MA.
- Johanna Seitz: Psychiatry Neuroimaging Laboratory, Department of Psychiatry, Brigham and Women’s Hospital, and Harvard Medical School, Boston, MA.
- Victor Zeng: Department of Psychiatry, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA.
- Anthony Guliano: Department of Psychiatry, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA.
- Nathaniel Somes: Psychiatry Neuroimaging Laboratory, Department of Psychiatry, Brigham and Women’s Hospital, and Harvard Medical School, Boston, MA.
- Tianhong Zhang: Shanghai Mental Health Center, Shanghai Jiaotong University School of Medicine, Shanghai Key Laboratory of Psychotic Disorders, SHARP Program, Shanghai, China.
- Benjamin Reid: Psychiatry Neuroimaging Laboratory, Department of Psychiatry, Brigham and Women’s Hospital, and Harvard Medical School, Boston, MA.
- Amanda Lyall: Psychiatry Neuroimaging Laboratory, Department of Psychiatry, Brigham and Women’s Hospital, and Harvard Medical School, Boston, MA; Department of Psychiatry, Massachusetts General Hospital and Harvard Medical School, Boston, MA.
- Monica Lyons: Psychiatry Neuroimaging Laboratory, Department of Psychiatry, Brigham and Women’s Hospital, and Harvard Medical School, Boston, MA; Department of Psychiatry, Massachusetts General Hospital and Harvard Medical School, Boston, MA.
- Huijun Li: Florida A&M University, Department of Psychology, Tallahassee, FL.
- Matcheri Keshavan: Department of Psychiatry, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA.
- Larry J Seidman: Department of Psychiatry, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA; Department of Psychiatry, Massachusetts General Hospital and Harvard Medical School, Boston, MA.
- Robert W McCarley: Laboratory of Neuroscience, Department of Psychiatry, VA Boston Healthcare System, Brockton Division, and Harvard Medical School, Boston, MA.
- Jijun Wang: Shanghai Mental Health Center, Shanghai Jiaotong University School of Medicine, Shanghai Key Laboratory of Psychotic Disorders, SHARP Program, Shanghai, China.
- Yingying Tang: Shanghai Mental Health Center, Shanghai Jiaotong University School of Medicine, Shanghai Key Laboratory of Psychotic Disorders, SHARP Program, Shanghai, China.
- Martha E Shenton: Psychiatry Neuroimaging Laboratory, Department of Psychiatry, Brigham and Women’s Hospital, and Harvard Medical School, Boston, MA; Department of Psychiatry, Massachusetts General Hospital and Harvard Medical School, Boston, MA; Department of Radiology, Brigham and Women’s Hospital, and Harvard Medical School, Boston, MA; Research and Development, VA Boston Healthcare System, Boston, MA.
- Margaret A Niznikiewicz: Laboratory of Neuroscience, Department of Psychiatry, VA Boston Healthcare System, Brockton Division, and Harvard Medical School, Boston, MA; Department of Psychiatry, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA. To whom correspondence should be addressed; e-mail:
32
Pozzi E, Vijayakumar N, Rakesh D, Whittle S. Neural Correlates of Emotion Regulation in Adolescents and Emerging Adults: A Meta-analytic Study. Biol Psychiatry 2021; 89:194-204. [PMID: 33268030 DOI: 10.1016/j.biopsych.2020.08.006] [Citation(s) in RCA: 31] [Impact Index Per Article: 10.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/13/2020] [Revised: 08/10/2020] [Accepted: 08/11/2020] [Indexed: 12/16/2022]
Abstract
BACKGROUND The development of adaptive implicit and explicit emotion regulation skills is crucial for mental health. Adolescence and emerging adulthood are periods of heightened risk for psychopathology associated with emotion dysregulation, and neurodevelopmental mechanisms have been proposed to account for this increased risk. However, progress in understanding these mechanisms has been hampered by an incomplete knowledge of the neural underpinnings of emotion regulation during development. METHODS Using activation likelihood estimation, we conducted a quantitative analysis of functional magnetic resonance imaging studies in healthy developmental samples (i.e., adolescence [10-18 years of age] and emerging adulthood [19-30 years of age]) investigating emotion reactivity (N studies = 48), and implicit (N studies = 41) and explicit (N studies = 19) emotion regulation processes. RESULTS Explicit emotion regulation was associated with activation in frontal, temporal, and parietal regions, whereas both implicit regulation and emotion reactivity were associated with activation in the amygdala and posterior temporal regions. During implicit regulation, adolescents exhibited more consistent activation of the amygdala, fusiform gyrus, and thalamus than emerging adults, who showed more consistent activation in the posterior superior temporal sulcus. CONCLUSIONS Our results suggest that emotion reactivity and regulation in developmental samples engage a robust group of regions that are implicated in bottom-up and top-down emotional responding. Adolescents are also more likely to recruit regions involved in early stages of emotion processing during implicit regulation, while emerging adults recruit higher-order regions involved in the extraction of semantic meaning. Findings have implications for future research aiming to better understand the neurodevelopmental mechanisms underlying risk for psychopathology.
Affiliation(s)
- Elena Pozzi: Melbourne Neuropsychiatry Centre, Department of Psychiatry, The University of Melbourne and Melbourne Health, Melbourne, Victoria, Australia.
- Divyangana Rakesh: Melbourne Neuropsychiatry Centre, Department of Psychiatry, The University of Melbourne and Melbourne Health, Melbourne, Victoria, Australia.
- Sarah Whittle: Melbourne Neuropsychiatry Centre, Department of Psychiatry, The University of Melbourne and Melbourne Health, Melbourne, Victoria, Australia; Melbourne School of Psychological Sciences, The University of Melbourne, Melbourne, Victoria, Australia.
33
Serotonin differentially modulates the temporal dynamics of the limbic response to facial emotions in male adults with and without autism spectrum disorder (ASD): a randomised placebo-controlled single-dose crossover trial. Neuropsychopharmacology 2020; 45:2248-2256. [PMID: 32388538 PMCID: PMC7784897 DOI: 10.1038/s41386-020-0693-0] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 12/13/2019] [Revised: 03/16/2020] [Accepted: 04/24/2020] [Indexed: 12/04/2022]
Abstract
Emotion processing-including signals from facial expressions-is often altered in individuals with autism spectrum disorder (ASD). The biological basis of this is poorly understood but may include neurochemically mediated differences in the responsivity of key 'limbic' regions (including amygdala, ventromedial prefrontal cortex (vmPFC) and nucleus accumbens (NAc)). Emerging evidence also suggests that ASD may be a disorder of brain temporal dynamics. Moreover, serotonin (5-HT) has been shown to be a key regulator of both facial-emotion processing and brain dynamics, and 5-HT abnormalities have been consistently implicated in ASD. To date, however, no one has examined how 5-HT influences the dynamics of facial-emotion processing in ASD. Therefore, we compared the influence of 5-HT on the responsivity of brain dynamics during facial-emotion processing in individuals with and without ASD. Participants completed a facial-emotion processing fMRI task at least 8 days apart using a randomised double-blind crossover design. At each visit they received either a single 20-mg oral dose of the selective serotonin reuptake inhibitor (SSRI) citalopram or placebo. We found that citalopram (which increases levels of 5-HT) caused sustained activation in key limbic regions during processing of negative facial emotions in adults with ASD-but not in neurotypical adults. The neurotypical adults' limbic response reverted more rapidly to baseline following a 5-HT-challenge. Our results suggest that serotonergic homoeostatic control of the temporal dynamics in limbic regions is altered in adults with ASD, and provide a fresh perspective on the biology of ASD.
34
Samaey C, Van der Donck S, van Winkel R, Boets B. Facial Expression Processing Across the Autism-Psychosis Spectra: A Review of Neural Findings and Associations With Adverse Childhood Events. Front Psychiatry 2020; 11:592937. [PMID: 33281648 PMCID: PMC7691238 DOI: 10.3389/fpsyt.2020.592937] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/08/2020] [Accepted: 10/09/2020] [Indexed: 11/13/2022] Open
Abstract
Autism spectrum disorder (ASD) and primary psychosis are classified as distinct neurodevelopmental disorders, yet they display overlapping epidemiological, environmental, and genetic components as well as endophenotypic similarities. For instance, both disorders are characterized by impairments in facial expression processing, a crucial skill for effective social communication, and both disorders display an increased prevalence of adverse childhood events (ACE). This narrative review provides a brief summary of findings from neuroimaging studies investigating facial expression processing in ASD and primary psychosis with a focus on the commonalities and differences between these disorders. Individuals with ASD and primary psychosis activate the same brain regions as healthy controls during facial expression processing, albeit to a different extent. Overall, both groups display altered activation in the fusiform gyrus and amygdala as well as altered connectivity among the broader face processing network, probably indicating reduced facial expression processing abilities. Furthermore, delayed or reduced N170 responses have been reported in ASD and primary psychosis, but the significance of these findings is questioned, and alternative frequency-tagging electroencephalography (EEG) measures are currently explored to capture facial expression processing impairments more selectively. Face perception is an innate process, but it is also guided by visual learning and social experiences. Extreme environmental factors, such as adverse childhood events, can disrupt normative development and alter facial expression processing. ACE are hypothesized to induce altered neural facial expression processing, in particular a hyperactive amygdala response toward negative expressions. 
Future studies should account for the comorbidity among ASD, primary psychosis, and ACE when assessing facial expression processing in these clinical groups, as it may explain some of the inconsistencies and confounds reported in the field.
Affiliation(s)
- Celine Samaey
  - Department of Neurosciences, Center for Clinical Psychiatry, KU Leuven, Leuven, Belgium
- Stephanie Van der Donck
  - Department of Neurosciences, Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
  - Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
- Ruud van Winkel
  - Department of Neurosciences, Center for Clinical Psychiatry, KU Leuven, Leuven, Belgium
  - University Psychiatric Center (UPC), KU Leuven, Leuven, Belgium
- Bart Boets
  - Department of Neurosciences, Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
  - Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
35
Nonverbal auditory communication - Evidence for integrated neural systems for voice signal production and perception. Prog Neurobiol 2020; 199:101948. [PMID: 33189782] [DOI: 10.1016/j.pneurobio.2020.101948]
Abstract
While humans have developed a sophisticated and unique system of verbal auditory communication, they also share a more common and evolutionarily important nonverbal channel of voice signaling with many other mammalian and vertebrate species. This nonverbal communication is mediated and modulated by the acoustic properties of a voice signal, and is a powerful - yet often neglected - means of sending and perceiving socially relevant information. From the viewpoint of dyadic (involving a sender and a signal receiver) voice signal communication, we discuss the integrated neural dynamics in primate nonverbal voice signal production and perception. Most previous neurobiological models of voice communication have approached these neural dynamics from the limited perspective of either voice production or perception, largely disregarding the neural and cognitive commonalities of both functions. Taking a dyadic perspective on nonverbal communication, however, it turns out that the neural systems for voice production and perception are surprisingly similar. Based on the interdependence of both production and perception functions in communication, we first propose a re-grouping of the neural mechanisms of communication into auditory, limbic, and paramotor systems, with special consideration for a subsidiary basal-ganglia-centered system. Second, we propose that the similarity in the neural systems involved in voice signal production and perception is the result of the co-evolution of nonverbal voice production and perception systems promoted by their strong interdependence in dyadic interactions.
36
Johnson C, Langbehn KE, Long JD, Moser D, Cross S, Gutmann L, Nopoulos PC, van der Plas E. Encoding of facial expressions in individuals with adult-onset myotonic dystrophy type 1. J Clin Exp Neuropsychol 2020; 42:932-940. [PMID: 33028165] [PMCID: PMC7676461] [DOI: 10.1080/13803395.2020.1826410]
Abstract
Introduction: Emotional issues are often reported among individuals with myotonic dystrophy type 1 (DM1), and some studies have suggested that deficits in the ability to quickly encode emotions may contribute to these problems. However, poor performance on emotion encoding tasks could also be explained by a more general cognitive deficit (Full Scale IQ [FSIQ]) rather than a specific deficit in emotional processing. Since individuals with DM1 are known to exhibit difficulties in general cognitive abilities, it is important to account for FSIQ when evaluating emotion encoding. The aim of this study was to compare emotion encoding abilities between individuals with and without DM1, while adjusting for the impact of general cognitive abilities (FSIQ).
Methods: The sample included 35 individuals with adult-onset DM1 and 54 unaffected adults who completed assessments of emotion encoding abilities (Ekman faces test) and general cognitive abilities (Wechsler Adult Intelligence Scale-IV). Performance on the emotion encoding task was operationalized as proportion correct and response time. Group differences in proportion correct were evaluated with generalized linear regression, while linear regression models were used to determine the effect of group on response time. Models were adjusted for age, sex, and FSIQ. The false discovery rate (FDR) was applied to control false positives due to multiple comparisons (pFDR).
Results: No significant group differences were observed for emotion encoding abilities (all pFDR > 0.13). FSIQ was significantly associated with proportion correct and with response time (all pFDR < 0.05).
Conclusions: Emotion encoding appears intact in individuals with DM1, and variation in the ability to encode facial expressions was associated with FSIQ. Further research is required to address the relationship between general cognitive abilities and emotion encoding abilities among DM1 patients.
Affiliation(s)
- Claire Johnson
  - Department of Psychiatry, University of Iowa Hospitals & Clinics, Iowa City, IA, USA
- Kathleen E. Langbehn
  - Department of Psychiatry, University of Iowa Hospitals & Clinics, Iowa City, IA, USA
- Jeff D. Long
  - Department of Psychiatry, University of Iowa Hospitals & Clinics, Iowa City, IA, USA
- David Moser
  - Department of Psychiatry, University of Iowa Hospitals & Clinics, Iowa City, IA, USA
- Stephen Cross
  - Department of Psychiatry, University of Iowa Hospitals & Clinics, Iowa City, IA, USA
- Laurie Gutmann
  - Department of Neurology, University of Iowa Hospitals & Clinics, Iowa City, IA, USA
- Peggy C. Nopoulos
  - Department of Psychiatry, University of Iowa Hospitals & Clinics, Iowa City, IA, USA
- Ellen van der Plas
  - Department of Psychiatry, University of Iowa Hospitals & Clinics, Iowa City, IA, USA
37
Morningstar M, Mattson WI, Singer S, Venticinque JS, Nelson EE. Children and adolescents' neural response to emotional faces and voices: Age-related changes in common regions of activation. Soc Neurosci 2020; 15:613-629. [PMID: 33017278] [DOI: 10.1080/17470919.2020.1832572]
Abstract
The perception of facial and vocal emotional expressions engages overlapping regions of the brain. However, at a behavioral level, the ability to recognize the intended emotion in both types of nonverbal cues follows a divergent developmental trajectory throughout childhood and adolescence. The current study a) identified regions of common neural activation to facial and vocal stimuli in 8- to 19-year-old typically-developing adolescents, and b) examined age-related changes in blood-oxygen-level dependent (BOLD) response within these areas. Both modalities elicited activation in an overlapping network of subcortical regions (insula, thalamus, dorsal striatum), visual-motor association areas, prefrontal regions (inferior frontal cortex, dorsomedial prefrontal cortex), and the right superior temporal gyrus. Within these regions, increased age was associated with greater frontal activation to voices, but not faces. Results suggest that processing facial and vocal stimuli elicits activation in common areas of the brain in adolescents, but that age-related changes in response within these regions may vary by modality.
Affiliation(s)
- M Morningstar
  - Center for Biobehavioral Health, Nationwide Children's Hospital, Columbus, OH, USA
  - Department of Pediatrics, The Ohio State University, Columbus, OH, USA
  - Department of Psychology, Queen's University, Kingston, ON, Canada
- W I Mattson
  - Center for Biobehavioral Health, Nationwide Children's Hospital, Columbus, OH, USA
- S Singer
  - Center for Biobehavioral Health, Nationwide Children's Hospital, Columbus, OH, USA
- J S Venticinque
  - Center for Biobehavioral Health, Nationwide Children's Hospital, Columbus, OH, USA
- E E Nelson
  - Center for Biobehavioral Health, Nationwide Children's Hospital, Columbus, OH, USA
  - Department of Pediatrics, The Ohio State University, Columbus, OH, USA
38
Guldner S, Nees F, McGettigan C. Vocomotor and Social Brain Networks Work Together to Express Social Traits in Voices. Cereb Cortex 2020; 30:6004-6020. [PMID: 32577719] [DOI: 10.1093/cercor/bhaa175]
Abstract
Voice modulation is important when navigating social interactions-tone of voice in a business negotiation is very different from that used to comfort an upset child. While voluntary vocal behavior relies on a cortical vocomotor network, social voice modulation may require additional social cognitive processing. Using functional magnetic resonance imaging, we investigated the neural basis for social vocal control and whether it involves an interplay of vocal control and social processing networks. Twenty-four healthy adult participants modulated their voice to express social traits along the dimensions of the social trait space (affiliation and competence) or to express body size (control for vocal flexibility). Naïve listener ratings showed that vocal modulations were effective in evoking social trait ratings along the two primary dimensions of the social trait space. Whereas basic vocal modulation engaged the vocomotor network, social voice modulation specifically engaged social processing regions including the medial prefrontal cortex, superior temporal sulcus, and precuneus. Moreover, these regions showed task-relevant modulations in functional connectivity to the left inferior frontal gyrus, a core vocomotor control network area. These findings highlight the impact of the integration of vocal motor control and social information processing for socially meaningful voice modulation.
Affiliation(s)
- Stella Guldner
  - Department of Cognitive and Clinical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim 68159, Germany
  - Graduate School of Economic and Social Sciences, University of Mannheim, Mannheim 68159, Germany
  - Department of Speech, Hearing and Phonetic Sciences, University College London, London, UK
- Frauke Nees
  - Department of Cognitive and Clinical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim 68159, Germany
  - Institute of Medical Psychology and Medical Sociology, University Medical Center Schleswig Holstein, Kiel University, Kiel 24105, Germany
- Carolyn McGettigan
  - Department of Speech, Hearing and Phonetic Sciences, University College London, London, UK
  - Department of Psychology, Royal Holloway, University of London, Egham TW20 0EX, UK
39
Operto FF, Pastorino GMG, Mazza R, Di Bonaventura C, Marotta R, Pastorino N, Matricardi S, Verrotti A, Carotenuto M, Roccella M. Social cognition and executive functions in children and adolescents with focal epilepsy. Eur J Paediatr Neurol 2020; 28:167-175. [PMID: 32718867] [DOI: 10.1016/j.ejpn.2020.06.019]
Abstract
OBJECTIVES: Deficits in facial emotion recognition and Theory of Mind are frequent in patients with epilepsy. Despite this evidence, studies in pediatric populations are few, and the relation between these abilities and other cognitive domains remains to be better elucidated. The purpose of our study was to evaluate facial emotion recognition and Theory of Mind in children and adolescents with focal epilepsy, and to correlate them with intelligence and executive functions.
MATERIALS AND METHODS: Our work is a cross-sectional observational study. Sixty-two children and adolescents aged 7-16 years diagnosed with focal epilepsy and 32 sex- and age-matched controls were recruited. All participants were administered a standardized test battery to assess social cognition (NEPSY-II), executive functions (EpiTrack Junior), and nonverbal cognitive level (Raven Progressive Matrices).
RESULTS: The mean emotion recognition score was significantly lower in the epilepsy group than in the controls on Student's t-test (p<0.05). The epilepsy group showed impaired recognition of happiness, sadness, anger, and fear compared to controls (p<0.05). The mean Theory of Mind score was also significantly lower in the epilepsy group than in controls (p<0.05). Deficits in emotion recognition appeared related to low age at onset of epilepsy, long duration of disease, low executive functions, and low nonverbal intelligence. Deficits in Theory of Mind appeared related to high seizure frequency.
CONCLUSIONS: Our results suggest that children and adolescents with focal epilepsy have deficits in facial emotion recognition and Theory of Mind compared to their peers. Both difficulties seem related to features of the epilepsy itself. Our results also suggest that deficits in facial emotion recognition are potentially related to difficulties in executive functions and nonverbal intelligence. More studies are needed to confirm these hypotheses.
Affiliation(s)
- Francesca Felicia Operto
  - Child Neuropsychiatry Unit, Department of Medicine, Surgery and Dentistry, University of Salerno, Salerno, Italy
- Grazia Maria Giovanna Pastorino
  - Child Neuropsychiatry Unit, Department of Medicine, Surgery and Dentistry, University of Salerno, Salerno, Italy
  - Department of Mental Health, Physical and Preventive Medicine, Clinic of Child and Adolescent Neuropsychiatry, Università degli Studi della Campania "Luigi Vanvitelli", Naples, Italy
- Roberta Mazza
  - Child Neuropsychiatry Unit, Department of Basic Medical Sciences, Neuroscience and Sense Organs, University of Bari "Aldo Moro", Bari, Italy
- Carlo Di Bonaventura
  - Epilepsy Unit, Department of Neurosciences/Mental Health, "Sapienza" University, Rome, Italy
- Rosa Marotta
  - Department of Health Sciences, University "Magna Graecia", Catanzaro, Italy
- Nazareno Pastorino
  - Department of Cultural Heritage Sciences, University of Salerno, Salerno, Italy
- Sara Matricardi
  - Department of Pediatrics, University of Chieti, Chieti, Italy
- Alberto Verrotti
  - Department of Pediatrics, University of L'Aquila, L'Aquila, Italy
- Marco Carotenuto
  - Department of Mental Health, Physical and Preventive Medicine, Clinic of Child and Adolescent Neuropsychiatry, Università degli Studi della Campania "Luigi Vanvitelli", Naples, Italy
- Michele Roccella
  - Department of Psychological, Pedagogical and Educational Sciences, University of Palermo, Palermo, Italy
40
Poyo Solanas M, Vaessen M, de Gelder B. Computation-Based Feature Representation of Body Expressions in the Human Brain. Cereb Cortex 2020; 30:6376-6390. [DOI: 10.1093/cercor/bhaa196]
Abstract
Humans and other primate species are experts at recognizing body expressions. To understand the underlying perceptual mechanisms, we computed postural and kinematic features from affective whole-body movement videos and related them to brain processes. Using representational similarity and multivoxel pattern analyses, we showed systematic relations between computation-based body features and brain activity. Our results revealed that postural rather than kinematic features reflect the affective category of the body movements. The feature limb contraction made a central contribution to fearful body expression perception, differentially represented in action observation, motor preparation, and affect coding regions, including the amygdala. The posterior superior temporal sulcus differentiated fearful from other affective categories using limb contraction rather than kinematics. The extrastriate body area and fusiform body area also showed greater tuning to postural features. The discovery of midlevel body feature encoding in the brain moves affective neuroscience beyond research on high-level emotion representations and provides insight into the perceptual features that possibly drive automatic emotion perception.
Affiliation(s)
- Marta Poyo Solanas
  - Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands
- Maarten Vaessen
  - Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands
- Beatrice de Gelder
  - Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands
  - Department of Computer Science, University College London, London WC1E 6BT, UK
41
Multilevel fMRI adaptation for spoken word processing in the awake dog brain. Sci Rep 2020; 10:11968. [PMID: 32747731] [PMCID: PMC7398925] [DOI: 10.1038/s41598-020-68821-6]
Abstract
Human brains process lexical meaning separately from emotional prosody of speech at higher levels of the processing hierarchy. Recently we demonstrated that dog brains can also dissociate lexical and emotional prosodic information in human spoken words. To better understand the neural dynamics of lexical processing in the dog brain, here we used an event-related design, optimized for fMRI adaptation analyses on multiple time scales. We investigated repetition effects in dogs’ neural (BOLD) responses to lexically marked (praise) words and to lexically unmarked (neutral) words, in praising and neutral prosody. We identified temporally and anatomically distinct adaptation patterns. In a subcortical auditory region, we found both short- and long-term fMRI adaptation for emotional prosody, but not for lexical markedness. In multiple cortical auditory regions, we found long-term fMRI adaptation for lexically marked compared to unmarked words. This lexical adaptation showed right-hemisphere bias and was age-modulated in a near-primary auditory region and was independent of prosody in a secondary auditory region. Word representations in dogs’ auditory cortex thus contain more than just the emotional prosody they are typically associated with. These findings demonstrate multilevel fMRI adaptation effects in the dog brain and are consistent with a hierarchical account of spoken word processing.
42
Brown CL, Hua AY, De Coster L, Sturm VE, Kramer JH, Rosen HJ, Miller BL, Levenson RW. Comparing two facets of emotion perception across multiple neurodegenerative diseases. Soc Cogn Affect Neurosci 2020; 15:511-522. [PMID: 32363385] [PMCID: PMC7328026] [DOI: 10.1093/scan/nsaa060]
Abstract
Deficits in emotion perception (the ability to infer others' emotions accurately) can occur as a result of neurodegeneration. It remains unclear how different neurodegenerative diseases affect different forms of emotion perception. The present study compares performance on a dynamic tracking task of emotion perception (where participants track the changing valence of a film character's emotions) with performance on an emotion category labeling task (where participants label specific emotions portrayed by film characters) across seven diagnostic groups (N = 178) including Alzheimer's disease (AD), behavioral variant frontotemporal dementia (bvFTD), semantic variant primary progressive aphasia (svPPA), non-fluent variant primary progressive aphasia (nfvPPA), progressive supranuclear palsy (PSP), corticobasal syndrome and healthy controls. Consistent with hypotheses, compared to controls, the bvFTD group was impaired on both tasks. The svPPA group was impaired on the emotion labeling task, whereas the nfvPPA, PSP and AD groups were impaired on the dynamic tracking task. Smaller volumes in bilateral frontal and left insular regions were associated with worse labeling, whereas smaller volumes in bilateral medial frontal, temporal and right insular regions were associated with worse tracking. Findings suggest labeling and tracking facets of emotion perception are differentially affected across neurodegenerative diseases due to their unique neuroanatomical correlates.
Affiliation(s)
- Casey L Brown
  - Berkeley Psychophysiology Laboratory, Department of Psychology, University of California, Berkeley, CA 94720-1650, USA
  - Department of Psychiatry, University of California, San Francisco, CA 94115, USA
- Alice Y Hua
  - Berkeley Psychophysiology Laboratory, Department of Psychology, University of California, Berkeley, CA 94720-1650, USA
- Lize De Coster
  - Department of Psychiatry, University of California, San Francisco, CA 94115, USA
- Virginia E Sturm
  - Memory and Aging Center, Department of Neurology, University of California, San Francisco, CA 94115, USA
- Joel H Kramer
  - Memory and Aging Center, Department of Neurology, University of California, San Francisco, CA 94115, USA
- Howard J Rosen
  - Memory and Aging Center, Department of Neurology, University of California, San Francisco, CA 94115, USA
- Bruce L Miller
  - Memory and Aging Center, Department of Neurology, University of California, San Francisco, CA 94115, USA
- Robert W Levenson
  - Berkeley Psychophysiology Laboratory, Department of Psychology, University of California, Berkeley, CA 94720-1650, USA
43
Gruber T, Debracque C, Ceravolo L, Igloi K, Marin Bosch B, Frühholz S, Grandjean D. Human Discrimination and Categorization of Emotions in Voices: A Functional Near-Infrared Spectroscopy (fNIRS) Study. Front Neurosci 2020; 14:570. [PMID: 32581695] [PMCID: PMC7290129] [DOI: 10.3389/fnins.2020.00570]
Abstract
Functional near-infrared spectroscopy (fNIRS) is a neuroimaging tool that has been recently used in a variety of cognitive paradigms. Yet, it remains unclear whether fNIRS is suitable to study complex cognitive processes such as categorization or discrimination. Previously, functional imaging has suggested a role of both inferior frontal cortices in attentive decoding and cognitive evaluation of emotional cues in human vocalizations. Here, we extended paradigms used in functional magnetic resonance imaging (fMRI) to investigate the suitability of fNIRS to study frontal lateralization of human emotion vocalization processing during explicit and implicit categorization and discrimination using mini-blocks and event-related stimuli. Participants heard speech-like but semantically meaningless pseudowords spoken in various tones and evaluated them based on their emotional or linguistic content. Behaviorally, participants were faster to discriminate than to categorize, and processed the linguistic faster than the emotional content of stimuli. Interactions between condition (emotion/word), task (discrimination/categorization) and emotion content (anger, fear, neutral) influenced accuracy and reaction time. At the brain level, we found a modulation of the Oxy-Hb changes in IFG depending on condition, task, emotion and hemisphere (right or left), highlighting the involvement of the right hemisphere in processing fear stimuli, and of both hemispheres in processing anger stimuli. Our results show that fNIRS is suitable to study vocal emotion evaluation, fostering its application to complex cognitive paradigms.
Affiliation(s)
- Thibaud Gruber
  - Neuroscience of Emotion and Affective Dynamics Lab, Department of Psychology and Educational Sciences and Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
  - Cognitive Science Center, University of Neuchâtel, Neuchâtel, Switzerland
- Coralie Debracque
  - Neuroscience of Emotion and Affective Dynamics Lab, Department of Psychology and Educational Sciences and Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
- Leonardo Ceravolo
  - Neuroscience of Emotion and Affective Dynamics Lab, Department of Psychology and Educational Sciences and Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
- Kinga Igloi
  - Department of Neuroscience, Faculty of Medicine, University of Geneva, Geneva, Switzerland
  - Geneva Neuroscience Center, University of Geneva, Geneva, Switzerland
- Blanca Marin Bosch
  - Department of Neuroscience, Faculty of Medicine, University of Geneva, Geneva, Switzerland
  - Geneva Neuroscience Center, University of Geneva, Geneva, Switzerland
- Sascha Frühholz
  - Department of Psychology, University of Zurich, Zurich, Switzerland
  - Neuroscience Center Zurich, University of Zurich and ETH Zürich, Zurich, Switzerland
  - Center for Integrative Human Physiology, University of Zurich, Zurich, Switzerland
- Didier Grandjean
  - Neuroscience of Emotion and Affective Dynamics Lab, Department of Psychology and Educational Sciences and Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
44
Bidet-Ildei C, Decatoire A, Gil S. Recognition of Emotions From Facial Point-Light Displays. Front Psychol 2020; 11:1062. [PMID: 32581934] [PMCID: PMC7287185] [DOI: 10.3389/fpsyg.2020.01062]
Abstract
Facial emotion recognition occupies a prominent place in emotion psychology. How perceivers recognize messages conveyed by faces can be studied in either an explicit or an implicit way, and using different kinds of facial stimuli. In the present study, we explored for the first time how facial point-light displays (PLDs) (i.e., biological motion with minimal perceptual properties) can elicit both explicit and implicit mechanisms of facial emotion recognition. Participants completed tasks of explicit or implicit facial emotion recognition from PLDs. Results showed that point-light stimuli are sufficient to allow facial emotion recognition, be it explicit or implicit. We argue that this finding could encourage the use of PLDs in research on the perception of emotional cues from faces.
Affiliation(s)
- Christel Bidet-Ildei
  - Université de Poitiers, Poitiers, France
  - Université de Tours, Tours, France
  - Centre de Recherches sur la Cognition et l'Apprentissage, UMR 7295, Poitiers, France
  - Centre National de la Recherche Scientifique (CNRS), Paris, France
- Arnaud Decatoire
  - Université de Poitiers, Poitiers, France
  - Centre National de la Recherche Scientifique (CNRS), Paris, France
  - Institut Pprime UPR 3346, Poitiers, France
- Sandrine Gil
  - Université de Poitiers, Poitiers, France
  - Université de Tours, Tours, France
  - Centre de Recherches sur la Cognition et l'Apprentissage, UMR 7295, Poitiers, France
  - Centre National de la Recherche Scientifique (CNRS), Paris, France
45
Young AW, Frühholz S, Schweinberger SR. Face and Voice Perception: Understanding Commonalities and Differences. Trends Cogn Sci 2020; 24:398-410. [DOI: 10.1016/j.tics.2020.02.001]
46
Dricu M, Frühholz S. A neurocognitive model of perceptual decision-making on emotional signals. Hum Brain Mapp 2020; 41:1532-1556. [PMID: 31868310] [PMCID: PMC7267943] [DOI: 10.1002/hbm.24893]
Abstract
Humans make various kinds of decisions about which emotions they perceive in others. Although it might seem like a split-second phenomenon, deliberating over which emotions we perceive unfolds across several stages of decisional processing. Neurocognitive models of general perception postulate that our brain first extracts sensory information about the world, then integrates these data into a percept, and lastly interprets it. The aim of the present study was to build an evidence-based neurocognitive model of perceptual decision-making on others' emotions. We conducted a series of meta-analyses of neuroimaging data spanning 30 years on the explicit evaluation of others' emotional expressions. We find that emotion perception is rather an umbrella term for various perception paradigms, each with distinct neural structures that underlie task-related cognitive demands. Furthermore, the left amygdala was responsive across all classes of decisional paradigms, regardless of task-related demands. Based on these observations, we propose a neurocognitive model that outlines the information flow in the brain needed for a successful evaluation of, and decisions on, other individuals' emotions.
HIGHLIGHTS:
- Emotion classification involves heterogeneous perception and decision-making tasks.
- Decision-making processes on emotions are rarely covered by existing emotion theories.
- We propose an evidence-based neurocognitive model of decision-making on emotions.
- Bilateral brain processes support nonverbal decisions; left-hemisphere processes support verbal decisions.
- The left amygdala is involved in any kind of decision on emotions.
Affiliation(s)
- Mihai Dricu: Department of Psychology, University of Bern, Bern, Switzerland
- Sascha Frühholz: Cognitive and Affective Neuroscience Unit, Department of Psychology, University of Zurich, Zurich, Switzerland; Neuroscience Center Zurich (ZNZ), University of Zurich and ETH Zurich, Zurich, Switzerland; Center for Integrative Human Physiology (ZIHP), University of Zurich, Zurich, Switzerland
47.
Koch SBJ, Galli A, Volman I, Kaldewaij R, Toni I, Roelofs K. Neural Control of Emotional Actions in Response to Affective Vocalizations. J Cogn Neurosci 2020; 32:977-988. [PMID: 31933433] [DOI: 10.1162/jocn_a_01523] [Indexed: 01/02/2023]
Abstract
Social-emotional cues, such as affective vocalizations and emotional faces, automatically elicit emotional action tendencies. Adaptive social-emotional behavior depends on the ability to control these automatic action tendencies. It remains unknown whether neural control over automatic action tendencies is supramodal or relies on parallel modality-specific neural circuits. Here, we address this largely unexplored issue in humans. We consider neural circuits supporting emotional action control in response to affective vocalizations, using an approach-avoidance task known to reliably index control over emotional action tendencies elicited by emotional faces. We isolate supramodal neural contributions to emotional action control through a conjunction analysis of control-related neural activity evoked by auditory and visual affective stimuli, the latter from a previously published data set obtained in an independent sample. We show that the anterior pFC (aPFC) supports control of automatic action tendencies in a supramodal manner, that is, triggered by either emotional faces or affective vocalizations. When affective vocalizations are heard and emotional control is required, the aPFC supports control through negative functional connectivity with the posterior insula. When emotional faces are seen and emotional control is required, control relies on the same aPFC territory downregulating the amygdala. The findings provide evidence for a novel mechanism of emotional action control with a hybrid hierarchical architecture, relying on a supramodal node (aPFC) implementing an abstract goal by modulating modality-specific nodes (posterior insula, amygdala) involved in signaling motivational significance of either affective vocalizations or faces.
Affiliation(s)
- Saskia B J Koch: Donders Institute for Brain, Cognition and Behavior, Radboud University; Behavioral Science Institute, Radboud University
- Alessandra Galli: Donders Institute for Brain, Cognition and Behavior, Radboud University
- Inge Volman: Wellcome Centre for Integrative Neuroimaging, Oxford, UK
- Reinoud Kaldewaij: Donders Institute for Brain, Cognition and Behavior, Radboud University; Behavioral Science Institute, Radboud University
- Ivan Toni: Donders Institute for Brain, Cognition and Behavior, Radboud University
- Karin Roelofs: Donders Institute for Brain, Cognition and Behavior, Radboud University; Behavioral Science Institute, Radboud University
48.
Correia AI, Branco P, Martins M, Reis AM, Martins N, Castro SL, Lima CF. Resting-state connectivity reveals a role for sensorimotor systems in vocal emotional processing in children. Neuroimage 2019; 201:116052. [DOI: 10.1016/j.neuroimage.2019.116052] [Received: 11/15/2018] [Revised: 07/19/2019] [Accepted: 07/23/2019] [Indexed: 11/17/2022]
49.
Schlegel K, Palese T, Mast MS, Rammsayer TH, Hall JA, Murphy NA. A meta-analysis of the relationship between emotion recognition ability and intelligence. Cogn Emot 2019; 34:329-351. [PMID: 31221021] [DOI: 10.1080/02699931.2019.1632801] [Indexed: 10/26/2022]
Abstract
The ability to recognise others' emotions from nonverbal cues (emotion recognition ability, ERA) is measured with performance-based tests and has many positive correlates. Although researchers have long proposed that ERA is related to general mental ability or intelligence, a comprehensive analysis of this relationship is lacking. For instance, it remains unknown whether the magnitude of the association varies by intelligence type, ERA test features, and demographic variables. The present meta-analysis examined the relationship between ERA and intelligence based on 471 effect sizes from 133 samples and found a significant mean effect size (controlled for nesting within samples) of r = .19. Different intelligence types (crystallized, fluid, spatial, memory, information processing speed and efficiency) yielded similar effect sizes, whereas academic achievement measures (e.g. SAT scores) were unrelated to ERA. Effect sizes were higher for ERA tests that simultaneously present facial, vocal, and bodily cues (as compared to tests using static pictures) and for tests with higher reliability and more emotions. Results were unaffected by most study and sample characteristics, but effect size increased with higher mean age of the sample. These findings establish ERA as a sensory-cognitive ability that is distinct from, yet related to, intelligence.
Affiliation(s)
- Katja Schlegel: Institute for Psychology, University of Bern, Bern, Switzerland
- Tristan Palese: Faculty of Business and Economics, University of Lausanne, Lausanne, Switzerland
- Marianne Schmid Mast: Faculty of Business and Economics, University of Lausanne, Lausanne, Switzerland
- Judith A Hall: Department of Psychology, Northeastern University, Boston, MA, USA
- Nora A Murphy: Department of Psychology, Loyola Marymount University, Los Angeles, CA, USA
50.
Ferretti V, Maltese F, Contarini G, Nigro M, Bonavia A, Huang H, Gigliucci V, Morelli G, Scheggia D, Managò F, Castellani G, Lefevre A, Cancedda L, Chini B, Grinevich V, Papaleo F. Oxytocin Signaling in the Central Amygdala Modulates Emotion Discrimination in Mice. Curr Biol 2019; 29:1938-1953.e6. [DOI: 10.1016/j.cub.2019.04.070] [Received: 12/09/2018] [Revised: 04/11/2019] [Accepted: 04/26/2019] [Indexed: 11/29/2022]