1
Hagen J, Ramkiran S, Schnellbächer GJ, Rajkumar R, Collee M, Khudeish N, Veselinović T, Shah NJ, Neuner I. Phenomena of hypo- and hyperconnectivity in basal ganglia-thalamo-cortical circuits linked to major depression: a 7T fMRI study. Mol Psychiatry 2024. [PMID: 39020104] [DOI: 10.1038/s41380-024-02669-4]
Abstract
Major depressive disorder (MDD) typically manifests as depressed affect, anhedonia, low energy, and additional symptoms. Despite its high global prevalence, its pathophysiology remains incompletely understood. Current research places alterations in functional connectivity among MDD's most promising biomarkers. However, given the heterogeneity of previous findings, higher-resolution imaging techniques, such as ultra-high field (UHF) fMRI (≥7 Tesla, 7T), may offer greater specificity in delineating fundamental impairments. In this study, 7T UHF fMRI scans were conducted on 31 MDD patients and 27 age- and gender-matched healthy controls to contrast cerebral resting-state functional connectivity patterns between the groups in an exploratory analysis. The CONN toolbox was used to perform a functional network connectivity (FNC) analysis based on region of interest (ROI)-to-ROI correlations, enabling the identification of clusters of significantly different connections. Correction for multiple comparisons was implemented at the cluster level using the false discovery rate (FDR). The analysis revealed three significant clusters differentiating MDD patients from healthy controls. In Clusters 1 and 2, MDD patients exhibited between-network hypoconnectivity in basal ganglia-cortical pathways as well as hyperconnectivity in thalamo-cortical pathways, including several individual ROI-to-ROI connections. In Cluster 3, they showed increased occipital interhemispheric within-network connectivity. These findings suggest that alterations in basal ganglia-thalamo-cortical circuits play a substantial role in the pathophysiology of MDD. Furthermore, they indicate potential MDD-related deficits involving a combination of perception (vision, audition, and somatosensation) and more complex functions, especially social-emotional processing, modulation, and regulation. It is anticipated that these findings may inform more accurate clinical procedures for addressing MDD.
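At its core, the ROI-to-ROI FNC analysis described in this abstract amounts to correlating ROI time series, Fisher z-transforming the coefficients, and correcting the resulting family of tests with the false discovery rate. A minimal NumPy sketch of those two steps (function names are illustrative, not the CONN toolbox API):

```python
import numpy as np

def roi_to_roi_fc(timeseries):
    """Pearson correlations between every pair of ROI time series.

    timeseries: array of shape (n_timepoints, n_rois).
    Returns a Fisher z-transformed connectivity matrix (n_rois x n_rois).
    """
    r = np.corrcoef(timeseries.T)   # ROI-by-ROI correlation matrix
    np.fill_diagonal(r, 0.0)        # drop self-connections before z-transform
    return np.arctanh(r)            # Fisher z, suitable for group statistics

def fdr_bh(pvals, alpha=0.05):
    """Benjamini-Hochberg FDR: boolean mask of tests surviving correction."""
    p = np.asarray(pvals, dtype=float)
    order = np.argsort(p)
    adjusted = p[order] * len(p) / (np.arange(len(p)) + 1)
    # enforce monotonicity from the largest p downwards, then threshold
    passed = np.minimum.accumulate(adjusted[::-1])[::-1] <= alpha
    mask = np.zeros(len(p), dtype=bool)
    mask[order] = passed
    return mask
```

The study applied the correction at the cluster level over ROI-to-ROI connections; the sketch shows only the element-wise building blocks.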
Affiliation(s)
- Jana Hagen
- Department of Psychiatry, Psychotherapy and Psychosomatics, Uniklinik RWTH Aachen, Aachen, Germany
- Institute of Neuroscience and Medicine - 4, Forschungszentrum Jülich, Jülich, Germany
- Shukti Ramkiran
- Department of Psychiatry, Psychotherapy and Psychosomatics, Uniklinik RWTH Aachen, Aachen, Germany
- Institute of Neuroscience and Medicine - 4, Forschungszentrum Jülich, Jülich, Germany
- Gereon J Schnellbächer
- Department of Psychiatry, Psychotherapy and Psychosomatics, Uniklinik RWTH Aachen, Aachen, Germany
- Institute of Neuroscience and Medicine - 4, Forschungszentrum Jülich, Jülich, Germany
- Ravichandran Rajkumar
- Department of Psychiatry, Psychotherapy and Psychosomatics, Uniklinik RWTH Aachen, Aachen, Germany
- Institute of Neuroscience and Medicine - 4, Forschungszentrum Jülich, Jülich, Germany
- Maria Collee
- Department of Psychiatry, Psychotherapy and Psychosomatics, Uniklinik RWTH Aachen, Aachen, Germany
- Nibal Khudeish
- Department of Psychiatry, Psychotherapy and Psychosomatics, Uniklinik RWTH Aachen, Aachen, Germany
- Institute of Neuroscience and Medicine - 4, Forschungszentrum Jülich, Jülich, Germany
- Tanja Veselinović
- Department of Psychiatry, Psychotherapy and Psychosomatics, Uniklinik RWTH Aachen, Aachen, Germany
- Institute of Neuroscience and Medicine - 4, Forschungszentrum Jülich, Jülich, Germany
- N Jon Shah
- Institute of Neuroscience and Medicine - 4, Forschungszentrum Jülich, Jülich, Germany
- Department of Neurology, Uniklinik RWTH Aachen, Aachen, Germany
- Institute of Neuroscience and Medicine - 11, Forschungszentrum Jülich, Jülich, Germany
- Irene Neuner
- Department of Psychiatry, Psychotherapy and Psychosomatics, Uniklinik RWTH Aachen, Aachen, Germany
- Institute of Neuroscience and Medicine - 4, Forschungszentrum Jülich, Jülich, Germany
2
Saragosa-Harris NM, Guassi Moreira JF, Waizman Y, Sedykin A, Peris TS, Silvers JA. Early life adversity is associated with greater similarity in neural representations of ambiguous and threatening stimuli. Dev Psychopathol 2024:1-13. [PMID: 38602091] [DOI: 10.1017/s0954579424000683]
Abstract
Exposure to early life adversity (ELA) is hypothesized to sensitize threat-responsive neural circuitry. This may lead individuals to overestimate threat in the face of ambiguity, a cognitive-behavioral phenotype linked to poor mental health. The tendency to process ambiguity as threatening may stem from difficulty distinguishing between ambiguous and threatening stimuli. However, it is unknown how exposure to ELA relates to neural representations of ambiguous and threatening stimuli, or how processing of ambiguity following ELA relates to psychosocial functioning. The current fMRI study examined multivariate representations of threatening and ambiguous social cues in 41 emerging adults (aged 18 to 19 years). Using representational similarity analysis, we assessed neural representations of ambiguous and threatening images within affective neural circuitry and tested whether similarity in these representations varied by ELA exposure. Greater exposure to ELA was associated with greater similarity in neural representations of ambiguous and threatening images. Moreover, individual differences in processing ambiguity related to global functioning, an association that varied as a function of ELA. By evidencing reduced neural differentiation between ambiguous and threatening cues in ELA-exposed emerging adults and linking behavioral responses to ambiguity to psychosocial wellbeing, these findings have important implications for future intervention work in at-risk, ELA-exposed populations.
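The core representational similarity computation described here, how alike the voxel patterns evoked by ambiguous and threatening images are per subject, and whether that similarity scales with ELA, can be sketched as follows. This is a simplified illustration; the variable names and the use of Pearson correlation as the similarity metric are assumptions, not the authors' exact pipeline:

```python
import numpy as np

def pattern_similarity(pattern_a, pattern_b):
    """Pearson correlation between two voxel activation patterns (1-D arrays)."""
    return np.corrcoef(pattern_a, pattern_b)[0, 1]

def similarity_by_ela(ambiguous, threatening, ela_scores):
    """Per-subject similarity of ambiguous- vs. threat-evoked patterns,
    related to early-life-adversity (ELA) scores.

    ambiguous, threatening: (n_subjects, n_voxels) arrays of voxel patterns.
    ela_scores: (n_subjects,) vector of adversity scores.
    Returns (per-subject similarities, correlation of similarity with ELA).
    """
    sims = np.array([pattern_similarity(a, t)
                     for a, t in zip(ambiguous, threatening)])
    return sims, np.corrcoef(sims, ela_scores)[0, 1]
```

A positive similarity-ELA correlation would correspond to the study's finding of reduced neural differentiation between ambiguous and threatening cues in more ELA-exposed individuals.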
Affiliation(s)
- João F Guassi Moreira
- Department of Psychology, University of California Los Angeles, Los Angeles, CA, USA
- Yael Waizman
- Department of Psychology, University of California Los Angeles, Los Angeles, CA, USA
- Anna Sedykin
- Jane and Terry Semel Institute for Neuroscience and Human Behavior, University of California Los Angeles, Los Angeles, CA, USA
- Tara S Peris
- Jane and Terry Semel Institute for Neuroscience and Human Behavior, University of California Los Angeles, Los Angeles, CA, USA
- Jennifer A Silvers
- Department of Psychology, University of California Los Angeles, Los Angeles, CA, USA
3
Walsh E, Whitby J, Chen YY, Longo MR. No influence of emotional expression on size underestimation of upright faces. PLoS One 2024; 19:e0293920. [PMID: 38300951] [PMCID: PMC10833517] [DOI: 10.1371/journal.pone.0293920]
Abstract
Faces are a primary means of conveying social information between humans. One important factor modulating the perception of human faces is emotional expression. Face inversion also affects perception, including judgments of emotional expression, possibly through the disruption of configural processing. One intriguing inversion effect is an illusion whereby faces appear physically smaller when upright than when inverted. This illusion appears to be highly selective for faces. In this study, we investigated whether the emotional expression of a face (neutral, happy, afraid, and angry) modulates the magnitude of this size illusion. Results showed that for all four expressions, there was a clear bias for inverted stimuli to be judged as larger than upright ones. This demonstrates that emotional expression has no influence on the size underestimation of upright faces, a surprising result given that recognition of different emotional expressions is known to be affected unevenly by inversion. Results are discussed in light of recent neuroimaging research that used population receptive field (pRF) mapping to investigate the neural mechanisms underlying face perception, which may help explain how an upright face can appear smaller than an inverted one. Elucidating this effect would lead to a greater understanding of how humans communicate.
Affiliation(s)
- Eamonn Walsh
- Department of Basic & Clinical Neuroscience, Institute of Psychiatry, Psychology & Neuroscience, King’s College London, London, United Kingdom
- Cultural and Social Neuroscience Research Group, Institute of Psychiatry, Psychology & Neuroscience, King’s College London, London, United Kingdom
- Jack Whitby
- Department of Basic & Clinical Neuroscience, Institute of Psychiatry, Psychology & Neuroscience, King’s College London, London, United Kingdom
- Yen-Ya Chen
- Department of Basic & Clinical Neuroscience, Institute of Psychiatry, Psychology & Neuroscience, King’s College London, London, United Kingdom
- Matthew R. Longo
- Department of Psychological Sciences, Birkbeck, University of London, London, United Kingdom
4
Pitcher D, Sliwinska MW, Kaiser D. TMS disruption of the lateral prefrontal cortex increases neural activity in the default mode network when naming facial expressions. Soc Cogn Affect Neurosci 2023; 18:nsad072. [PMID: 38048419] [PMCID: PMC10695328] [DOI: 10.1093/scan/nsad072]
Abstract
Recognizing facial expressions depends on multiple brain networks specialized for different cognitive functions. In the current study, participants (N = 20) were scanned using functional magnetic resonance imaging (fMRI) while they performed a covert facial expression naming task. Immediately prior to scanning, theta-burst transcranial magnetic stimulation (TMS) was delivered over either the right lateral prefrontal cortex (PFC) or a vertex control site. A group whole-brain analysis revealed that TMS induced opposite effects on the neural responses across different brain networks. Stimulation of the right PFC (compared to stimulation of the vertex) decreased neural activity in the left lateral PFC but increased neural activity in three nodes of the default mode network (DMN): the right superior frontal gyrus, the right angular gyrus, and the bilateral middle cingulate gyrus. A region of interest analysis showed that TMS delivered over the right PFC reduced neural activity across all functionally localised face areas (including in the PFC) compared to TMS delivered over the vertex. These results suggest that visually recognizing facial expressions depends on the dynamic interaction of the face-processing network and the DMN. Our study also demonstrates the utility of combined TMS/fMRI studies for revealing the dynamic interactions between different functional brain networks.
Affiliation(s)
- David Pitcher
- Department of Psychology, University of York, Heslington, York YO105DD, UK
- Daniel Kaiser
- Mathematical Institute, Department of Mathematics and Computer Science, Physics, Geography, Justus-Liebig-Universität Gießen, Gießen 35392, Germany
- Center for Mind, Brain and Behaviour, Philipps-Universität Marburg, and Justus-Liebig-Universität Gießen, Marburg 35032, Germany
5
Lee Y, Seo Y, Lee Y, Lee D. Dimensional emotions are represented by distinct topographical brain networks. Int J Clin Health Psychol 2023; 23:100408. [PMID: 37663040] [PMCID: PMC10472247] [DOI: 10.1016/j.ijchp.2023.100408]
Abstract
The ability to recognize others' facial emotions has become increasingly important since the COVID-19 pandemic, which created stressful conditions for emotion regulation. Considering the importance of emotion in maintaining a social life, the emotion knowledge needed to perceive and label the emotions of oneself and others requires an understanding of affective dimensions, such as emotional valence and emotional arousal. However, limited information is available about whether the behavioral representation of affective dimensions resembles their neural representation. To explore the relationship between brain and behavior in the representational geometries of affective dimensions, we constructed a behavioral paradigm in which emotional faces were categorized into geometric spaces along the valence dimension, the arousal dimension, and the combined valence-arousal dimensions. We then compared these representations to neural representations of the same faces acquired by functional magnetic resonance imaging. We found that affective dimensions were similarly represented in behavior and in the brain. Specifically, behavioral and neural representations of valence were less similar to those of arousal. We also found that valence was represented in the dorsolateral prefrontal cortex, frontal eye fields, precuneus, and early visual cortex, whereas arousal was represented in the cingulate gyrus, middle frontal gyrus, orbitofrontal cortex, fusiform gyrus, and early visual cortex. In conclusion, the current study suggests that dimensional emotions are represented similarly in behavior and in the brain, with distinct topographical organizations in the brain.
Affiliation(s)
- Youngju Lee
- Cognitive Science Research Group, Korea Brain Research Institute, 61 Cheomdan-ro, Dong-gu, Daegu 41062, Republic of Korea
- Dongha Lee
- Cognitive Science Research Group, Korea Brain Research Institute, 61 Cheomdan-ro, Dong-gu, Daegu 41062, Republic of Korea
6
Kan S, Fujita N, Shibata M, Miki K, Yukioka M, Senba E. Three weeks of exercise therapy altered brain functional connectivity in fibromyalgia inpatients. Neurobiol Pain 2023; 14:100132. [PMID: 38099286] [PMCID: PMC10719530] [DOI: 10.1016/j.ynpai.2023.100132]
Abstract
Background: Fibromyalgia (FM) is a chronic pain syndrome characterized by widespread pain, tenderness, and fatigue. No effective medication is currently available for FM, and patients' activities of daily living and quality of life are markedly impaired, so new therapeutic approaches are needed. Recently, exercise therapy has attracted much attention as a promising treatment for FM. However, the underlying mechanisms are not fully understood, particularly in the central nervous system, including the brain. We therefore investigated functional connectivity (FC) changes and their relationship with clinical improvement in patients with FM after exercise therapy, using resting-state fMRI (rs-fMRI) and FC analysis.
Methods: Seventeen patients with FM participated in this study. They underwent 3 weeks of exercise therapy on an inpatient basis and a 5-min rs-fMRI scan before and after the therapy. We compared the FC strength of sensorimotor regions and the mesocortico-limbic system between the two scans. We also performed a multiple regression analysis to examine the relationship between pre-post differences in FC strength and improvement in patients' clinical symptoms or motor abilities.
Results: Patients with FM showed significant improvement in clinical symptoms and motor abilities. They also showed a significant pre-post difference in FC of the anterior cingulate cortex and a significant correlation between pre-post FC changes and improvement in clinical symptoms and motor abilities. Whereas sensorimotor regions tended to be related to improvement in general disease severity and depression, brain regions belonging to the mesocortico-limbic system tended to be related to improvement in motor abilities.
Conclusion: Three weeks of exercise therapy ameliorated clinical symptoms and motor abilities in patients with FM and led to FC changes in sensorimotor regions and in brain regions belonging to the mesocortico-limbic system. These changes were related to the improvement in clinical symptoms and motor abilities. Our findings suggest that, as predicted by previous animal studies, spontaneous brain activity modified by exercise therapy, including in the mesocortico-limbic system, underlies clinical improvement in patients with FM.
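The pre-post FC comparison described in the Methods reduces to a paired test on Fisher z-transformed correlation values for each connection. A minimal sketch with SciPy (illustrative only; the study's full analysis spans many connections with appropriate correction for multiple comparisons):

```python
import numpy as np
from scipy import stats

def pre_post_fc_test(fc_pre, fc_post):
    """Paired t-test on Fisher z-transformed connectivity values.

    fc_pre, fc_post: (n_patients,) correlation values for one connection,
    measured before and after the exercise therapy.
    Returns (t statistic, two-tailed p value).
    """
    z_pre, z_post = np.arctanh(fc_pre), np.arctanh(fc_post)  # Fisher z
    return stats.ttest_rel(z_post, z_pre)                    # paired test
```

The Fisher transform makes correlation values approximately normally distributed, which is why it precedes the parametric test.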
Affiliation(s)
- Shigeyuki Kan
- Department of Psychiatry and Neurosciences, Graduate School of Biomedical and Health Sciences, Hiroshima University, 1-2-3 Kasumi, Minami-ku, Hiroshima Hiroshima 734-8551, Japan
- Department of Anesthesiology and Intensive Care Medicine, Osaka University Graduate School of Medicine, 2-2 Yamadaoka, Suita, Osaka 565-0871, Japan
- Nobuko Fujita
- Department of Rehabilitation, Faculty of Health Sciences, Naragakuen University, 3-15-1 Nakatomigaoka, Nara, Nara 631-8524, Japan
- Masahiko Shibata
- Department of Rehabilitation, Faculty of Health Sciences, Naragakuen University, 3-15-1 Nakatomigaoka, Nara, Nara 631-8524, Japan
- Kenji Miki
- Hayaishi Hospital, 2-75 Fudegasakicho, Tennoji-ku, Osaka, Osaka 543-0027, Japan
- Department of Physical Therapy, Osaka Yukioka College of Health Science, 1-1-41 Sojiji, Ibaraki, Osaka 567-0801, Japan
- Masao Yukioka
- Department of Rheumatology, Yukioka Hospital, 2-2-3 Ukita, Kita-ku, Osaka 530-0021, Japan
- Emiko Senba
- Department of Physical Therapy, Osaka Yukioka College of Health Science, 1-1-41 Sojiji, Ibaraki, Osaka 567-0801, Japan
7
Santavirta S, Karjalainen T, Nazari-Farsani S, Hudson M, Putkinen V, Seppälä K, Sun L, Glerean E, Hirvonen J, Karlsson HK, Nummenmaa L. Functional organization of social perception in the human brain. Neuroimage 2023; 272:120025. [PMID: 36958619] [PMCID: PMC10112277] [DOI: 10.1016/j.neuroimage.2023.120025]
Abstract
Humans rapidly extract diverse and complex information from ongoing social interactions, but the perceptual and neural organization of the different aspects of social perception remains unresolved. We showed short movie clips with rich social content to 97 healthy participants while their haemodynamic brain activity was measured with fMRI. The clips were annotated moment-to-moment for a large set of social features, 45 of which were rated reliably across annotators. Cluster analysis of the social features revealed that 13 dimensions were sufficient for describing the social perceptual space. Three different analysis methods were used to map social perceptual processes in the human brain. Regression analysis mapped regional neural response profiles for the different social dimensions, multivariate pattern analysis then established the spatial specificity of the responses, and intersubject correlation analysis connected social perceptual processing with neural synchronization. The results revealed a gradient in the processing of social information in the brain. Posterior temporal and occipital regions were broadly tuned to most social dimensions, and the classifier revealed that these responses showed spatial specificity for social dimensions; in contrast, the Heschl's gyri and parietal areas were also broadly associated with different social signals, yet their spatial response patterns did not differentiate between social dimensions. Frontal and subcortical regions responded only to a limited number of social dimensions, and their spatial response patterns likewise did not differentiate between social dimensions. Altogether, these results highlight the distributed nature of social processing in the brain.
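Leave-one-out intersubject correlation, one of the three methods named above, correlates each participant's regional time course with the average time course of all remaining participants. A minimal sketch (the function name and leave-one-out variant are assumptions about the general technique, not the authors' exact implementation):

```python
import numpy as np

def intersubject_correlation(timecourses):
    """Leave-one-out intersubject correlation (ISC) for one brain region.

    timecourses: (n_subjects, n_timepoints) array of a region's responses
    to the same naturalistic stimulus. Each subject's time course is
    correlated with the mean time course of all the other subjects.
    Returns one ISC value per subject.
    """
    n = timecourses.shape[0]
    total = timecourses.sum(axis=0)
    iscs = []
    for i in range(n):
        others_mean = (total - timecourses[i]) / (n - 1)  # leave subject i out
        iscs.append(np.corrcoef(timecourses[i], others_mean)[0, 1])
    return np.array(iscs)
```

High ISC in a region indicates that the stimulus drives synchronized responses across viewers, which is how the study links social content to neural synchronization.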
Affiliation(s)
- Severi Santavirta
- Turku PET Centre, University of Turku, Turku, Finland; Turku PET Centre, Turku University Hospital, Turku, Finland.
- Tomi Karjalainen
- Turku PET Centre, University of Turku, Turku, Finland; Turku PET Centre, Turku University Hospital, Turku, Finland
- Sanaz Nazari-Farsani
- Turku PET Centre, University of Turku, Turku, Finland; Turku PET Centre, Turku University Hospital, Turku, Finland
- Matthew Hudson
- Turku PET Centre, University of Turku, Turku, Finland; Turku PET Centre, Turku University Hospital, Turku, Finland; School of Psychology, University of Plymouth, Plymouth, United Kingdom
- Vesa Putkinen
- Turku PET Centre, University of Turku, Turku, Finland; Turku PET Centre, Turku University Hospital, Turku, Finland
- Kerttu Seppälä
- Turku PET Centre, University of Turku, Turku, Finland; Turku PET Centre, Turku University Hospital, Turku, Finland; Department of Medical Physics, Turku University Hospital, Turku, Finland
- Lihua Sun
- Turku PET Centre, University of Turku, Turku, Finland; Turku PET Centre, Turku University Hospital, Turku, Finland; Department of Nuclear Medicine, Pudong Hospital, Fudan University, Shanghai, China
- Enrico Glerean
- Department of Neuroscience and Biomedical Engineering, Aalto University School of Science, Espoo, Finland
- Jussi Hirvonen
- Department of Radiology, University of Turku and Turku University Hospital, Turku, Finland; Medical Imaging Center, Department of Radiology, Tampere University and Tampere University Hospital, Tampere, Finland
- Henry K Karlsson
- Turku PET Centre, University of Turku, Turku, Finland; Turku PET Centre, Turku University Hospital, Turku, Finland
- Lauri Nummenmaa
- Turku PET Centre, University of Turku, Turku, Finland; Turku PET Centre, Turku University Hospital, Turku, Finland; Department of Psychology, University of Turku, Turku, Finland
8
Dissociation and hierarchy of human visual pathways for simultaneously coding facial identity and expression. Neuroimage 2022; 264:119769. [PMID: 36435341] [DOI: 10.1016/j.neuroimage.2022.119769]
Abstract
Humans have an extraordinary ability to recognize facial expression and identity from a single face simultaneously and effortlessly; however, the underlying neural computation is not well understood. Here, we optimized a multi-task deep neural network to classify facial expression and identity simultaneously. Across various optimization training strategies, the best-performing model consistently showed a 'share-separate' organization. The two separate branches of the best-performing model also exhibited distinct abilities to categorize facial expression and identity, and these abilities increased along the expression or identity branch toward the higher layers. Comparing the representational similarities between the best-performing model and functional magnetic resonance imaging (fMRI) responses in the human visual cortex to the same face stimuli showed that the face-selective posterior superior temporal sulcus (pSTS) in the dorsal visual cortex was significantly correlated with layers in the expression branch of the model, whereas the anterior inferotemporal cortex (aIT) and anterior fusiform face area (aFFA) in the ventral visual cortex were significantly correlated with layers in the identity branch. Moreover, the aFFA and aIT better matched the high layers of the model, while the posterior FFA (pFFA) and occipital face area (OFA) better matched the middle and early layers, respectively. Overall, our study provides a task-optimized computational model for understanding the neural mechanisms underlying face recognition and suggests that, like the best-performing model, the human visual system exhibits both dissociated and hierarchical neuroanatomical organization when simultaneously coding facial identity and expression.
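Comparing representational similarities between model layers and brain regions, as described above, is typically done by correlating representational dissimilarity matrices (RDMs). A minimal sketch of that comparison; the dissimilarity metric (1 − Pearson r within each system, Spearman between systems) is a common convention assumed here, not necessarily the paper's exact choice:

```python
import numpy as np
from scipy.stats import spearmanr

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - Pearson r between the
    response patterns of all stimulus pairs.

    patterns: (n_stimuli, n_features) array, e.g. layer activations for a
    model or voxel responses for a brain region, one row per face stimulus.
    """
    return 1.0 - np.corrcoef(patterns)

def rdm_similarity(rdm_a, rdm_b):
    """Spearman correlation between the upper triangles of two RDMs --
    the usual way to compare model-layer and brain-region geometry."""
    iu = np.triu_indices_from(rdm_a, k=1)   # unique off-diagonal pairs
    return spearmanr(rdm_a[iu], rdm_b[iu])[0]
```

Computing `rdm_similarity` between each model layer's RDM and each face-selective region's RDM yields the layer-by-region correspondence profile the abstract describes.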
9
Association between fractional amplitude of low-frequency fluctuation (fALFF) and facial emotion recognition ability in first-episode schizophrenia patients: a fMRI study. Sci Rep 2022; 12:19561. [PMID: 36380188] [PMCID: PMC9666540] [DOI: 10.1038/s41598-022-24258-7]
Abstract
The correlation between resting-state intrinsic activity in brain regions and facial emotion recognition (FER) ability in patients with first-episode schizophrenia (FSZ) has remained unclear. Our aim was to analyse the correlation between the fractional amplitude of low-frequency fluctuation (fALFF) and FER ability in FSZ patients. A total of 28 patients with FSZ and 33 healthy controls (HCs) completed visual search tasks assessing FER ability. Regions of interest (ROIs) related to facial emotion were obtained from a previous meta-analysis. Pearson correlation analysis was performed to examine the correlation between fALFF and FER ability. Our results indicated that the patients performed worse than the HCs in the accuracy of happy and fearful FER. The previous meta-analysis showed that the brain regions related to FER included the bilateral amygdala (AMY)/hippocampus (HIP), right fusiform gyrus (FFG), and right supplementary motor area (SMA). Partial correlation analysis showed that the fALFF of the right FFG was associated with high-load fearful FER accuracy (r = -0.60, p = 0.004). Our study indicates that FER ability is correlated with resting-state intrinsic activity in brain regions related to facial emotion, which may provide a reference for the study of FER deficits in schizophrenia.
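The partial correlation reported above (fALFF vs. fearful-FER accuracy while controlling for covariates) can be computed by residualizing both variables against the covariates and correlating the residuals. A NumPy sketch; the covariates here are placeholders, since the abstract does not specify which were controlled for:

```python
import numpy as np

def partial_corr(x, y, covars):
    """Partial Pearson correlation between x and y, controlling for covars.

    Regresses the covariates (plus an intercept) out of both variables by
    ordinary least squares and correlates the residuals.
    x, y: (n,) arrays; covars: (n, k) array of nuisance variables.
    """
    design = np.column_stack([np.ones(len(x)), covars])

    def residualize(v):
        beta, *_ = np.linalg.lstsq(design, v, rcond=None)
        return v - design @ beta

    rx = residualize(np.asarray(x, dtype=float))
    ry = residualize(np.asarray(y, dtype=float))
    return np.corrcoef(rx, ry)[0, 1]
```

With `x` as right-FFG fALFF, `y` as high-load fearful FER accuracy, and `covars` as the nuisance variables, this yields the kind of r value the study reports.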
10
Padilla-Coreano N, Tye KM, Zelikowsky M. Dynamic influences on the neural encoding of social valence. Nat Rev Neurosci 2022; 23:535-550. [PMID: 35831442] [PMCID: PMC9997616] [DOI: 10.1038/s41583-022-00609-1]
Abstract
Social signals can serve as potent emotional triggers with powerful impacts on processes from cognition to valence processing. How are social signals dynamically and flexibly associated with positive or negative valence? How do our past social experiences and present social standing shape our motivation to seek or avoid social contact? We discuss a model in which social attributes, social history, social memory, social rank and social isolation can flexibly influence valence assignment to social stimuli, termed here as 'social valence'. We emphasize how the brain encodes each of these four factors and highlight the neural circuits and mechanisms that play a part in the perception of social attributes, social memory and social rank, as well as how these factors affect valence systems associated with social stimuli. We highlight the impact of social isolation, dissecting the neural and behavioural mechanisms that mediate the effects of acute versus prolonged periods of social isolation. Importantly, we discuss conceptual models that may account for the potential shift in valence of social stimuli from positive to negative as the period of isolation extends in time. Collectively, this Review identifies factors that control the formation and attribution of social valence - integrating diverse areas of research and emphasizing their unique contributions to the categorization of social stimuli as positive or negative.
Affiliation(s)
- Nancy Padilla-Coreano
- Department of Neuroscience, College of Medicine, University of Florida, Gainesville, FL, USA
- Kay M Tye
- HHMI-Salk Institute for Biological Studies, La Jolla, CA, USA.
- Moriel Zelikowsky
- Department of Neurobiology, School of Medicine, University of Utah, Salt Lake City, UT, USA
11
Tanaka T, Okamoto N, Kida I, Haruno M. The initial decrease in 7T-BOLD signals detected by hyperalignment contains information to decode facial expressions. Neuroimage 2022; 262:119537. [DOI: 10.1016/j.neuroimage.2022.119537]
12
Izumika R, Cabeza R, Tsukiura T. Neural Mechanisms of Perceiving and Subsequently Recollecting Emotional Facial Expressions in Young and Older Adults. J Cogn Neurosci 2022; 34:1183-1204. [PMID: 35468212] [DOI: 10.1162/jocn_a_01851]
Abstract
It is known that emotional facial expressions modulate the perception and subsequent recollection of faces and that aging alters these modulatory effects. Yet, the underlying neural mechanisms are not well understood, and they were the focus of the current fMRI study. We scanned healthy young and older adults while perceiving happy, neutral, or angry faces paired with names. Participants were then provided with the names of the faces and asked to recall the facial expression of each face. fMRI analyses focused on the fusiform face area (FFA), the posterior superior temporal sulcus (pSTS), the OFC, the amygdala, and the hippocampus (HC). Univariate activity, multivariate pattern (MVPA), and functional connectivity analyses were performed. The study yielded two main sets of findings. First, in pSTS and the amygdala, univariate activity and MVPA discrimination during the processing of facial expressions were similar in young and older adults, whereas in FFA and OFC, MVPA discriminated facial expressions less accurately in older than young adults. These findings suggest that facial expression representations in FFA and OFC reflect age-related dedifferentiation and positivity effect. Second, HC-OFC connectivity showed subsequent memory effects (SMEs) for happy expressions in both age groups, HC-FFA connectivity exhibited SMEs for happy and neutral expressions in young adults, and HC-pSTS interactions displayed SMEs for happy expressions in older adults. These results could be related to compensatory mechanisms and positivity effects in older adults. Taken together, the results clarify the effects of aging on the neural mechanisms in perceiving and encoding facial expressions.
13
Axelrod V, Rozier C, Malkinson TS, Lehongre K, Adam C, Lambrecq V, Navarro V, Naccache L. Face-selective multi-unit activity in the proximity of the FFA modulated by facial expression stimuli. Neuropsychologia 2022; 170:108228. [DOI: 10.1016/j.neuropsychologia.2022.108228]
14
Medial prefrontal and occipito-temporal activity at encoding determines enhanced recognition of threatening faces after 1.5 years. Brain Struct Funct 2022; 227:1655-1672. [PMID: 35174416] [DOI: 10.1007/s00429-022-02462-5]
Abstract
Studies have demonstrated that faces with threatening emotional expressions are better remembered than non-threatening faces. However, whether this memory advantage persists over years, and which neural systems underlie such an effect, remains unknown. Here, we employed an individual-differences approach to examine whether neural activity during incidental encoding was associated with differential recognition of faces with emotional expressions (angry, fearful, happy, sad and neutral) after a retention interval of > 1.5 years (N = 89). Behaviorally, we found better recognition for threatening (angry, fearful) versus non-threatening (happy and neutral) faces after a delay of > 1.5 years, driven by the forgetting of non-threatening faces relative to immediate recognition after encoding. Multivariate principal component analysis (PCA) on the behavioral responses further confirmed the discriminative recognition performance between threatening and non-threatening faces. A voxel-wise whole-brain analysis of the concomitantly acquired functional magnetic resonance imaging (fMRI) data during incidental encoding revealed that neural activity in the bilateral inferior occipital gyrus (IOG) and ventromedial prefrontal/orbitofrontal cortex (vmPFC/OFC) was associated with individual differences in discriminative emotional face recognition performance, measured with an innovative behavioral pattern similarity analysis (BPSA). The left fusiform face area (FFA) was additionally identified using a regionally focused analysis. Overall, the present study provides evidence that threatening facial expressions lead to persistent face recognition over periods of > 1.5 years, and that differential encoding-related activity in the medial prefrontal cortex and occipito-temporal cortex may underlie this effect.
15
de Gelder B, Huis in ‘t Veldt E, Zhan M, Van den Stock J. Acquired Prosopagnosia with Structurally Intact and Functional Fusiform Face Area and with Face Identity-Specific Configuration Processing Deficits. Cereb Cortex 2022; 32:4671-4683. [DOI: 10.1093/cercor/bhab509]
Abstract
Prosopagnosia, the loss of face perception and recognition, is still poorly understood, and rare single cases of acquired prosopagnosia can provide a unique window onto the behavioural and brain basis of normal face perception. The present study of a new case of acquired prosopagnosia with bilateral occipito-temporal lesions but a structurally intact FFA and OFA investigated whether the lesions overlapped with the face network and whether the structurally intact FFA showed a face-selective response. We also investigated the behavioural correlates of the neural findings and assessed configural processing in the context of facial and non-facial identity recognition, expression recognition and memory, also focusing on the face-selectivity of each specific deficit. The findings reveal a face-selective response in the FFA, despite lesions in the face perception network. At the behavioural level, the results showed impaired configural processing for facial identity, but not for other stimulus categories and not for facial expression recognition. These findings challenge a critical role of the FFA in face identity processing and support a domain-specific account of configural processing.
Affiliation(s)
- Beatrice de Gelder
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht 6229 EV, The Netherlands
- Department of Computer Science, University College London, London WC1E 6BT, UK
- Elizabeth Huis in ‘t Veldt
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht 6229 EV, The Netherlands
- Department of Medical and Clinical Psychology, Tilburg University, 5037 AB Tilburg, The Netherlands
- Minye Zhan
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht 6229 EV, The Netherlands
- Jan Van den Stock
- Department of Neurosciences, Neuropsychiatry, Leuven Brain Institute, KU Leuven, 3000 Leuven, Belgium
- Geriatric Psychiatry, University Psychiatric Center, KU Leuven, 3000 Leuven, Belgium

16
Resting state functional brain networks associated with emotion processing in frontotemporal lobar degeneration. Mol Psychiatry 2022; 27:4809-4821. [PMID: 35595978] [PMCID: PMC9734056] [DOI: 10.1038/s41380-022-01612-9]
Abstract
This study investigated the relationship between emotion processing and resting-state functional connectivity (rs-FC) of brain networks in frontotemporal lobar degeneration (FTLD). Eighty FTLD patients (including cases with the behavioral variant of frontotemporal dementia, primary progressive aphasia, progressive supranuclear palsy syndrome, and motor neuron disease) and 65 healthy controls underwent resting-state functional MRI. Emotion processing was tested using the Comprehensive Affect Testing System (CATS). In patients and controls, correlations were investigated between each emotion construct and rs-FC changes within critical networks. Mean rs-FC of the clusters significantly associated with CATS scores was compared among FTLD groups. FTLD patients had pathological CATS scores compared with controls. In controls, increased rs-FC of the cerebellar and visuo-associative networks correlated with better scores on emotion-matching and discrimination tasks, respectively, while decreased rs-FC of the visuo-spatial network was related to better performance in affect matching and naming. In FTLD, the associations between rs-FC and CATS scores involved more brain regions, such as the orbitofrontal and middle frontal gyri within anterior networks (i.e., salience and default-mode), parietal and somatosensory regions within visuo-spatial and sensorimotor networks, and the caudate and thalamus within the basal ganglia network. The rs-FC changes associated with CATS were similar among all FTLD groups. In FTLD compared to controls, the pattern of rs-FC associated with emotion processing involves a larger number of brain regions, likely due to loss of functional specificity and compensatory attempts. These associations were similar across all FTLD groups, suggesting a common pathophysiological mechanism of emotion-processing breakdown, regardless of the clinical presentation and pattern of atrophy.
17
Cerebellum and Emotion Recognition. Adv Exp Med Biol 2022; 1378:41-51. [DOI: 10.1007/978-3-030-99550-8_4]
18
OUP accepted manuscript. Cereb Cortex 2022; 32:4913-4933. [DOI: 10.1093/cercor/bhab524]
19
Murray T, O'Brien J, Sagiv N, Garrido L. The role of stimulus-based cues and conceptual information in processing facial expressions of emotion. Cortex 2021; 144:109-132. [PMID: 34666297] [DOI: 10.1016/j.cortex.2021.08.007]
Abstract
Face shape and surface textures are two important cues that aid the perception of facial expressions of emotion. Additionally, this perception is influenced by high-level emotion concepts. Across two studies, we used representational similarity analysis to investigate the relative roles of shape, surface, and conceptual information in the perception, categorisation, and neural representation of facial expressions. In Study 1, 50 participants completed a perceptual task designed to measure the perceptual similarity of expression pairs, and a categorical task designed to measure the confusability between expression pairs when assigning emotion labels to a face. We constructed three models of the similarities between emotions, each based on distinct information: two models were based on stimulus-based cues (face shapes and surface textures) and one model was based on emotion concepts. Using multiple linear regression, we found that behaviour during both tasks was related to the similarity of emotion concepts. The model based on face shapes was more strongly related to behaviour in the perceptual task than in the categorical task, and the model based on surface textures was more strongly related to behaviour in the categorical task than in the perceptual task. In Study 2, 30 participants viewed facial expressions while undergoing fMRI, allowing for the measurement of brain representational geometries of facial expressions of emotion in three core face-responsive regions (the Fusiform Face Area, Occipital Face Area, and Superior Temporal Sulcus) and a region involved in theory of mind (the Medial Prefrontal Cortex). Across all four regions, the representational distances between facial expression pairs were related to the similarities of emotion concepts, but not to either of the stimulus-based cues. Together, these results highlight the important top-down influence of high-level emotion concepts both on behavioural tasks and on the neural representation of facial expressions.
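The regression logic described above (candidate model RDMs entered as predictors of a behavioural dissimilarity vector) can be sketched in a few lines of Python; the toy dissimilarity values and weights below are illustrative assumptions, not data from the study:

```python
import numpy as np

def rsa_regression(behaviour_rdm, model_rdms):
    """Multiple linear regression of a behavioural RDM (vectorized
    lower triangle) on several candidate model RDMs; returns one
    beta weight per model."""
    X = np.column_stack([np.ones(len(behaviour_rdm)), *model_rdms])
    betas, *_ = np.linalg.lstsq(X, np.asarray(behaviour_rdm, float), rcond=None)
    return betas[1:]  # drop the intercept

# toy example: 6 expression pairs (4 expressions), three candidate models
shape_model   = np.array([0.1, 0.9, 0.4, 0.8, 0.3, 0.6])
texture_model = np.array([0.7, 0.2, 0.5, 0.1, 0.9, 0.4])
concept_model = np.array([0.2, 0.3, 0.9, 0.6, 0.5, 0.8])

# pretend behaviour mostly tracks emotion concepts and weakly tracks shape
behaviour = 1.0 * concept_model + 0.25 * shape_model

betas = rsa_regression(behaviour, [shape_model, texture_model, concept_model])
```

Because the toy behaviour is an exact linear combination of the models, the recovered betas are 0.25 (shape), 0 (texture) and 1.0 (concepts), mirroring how the relative contribution of each information source is read off in this kind of analysis.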
Affiliation(s)
- Thomas Murray
- Psychology Department, School of Biological and Behavioural Sciences, Queen Mary University London, United Kingdom
- Justin O'Brien
- Centre for Cognitive Neuroscience, Department of Life Sciences, Brunel University London, United Kingdom
- Noam Sagiv
- Centre for Cognitive Neuroscience, Department of Life Sciences, Brunel University London, United Kingdom
- Lúcia Garrido
- Department of Psychology, City, University of London, United Kingdom

20
Matt S, Dzhelyova M, Maillard L, Lighezzolo-Alnot J, Rossion B, Caharel S. The rapid and automatic categorization of facial expression changes in highly variable natural images. Cortex 2021; 144:168-184. [PMID: 34666300] [DOI: 10.1016/j.cortex.2021.08.005]
Abstract
Emotional expressions are quickly and automatically read from human faces under natural viewing conditions. Yet, categorization of facial expressions is typically measured in experimental contexts with homogeneous sets of face stimuli. Here we evaluated how the six basic facial emotions (Fear, Disgust, Happiness, Anger, Surprise or Sadness) can be rapidly and automatically categorized with faces varying in head orientation, lighting condition, identity, gender, age, ethnic origin and background context. High-density electroencephalography was recorded in 17 participants viewing 50 s sequences of natural, variable images of neutral-expression faces alternating at a 6 Hz rate. Every fifth stimulus (i.e., at 1.2 Hz) was a variable natural image of one of the six basic expressions. Despite the wide physical variability across images, a significant F/5 = 1.2 Hz response and its harmonics (e.g., 2F/5 = 2.4 Hz, etc.) were observed for all expression changes at the group level and in every individual participant. Facial expression categorization responses were found mainly over occipito-temporal sites, with distinct hemispheric lateralization and cortical topographies for the different expressions. Specifically, a stronger response was found for Sadness categorization, especially over the left hemisphere, as compared to Fear and Happiness, together with a right-hemispheric dominance for the categorization of Fearful faces. Importantly, these differences were specific to upright faces, ruling out the contribution of low-level visual cues. Overall, these observations point to robust, rapid and automatic facial expression categorization processes in the human brain.
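The frequency-tagging design described above (a 6 Hz base rate with expression changes on every fifth image, i.e., at 1.2 Hz = F/5) can be illustrated on a synthetic signal; the sampling rate, response amplitudes, and the neighbouring-bin SNR measure below are assumptions for illustration, not parameters from the study:

```python
import numpy as np

fs, dur = 500.0, 50.0                  # assumed sampling rate (Hz) and sequence length (s)
t = np.arange(0, dur, 1 / fs)
base_f, odd_f = 6.0, 1.2               # base stimulation rate and oddball (F/5) rate

# synthetic "EEG": a base response, a smaller expression-change response, noise
rng = np.random.default_rng(0)
signal = (np.sin(2 * np.pi * base_f * t)
          + 0.3 * np.sin(2 * np.pi * odd_f * t)
          + 0.1 * rng.standard_normal(t.size))

amp = 2 * np.abs(np.fft.rfft(signal)) / t.size   # amplitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)          # 0.02 Hz resolution: 1.2 Hz is an exact bin

def snr(f, k=10):
    """Amplitude at frequency f divided by the mean amplitude of k
    neighbouring bins on each side (skipping the adjacent bins)."""
    i = int(np.argmin(np.abs(freqs - f)))
    neighbours = np.r_[amp[i - k:i - 1], amp[i + 2:i + k + 1]]
    return amp[i] / neighbours.mean()
```

With a 50 s epoch the spectrum has 0.02 Hz resolution, so the tagged frequencies fall on exact bins: snr(1.2) and snr(6.0) come out far above 1, while untagged frequencies hover around 1, which is the signature used to establish a categorization response.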
Affiliation(s)
- Stéphanie Matt
- Université de Lorraine, 2LPN, Nancy, France; Université de Lorraine, Laboratoire INTERPSY, Nancy, France
- Milena Dzhelyova
- Université Catholique de Louvain, Institute of Research in Psychological Science, Louvain-la-Neuve, Belgium
- Louis Maillard
- Université de Lorraine, CNRS, CRAN, Nancy, France; Université de Lorraine, CHRU-Nancy, Service de Neurologie, Nancy, France
- Bruno Rossion
- Université Catholique de Louvain, Institute of Research in Psychological Science, Louvain-la-Neuve, Belgium; Université de Lorraine, CNRS, CRAN, Nancy, France; Université de Lorraine, CHRU-Nancy, Service de Neurologie, Nancy, France
- Stéphanie Caharel
- Université de Lorraine, 2LPN, Nancy, France; Institut Universitaire de France, Paris, France

21
Barrick EM, Thornton MA, Tamir DI. Mask exposure during COVID-19 changes emotional face processing. PLoS One 2021; 16:e0258470. [PMID: 34637454] [PMCID: PMC8509869] [DOI: 10.1371/journal.pone.0258470]
Abstract
Faces are one of the key ways that we obtain social information about others. They allow people to identify individuals, understand conversational cues, and make judgements about others' mental states. When the COVID-19 pandemic hit the United States, widespread mask-wearing practices were implemented, causing a shift in the way Americans typically interact. This introduction of masks into social exchanges posed a potential challenge: how would people make these important inferences about others when a large source of information was no longer available? We conducted two studies that investigated the impact of mask exposure on emotion perception. In particular, we measured how participants used facial landmarks (visual cues) and the expressed valence and arousal (affective cues) to make similarity judgements about pairs of emotion faces. Study 1 found that in August 2020, participants with higher levels of mask exposure used cues from the eyes to a greater extent when judging emotion similarity than participants with less mask exposure. Study 2 measured participants' emotion perception in both April and September 2020 (before and after widespread mask adoption) in the same group of participants to examine changes in the use of facial cues over time. Results revealed an overall increase in the use of visual cues from April to September. Further, as mask exposure increased, people with the most social interaction showed the largest increase in the use of visual facial cues. These results provide evidence that a shift has occurred in how people process faces: the more people interact with others who are wearing masks, the more they learn to focus on visual cues from the eye area of the face.
Affiliation(s)
- Elyssa M. Barrick
- Department of Psychology, Princeton University, Princeton, New Jersey, United States of America
- Mark A. Thornton
- Department of Psychological and Brain Sciences, Dartmouth College, Hanover, New Hampshire, United States of America
- Diana I. Tamir
- Department of Psychology, Princeton University, Princeton, New Jersey, United States of America

22
Liu M, Liu CH, Zheng S, Zhao K, Fu X. Reexamining the neural network involved in perception of facial expression: A meta-analysis. Neurosci Biobehav Rev 2021; 131:179-191. [PMID: 34536463] [DOI: 10.1016/j.neubiorev.2021.09.024]
Abstract
Perception of facial expression is essential for social interactions. Although a few competing models have enjoyed some success in mapping brain regions, they also face difficult challenges. The current study used an updated activation likelihood estimation (ALE) method of meta-analysis to explore the involvement of brain regions in facial expression processing. The sample contained 96 functional magnetic resonance imaging (fMRI) and positron emission tomography (PET) studies of healthy adults reporting the results of whole-brain analyses. The key findings revealed that the ventral pathway, especially the left fusiform face area (FFA) region, was more responsive to facial expressions. The left posterior FFA showed strong involvement when participants passively viewed emotional faces without being asked to judge the type of expression or other attributes of the stimuli. Through meta-analytic connectivity modeling (MACM) of the main brain regions in the ventral pathway, we constructed a co-activating neural network as a revised model of facial expression processing that assigns prominent roles to the amygdala, the FFA, the occipital gyrus, and the inferior frontal gyrus.
Affiliation(s)
- Mingtong Liu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
- Chang Hong Liu
- Department of Psychology, Bournemouth University, Dorset, United Kingdom
- Shuang Zheng
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
- Ke Zhao
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
- Xiaolan Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China

23
Brooks JA, Stolier RM, Freeman JB. Computational approaches to the neuroscience of social perception. Soc Cogn Affect Neurosci 2021; 16:827-837. [PMID: 32986115] [PMCID: PMC8343569] [DOI: 10.1093/scan/nsaa127]
Abstract
Across multiple domains of social perception (including social categorization, emotion perception, impression formation and mentalizing), multivariate pattern analysis (MVPA) of functional magnetic resonance imaging (fMRI) data has permitted a more detailed understanding of how social information is processed and represented in the brain. As in other neuroimaging fields, the neuroscientific study of social perception initially relied on broad structure-function associations derived from univariate fMRI analysis to map the neural regions involved in these processes. In this review, we trace the ways that social neuroscience studies using MVPA have built on these neuroanatomical associations to better characterize the computational relevance of different brain regions, and discuss how MVPA allows explicit tests of the correspondence between psychological models and the neural representation of social information. We also describe current and future advances in methodological approaches to multivariate fMRI data and their theoretical value for the neuroscience of social perception.
Affiliation(s)
- Jeffrey A Brooks
- Department of Psychology, New York University, New York, NY, USA
- Ryan M Stolier
- Columbia University, 1190 Amsterdam Ave., New York, NY 10027, USA

24
Volynets S, Smirnov D, Saarimäki H, Nummenmaa L. Statistical pattern recognition reveals shared neural signatures for displaying and recognizing specific facial expressions. Soc Cogn Affect Neurosci 2021; 15:803-813. [PMID: 33007782] [PMCID: PMC7543934] [DOI: 10.1093/scan/nsaa110]
Abstract
Human neuroimaging and behavioural studies suggest that somatomotor ‘mirroring’ of seen facial expressions may support their recognition. Here we show that viewing a specific facial expression triggers the representation corresponding to that expression in the observer’s brain. Twelve healthy female volunteers underwent two separate fMRI sessions: one where they observed and another where they displayed three types of facial expressions (joy, anger and disgust). A pattern classifier based on Bayesian logistic regression was trained to classify facial expressions (i) within modality (trained and tested with data recorded while observing or displaying expressions) and (ii) between modalities (trained with data recorded while displaying expressions and tested with data recorded while observing the expressions). Cross-modal classification was performed in two ways: with and without functional realignment of the data across the observing/displaying conditions. All expressions could be accurately classified within and also across modalities. The brain regions contributing most to cross-modal classification accuracy included the primary motor and somatosensory cortices. Functional realignment led to only minor increases in cross-modal classification accuracy for most of the examined ROIs; substantial improvement was observed in the occipito-ventral components of the core system for facial expression recognition. Altogether, these results support the embodied emotion recognition model and show that expression-specific somatomotor neural signatures could support facial expression recognition.
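The cross-modal decoding scheme above (train on runs where expressions are displayed, test on runs where they are observed) can be sketched with synthetic voxel patterns. Note that the study itself used a Bayesian logistic regression classifier; this illustration substitutes a simpler nearest-centroid rule, and the pattern dimensions, noise level, and shared prototypes are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
emotions = ["joy", "anger", "disgust"]
n_vox = 50

# assume each emotion evokes a voxel pattern shared across modalities
prototypes = {e: rng.standard_normal(n_vox) for e in emotions}

def simulate_runs(n_trials=20, noise=0.5):
    """Noisy trials around each emotion prototype, with labels."""
    X, y = [], []
    for e in emotions:
        for _ in range(n_trials):
            X.append(prototypes[e] + noise * rng.standard_normal(n_vox))
            y.append(e)
    return np.array(X), y

X_display, y_display = simulate_runs()   # "displaying" runs: training modality
X_observe, y_observe = simulate_runs()   # "observing" runs: test modality

# centroids estimated from the displaying runs, applied to the observing runs
centroids = {e: X_display[np.array(y_display) == e].mean(axis=0) for e in emotions}
predictions = [min(emotions, key=lambda e: np.linalg.norm(x - centroids[e]))
               for x in X_observe]
accuracy = float(np.mean([p == t for p, t in zip(predictions, y_observe)]))
```

Because the simulated patterns share a common emotion-specific component across modalities, the classifier transfers between them, which is the logic behind the between-modalities analysis.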
Affiliation(s)
- Sofia Volynets
- Correspondence should be addressed to Lauri Nummenmaa, Turku PET Centre c/o Turku University Hospital, Kiinamyllynkatu 4-6, 20520 Turku, Finland
- Dmitry Smirnov
- Department of Neuroscience and Biomedical Engineering, School of Science, Aalto University, FI-0076 Aalto, Finland
- Heini Saarimäki
- Department of Neuroscience and Biomedical Engineering, School of Science, Aalto University, FI-0076 Aalto, Finland
- Faculty of Social Sciences, Tampere University, FI-33014 Tampere, Finland
- Lauri Nummenmaa
- Turku PET Centre and Department of Psychology, University of Turku, FI-20520 Turku, Finland
- Turku University Hospital, University of Turku, FI-20520 Turku, Finland

25
Modality-general and modality-specific audiovisual valence processing. Cortex 2021; 138:127-137. [PMID: 33684626] [DOI: 10.1016/j.cortex.2021.01.022]
Abstract
A fundamental question in affective neuroscience is whether there is a common hedonic system for valence processing independent of modality, or whether there are distinct neural systems for different modalities. To address this question, we used both region-of-interest and whole-brain representational similarity analyses on functional magnetic resonance imaging data to identify modality-general and modality-specific brain areas involved in valence processing across the visual and auditory modalities. First, region-of-interest analyses showed that the superior temporal cortex was associated with both the modality-general and the auditory-specific models, while the primary visual cortex was associated with the visual-specific model. Second, whole-brain searchlight analyses also identified both modality-general and modality-specific representations. The modality-general regions included the superior temporal, medial superior frontal, inferior frontal, precuneus, precentral, postcentral, supramarginal, paracentral lobule and middle cingulate cortices. The modality-specific regions included both perceptual cortices and higher-order brain areas. The valence representations derived from individualized behavioral valence ratings were consistent with these results. Together, these findings suggest both modality-general and modality-specific representations of valence.
26
Hendriks MHA, Dillen C, Vettori S, Vercammen L, Daniels N, Steyaert J, Op de Beeck H, Boets B. Neural processing of facial identity and expression in adults with and without autism: A multi-method approach. Neuroimage Clin 2020; 29:102520. [PMID: 33338966] [PMCID: PMC7750419] [DOI: 10.1016/j.nicl.2020.102520]
Abstract
The ability to recognize faces and facial expressions is a common human talent. It has, however, been suggested to be impaired in individuals with autism spectrum disorder (ASD). The goal of this study was to compare the processing of facial identity and emotion between individuals with ASD and neurotypicals (NTs). Behavioural and functional magnetic resonance imaging (fMRI) data from 46 young adults (aged 17-23 years; N = 22 ASD, N = 24 NT) were analysed. During fMRI data acquisition, participants discriminated between short clips of a face transitioning from a neutral to an emotional expression. Stimuli included four identities and six emotions. We performed behavioural, univariate, multi-voxel, adaptation and functional connectivity analyses to investigate potential group differences. The ASD group did not differ from the NT group on behavioural identity and expression processing tasks. At the neural level, we found no differences in average neural activation, neural activation patterns, or neural adaptation to faces in face-related brain regions. In terms of functional connectivity, we found that the amygdala seems to be more strongly connected to the inferior occipital cortex and V1 in individuals with ASD. Overall, the findings indicate that neural representations of facial identity and expression have a similar quality in individuals with and without ASD, but some regions containing these representations are connected differently in the extended face processing network.
Affiliation(s)
- Michelle H A Hendriks
- Department of Brain and Cognition, KU Leuven, Tiensestraat 102 - bus 3714, Leuven, Belgium; Leuven Autism Research Consortium, KU Leuven, Leuven, Belgium
- Claudia Dillen
- Department of Brain and Cognition, KU Leuven, Tiensestraat 102 - bus 3714, Leuven, Belgium; Leuven Autism Research Consortium, KU Leuven, Leuven, Belgium
- Sofie Vettori
- Centre for Developmental Psychiatry, KU Leuven, Kapucijnenvoer 7 blok h - bus 7001, Leuven, Belgium; Leuven Autism Research Consortium, KU Leuven, Leuven, Belgium
- Laura Vercammen
- Department of Brain and Cognition, KU Leuven, Tiensestraat 102 - bus 3714, Leuven, Belgium
- Nicky Daniels
- Department of Brain and Cognition, KU Leuven, Tiensestraat 102 - bus 3714, Leuven, Belgium; Centre for Developmental Psychiatry, KU Leuven, Kapucijnenvoer 7 blok h - bus 7001, Leuven, Belgium; Leuven Autism Research Consortium, KU Leuven, Leuven, Belgium
- Jean Steyaert
- Centre for Developmental Psychiatry, KU Leuven, Kapucijnenvoer 7 blok h - bus 7001, Leuven, Belgium; Leuven Autism Research Consortium, KU Leuven, Leuven, Belgium
- Hans Op de Beeck
- Department of Brain and Cognition, KU Leuven, Tiensestraat 102 - bus 3714, Leuven, Belgium; Leuven Brain Institute, KU Leuven, Leuven, Belgium
- Bart Boets
- Centre for Developmental Psychiatry, KU Leuven, Kapucijnenvoer 7 blok h - bus 7001, Leuven, Belgium; Leuven Brain Institute, KU Leuven, Leuven, Belgium; Leuven Autism Research Consortium, KU Leuven, Leuven, Belgium

27
Liang Y, Liu B. Cross-Subject Commonality of Emotion Representations in Dorsal Motion-Sensitive Areas. Front Neurosci 2020; 14:567797. [PMID: 33177977] [PMCID: PMC7591793] [DOI: 10.3389/fnins.2020.567797]
Abstract
Emotion perception is a crucial question in cognitive neuroscience, and the underlying neural substrates have been the subject of intense study. One of our previous studies demonstrated that motion-sensitive areas are involved in the perception of facial expressions. However, it remains unclear whether emotions perceived from whole-person stimuli can be decoded from motion-sensitive areas. In addition, if emotions are represented in the motion-sensitive areas, we may further ask whether these representations are shared across individual subjects. To address these questions, this study collected functional images while participants viewed emotions (joy, anger, and fear) in videos of whole-person expressions (containing both face and body parts) in a block-design functional magnetic resonance imaging (fMRI) experiment. Multivariate pattern analysis (MVPA) was conducted to explore emotion decoding performance in individually defined dorsal motion-sensitive regions of interest (ROIs). Results revealed that emotions could be successfully decoded from motion-sensitive ROIs, with statistically significant classification accuracies for the three emotions as well as for positive versus negative emotions. Moreover, results from the cross-subject classification analysis showed that a person’s emotion representation could be robustly predicted by others’ emotion representations in motion-sensitive areas. Together, these results reveal that emotions are represented in dorsal motion-sensitive areas and that the representation of emotions is consistent across subjects. Our findings provide new evidence for the involvement of motion-sensitive areas in emotion decoding, and further suggest that a common emotion code exists in the motion-sensitive areas across individual subjects.
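The cross-subject analysis (predicting a held-out person's emotion representation from the remaining participants') can be sketched as leave-one-subject-out classification against group-level emotion patterns; the subject count, voxel count, and noise level below are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(3)
n_sub, n_emotion, n_vox = 10, 3, 40        # e.g. joy, anger, fear

# simulate per-subject emotion patterns scattered around shared group prototypes
group = rng.standard_normal((n_emotion, n_vox))
subject_patterns = group + 0.4 * rng.standard_normal((n_sub, n_emotion, n_vox))

# leave-one-subject-out: centroids from the other subjects classify the held-out one
correct = 0
for s in range(n_sub):
    centroids = subject_patterns[np.arange(n_sub) != s].mean(axis=0)  # (n_emotion, n_vox)
    for e in range(n_emotion):
        dists = [np.linalg.norm(subject_patterns[s, e] - centroids[k])
                 for k in range(n_emotion)]
        correct += int(np.argmin(dists) == e)

accuracy = correct / (n_sub * n_emotion)
```

Above-chance accuracy here reflects the simulated common emotion code: each subject's patterns are predictable from everyone else's, which is the sense in which the representation is "shared across subjects".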
Affiliation(s)
- Yin Liang
- Faculty of Information Technology, College of Computer Science and Technology, Beijing Artificial Intelligence Institute, Beijing University of Technology, Beijing, China
- Baolin Liu
- School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing, China

28.
Mapping neural activity patterns to contextualized fearful facial expressions onto callous-unemotional (CU) traits: intersubject representational similarity analysis reveals less variation among high-CU adolescents. Personality Neuroscience 2020; 3:e12. PMID: 33283146; PMCID: PMC7681174; DOI: 10.1017/pen.2020.13.
Abstract
Callous-unemotional (CU) traits are early-emerging personality features characterized by deficits in empathy, concern for others, and remorse following social transgressions. One of the interpersonal deficits most consistently associated with CU traits is impaired behavioral and neurophysiological responsiveness to fearful facial expressions. However, the facial expression paradigms traditionally employed in neuroimaging are often ambiguous with respect to the nature of threat (i.e., is the perceiver the threat, or is something else in the environment?). In the present study, 30 adolescents with varying CU traits viewed fearful facial expressions cued to three different contexts ("afraid for you," "afraid of you," "afraid for self") while undergoing functional magnetic resonance imaging (fMRI). Univariate analyses found that mean right amygdala activity during the "afraid for self" context was negatively associated with CU traits. With the goal of disentangling idiosyncratic stimulus-driven neural responses, we employed intersubject representational similarity analysis to link intersubject similarities in multivoxel neural response patterns to contextualized fearful expressions with differential intersubject models of CU traits. Among low-CU adolescents, neural response patterns while viewing fearful faces were most consistently similar early in the visual processing stream and among regions implicated in affective responding, but were more idiosyncratic as emotional face information moved up the cortical processing hierarchy. By contrast, high-CU adolescents' neural response patterns consistently aligned along the entire cortical hierarchy (but diverged among low-CU youths). Observed patterns varied across contexts, suggesting that interpretations of fearful expressions depend to an extent on neural response patterns and are further shaped by levels of CU traits.
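The intersubject RSA logic above (linking pairwise similarity of neural response patterns to a pairwise model built from trait scores) can be sketched minimally. Everything here is an assumption for illustration: the synthetic patterns, the hypothetical CU scores, and the simple "mean" trait model (pairs of high-scoring subjects predicted to be most similar) are not the study's actual data or model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_subj, n_voxels = 30, 200
cu = rng.uniform(0, 10, n_subj)                      # hypothetical CU-trait scores
# Synthetic patterns in which higher-CU subjects carry more of a shared signal.
shared = rng.normal(size=n_voxels)
patterns = np.array([(c / 10) * shared + rng.normal(size=n_voxels) for c in cu])

def upper(m):
    """Upper-triangle (pairwise) entries of a square matrix, diagonal excluded."""
    i, j = np.triu_indices(m.shape[0], k=1)
    return m[i, j]

neural_sim = np.corrcoef(patterns)                   # subject-by-subject pattern similarity
trait_model = (cu[:, None] + cu[None, :]) / 2        # "mean" model of pairwise similarity
r = np.corrcoef(upper(neural_sim), upper(trait_model))[0, 1]
print(f"IS-RSA correlation: {r:.2f}")
```

In published IS-RSA work the final comparison is usually a rank correlation with permutation-based inference (e.g., Mantel tests); a plain Pearson correlation is used here only to keep the sketch short.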
29.
Sonderfeld M, Mathiak K, Häring GS, Schmidt S, Habel U, Gur R, Klasen M. Supramodal neural networks support top-down processing of social signals. Hum Brain Mapp 2020; 42:676-689. PMID: 33073911; PMCID: PMC7814753; DOI: 10.1002/hbm.25252.
Abstract
The perception of facial and vocal stimuli is driven by sensory input and cognitive top-down influences. Important top-down influences are attentional focus and supramodal social memory representations. The present study investigated the neural networks underlying these top-down processes and their role in social stimulus classification. In a neuroimaging study with 45 healthy participants, we employed a social adaptation of the Implicit Association Test. Attentional focus was modified via the classification task, which compared two domains of social perception (emotion and gender) using exactly the same stimulus set. Supramodal memory representations were addressed via congruency of the target categories for the classification of auditory and visual social stimuli (voices and faces). Functional magnetic resonance imaging identified attention-specific and supramodal networks. Emotion classification networks included bilateral anterior insula, pre-supplementary motor area, and right inferior frontal gyrus. They were purely attention-driven and independent of stimulus modality and of congruency of the target concepts. No neural contribution of supramodal memory representations could be revealed for emotion classification. In contrast, gender classification relied on supramodal memory representations in rostral anterior cingulate and ventromedial prefrontal cortices. In summary, different domains of social perception involve different top-down processes, which take place in clearly distinguishable neural networks.
Affiliation(s)
- Melina Sonderfeld
- Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen, Aachen, Germany; JARA-Translational Brain Medicine, RWTH Aachen University, Aachen, Germany
- Klaus Mathiak
- Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen, Aachen, Germany; JARA-Translational Brain Medicine, RWTH Aachen University, Aachen, Germany
- Gianna S Häring
- Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen, Aachen, Germany; JARA-Translational Brain Medicine, RWTH Aachen University, Aachen, Germany
- Sarah Schmidt
- Life & Brain - Institute for Experimental Epileptology and Cognition Research, Bonn, Germany
- Ute Habel
- Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen, Aachen, Germany; JARA-Translational Brain Medicine, RWTH Aachen University, Aachen, Germany
- Raquel Gur
- Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Martin Klasen
- Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen, Aachen, Germany; JARA-Translational Brain Medicine, RWTH Aachen University, Aachen, Germany; Interdisciplinary Training Centre for Medical Education and Patient Safety - AIXTRA, Medical Faculty, RWTH Aachen University, Aachen, Germany

30.
Li Y, Richardson RM, Ghuman AS. Posterior Fusiform and Midfusiform Contribute to Distinct Stages of Facial Expression Processing. Cereb Cortex 2020; 29:3209-3219. PMID: 30124788; DOI: 10.1093/cercor/bhy186.
Abstract
Though the fusiform is well-established as a key node in the face perception network, its role in facial expression processing remains unclear, due to competing models and discrepant findings. To help resolve this debate, we recorded from 17 subjects with intracranial electrodes implanted in face sensitive patches of the fusiform. Multivariate classification analysis showed that facial expression information is represented in fusiform activity and in the same regions that represent identity, though with a smaller effect size. Examination of the spatiotemporal dynamics revealed a functional distinction between posterior fusiform and midfusiform expression coding, with posterior fusiform showing an early peak of facial expression sensitivity at around 180 ms after subjects viewed a face and midfusiform showing a later and extended peak between 230 and 460 ms. These results support the hypothesis that the fusiform plays a role in facial expression perception and highlight a qualitative functional distinction between processing in posterior fusiform and midfusiform, with each contributing to temporally segregated stages of expression perception.
Affiliation(s)
- Yuanning Li
- Center for the Neural Basis of Cognition, Carnegie Mellon University and University of Pittsburgh, Pittsburgh, PA, USA; Program in Neural Computation and Machine Learning, Carnegie Mellon University, Pittsburgh, PA, USA; Department of Neurological Surgery, University of Pittsburgh, Pittsburgh, PA, USA
- R Mark Richardson
- Center for the Neural Basis of Cognition, Carnegie Mellon University and University of Pittsburgh, Pittsburgh, PA, USA; Department of Neurological Surgery, University of Pittsburgh, Pittsburgh, PA, USA
- Avniel Singh Ghuman
- Center for the Neural Basis of Cognition, Carnegie Mellon University and University of Pittsburgh, Pittsburgh, PA, USA; Program in Neural Computation and Machine Learning, Carnegie Mellon University, Pittsburgh, PA, USA; Department of Neurological Surgery, University of Pittsburgh, Pittsburgh, PA, USA

31.
Zhang K, Rigo P, Su X, Wang M, Chen Z, Esposito G, Putnick DL, Bornstein MH, Du X. Brain Responses to Emotional Infant Faces in New Mothers and Nulliparous Women. Sci Rep 2020; 10:9560. PMID: 32533113; PMCID: PMC7293211; DOI: 10.1038/s41598-020-66511-x.
Abstract
The experience of motherhood is one of the most salient events in a woman’s life. Motherhood is associated with a series of neurophysiological, psychological, and behavioral changes that allow women to better adapt to their new role as mothers. Infants communicate their needs and physiological states mainly through salient emotional expressions, and maternal responses to infant signals are critical for infant survival and development. In this study, we investigated the whole brain functional response to emotional infant faces in 20 new mothers and 22 nulliparous women during functional magnetic resonance imaging scans. New mothers showed higher brain activation in regions involved in infant facial expression processing and empathic and mentalizing networks than nulliparous women. Furthermore, magnitudes of the activation of the left parahippocampal gyrus and the left fusiform gyrus, recruited during facial expression processing, were positively correlated with empathic concern (EC) scores in new mothers when viewing emotional (happy-sad) faces contrasted to neutral faces. Taken together, these results indicate that the experience of being a mother affects human brain responses in visual and social cognitive brain areas and in brain areas associated with theory-of-mind related and empathic processing.
Affiliation(s)
- Kaihua Zhang
- Department of Electronic Science, Fujian Provincial Key Laboratory of Plasma and Magnetic Resonance, Xiamen University, Xiamen, 361000, China; Shanghai Key Laboratory of Magnetic Resonance and Department of Physics, School of Physics and Electronic Science, East China Normal University, Shanghai, 200062, China
- Paola Rigo
- Department of Developmental Psychology and Socialisation, University of Padova, Padova, Italy
- Xueyun Su
- Department of Special Education, Faculty of Education, East China Normal University, Shanghai, 200062, China
- Mengxing Wang
- Shanghai Key Laboratory of Magnetic Resonance and Department of Physics, School of Physics and Electronic Science, East China Normal University, Shanghai, 200062, China
- Zhong Chen
- Department of Electronic Science, Fujian Provincial Key Laboratory of Plasma and Magnetic Resonance, Xiamen University, Xiamen, 361000, China
- Gianluca Esposito
- Department of Psychology and Cognitive Science, University of Trento, Trento, Italy; Psychology Programme, School of Social Sciences, Nanyang Technological University, Singapore
- Diane L Putnick
- Eunice Kennedy Shriver National Institute of Child Health and Human Development, NIH, Bethesda, MD, USA
- Marc H Bornstein
- Eunice Kennedy Shriver National Institute of Child Health and Human Development, NIH, Bethesda, MD, USA
- Xiaoxia Du
- Shanghai Key Laboratory of Magnetic Resonance and Department of Physics, School of Physics and Electronic Science, East China Normal University, Shanghai, 200062, China

32.
Yau Y, Dadar M, Taylor M, Zeighami Y, Fellows LK, Cisek P, Dagher A. Neural Correlates of Evidence and Urgency During Human Perceptual Decision-Making in Dynamically Changing Conditions. Cereb Cortex 2020; 30:5471-5483. PMID: 32500144; DOI: 10.1093/cercor/bhaa129.
Abstract
Current models of decision-making assume that the brain gradually accumulates evidence and drifts toward a threshold that, once crossed, results in a choice. These models have been especially successful in primate research; however, transposing them to human fMRI paradigms has proved challenging. Here, we exploit the face-selective visual system and test whether emotional facial features decoded from multivariate fMRI signals during a dynamic perceptual decision-making task relate to the parameters of computational models of decision-making. We show that trial-by-trial variations in the pattern of neural activity in the fusiform gyrus reflect facial emotional information and modulate drift rates during deliberation. We also observed an inverse-urgency signal based in the caudate nucleus that was independent of sensory information but appeared to slow decisions, particularly when information in the task was ambiguous. Taken together, our results characterize how decision parameters from a computational model (i.e., drift rate and urgency signal) are involved in perceptual decision-making and reflected in the activity of the human brain.
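The drift-plus-urgency idea can be illustrated with a toy accumulator simulation. This is an assumed gain-style urgency model chosen for brevity, not the study's fitted model (and note the study reports an inverse-urgency signal that slows decisions; here urgency simply forces earlier choices when evidence is weak).

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_trial(drift, urgency_slope=0.0, threshold=1.0, dt=0.01, noise=1.0, t_max=5.0):
    """Accumulate noisy evidence; a linearly growing urgency gain multiplies the
    accumulated evidence, forcing a choice sooner when information is weak."""
    x, t = 0.0, 0.0
    while t < t_max:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
        gain = 1.0 + urgency_slope * t
        if abs(x * gain) >= threshold:
            return (x > 0), t                     # (chose positive?, decision time)
        t += dt
    return (x > 0), t_max                         # timeout: forced choice at t_max

def mean_rt(drift, urgency, n=300):
    return float(np.mean([simulate_trial(drift, urgency)[1] for _ in range(n)]))

rt_no_urgency = mean_rt(drift=0.05, urgency=0.0)  # ambiguous evidence, no urgency
rt_urgency    = mean_rt(drift=0.05, urgency=2.0)  # same evidence, rising urgency
print(f"mean RT without urgency: {rt_no_urgency:.2f}s, with urgency: {rt_urgency:.2f}s")
```

With weak (ambiguous) drift, the urgency gain shortens decision times, which is the qualitative behavior urgency-gating models were introduced to capture.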
Affiliation(s)
- Y Yau
- McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Montréal, Quebec H3A 2B4, Canada
- M Dadar
- McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Montréal, Quebec H3A 2B4, Canada
- M Taylor
- McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Montréal, Quebec H3A 2B4, Canada; Schulich School of Medicine and Dentistry, University of Western Ontario, London, Ontario N6A 5C1, Canada
- Y Zeighami
- McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Montréal, Quebec H3A 2B4, Canada
- L K Fellows
- McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Montréal, Quebec H3A 2B4, Canada
- P Cisek
- Department of Neuroscience, Université de Montréal, Montréal, Quebec H3C 3J7, Canada
- A Dagher
- McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Montréal, Quebec H3A 2B4, Canada

33.
Zhao K, Liu M, Gu J, Mo F, Fu X, Liu CH. The Preponderant Role of Fusiform Face Area for the Facial Expression Confusion Effect: An MEG Study. Neuroscience 2020; 433:42-52. PMID: 32169552; DOI: 10.1016/j.neuroscience.2020.03.001.
Abstract
Although the recognition of facial expressions seems automatic and effortless, discrimination of expressions can still be error prone. Common errors are often due to visual similarities between some expressions (e.g., fear and surprise). However, little is known about the neural mechanisms underlying such a confusion effect. To address this question, we recorded magnetoencephalography (MEG) while participants judged facial expressions that were either easily confused with or easily distinguished from other expressions. The results showed that the fusiform face area (FFA), rather than the posterior superior temporal sulcus (pSTS), played a preponderant role in discriminating confusable facial expressions. No difference between the high-confusion and low-confusion conditions was observed on the M170 component in either the FFA or the pSTS, whilst a difference between the two conditions emerged in the late positive potential (LPP), with the low-confusion condition eliciting a larger LPP amplitude in the FFA. In addition, delta-band power was stronger in the time window of the LPP component. This confusion effect was reflected in the FFA and might be associated with a perceptual-to-conceptual shift.
Affiliation(s)
- Ke Zhao
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Mingtong Liu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Jingjin Gu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Fan Mo
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Xiaolan Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Chang Hong Liu
- Department of Psychology, Bournemouth University, Dorset, United Kingdom

34.
Wong JJ, Chang DHF, Qi D, Men W, Gao JH, Lee TMC. The pontine-driven somatic gaze tract contributes to affective processing in humans. Neuroimage 2020; 213:116692. PMID: 32135263; DOI: 10.1016/j.neuroimage.2020.116692.
Abstract
The relevance of subcortical structures for affective processing is not fully understood. Inspired by the gerbil retino-raphe pathway, which has been shown to regulate affective behavior, and by previous human work showing that the pontine region is important for processing emotion, we asked whether well-established tracts traveling between the eye and the brain stem in humans contribute to functions beyond their conventionally understood roles. Here we report neuroimaging findings showing that optic chiasm-brain stem diffusivity predicts responses reflecting perceived arousal and valence. Analyses of subsequent task-evoked connectivity further revealed that visual affective processing implicates the brain stem, particularly the pontine region, at an early stage of the cascade, projecting to cortico-limbic regions in a feedforward manner. The optimal model implies that all intrinsic connections between the regions of interest are unidirectional and outward from the pontine region. These findings suggest that affective processing implicates regions outside the cortico-limbic network. The involvement of a phylogenetically older locus in the pons that has consequences for oculomotor control may imply adaptive consequences of affect detection.
Affiliation(s)
- Jing Jun Wong
- State Key Laboratory of Brain and Cognitive Sciences, The University of Hong Kong, Hong Kong; Laboratory of Neuropsychology, The University of Hong Kong, Hong Kong; Laboratory of Social Cognitive and Affective Neuroscience, The University of Hong Kong, Hong Kong
- Dorita H F Chang
- State Key Laboratory of Brain and Cognitive Sciences, The University of Hong Kong, Hong Kong; Department of Psychology, The University of Hong Kong, Hong Kong
- Di Qi
- State Key Laboratory of Brain and Cognitive Sciences, The University of Hong Kong, Hong Kong; Laboratory of Neuropsychology, The University of Hong Kong, Hong Kong; Laboratory of Social Cognitive and Affective Neuroscience, The University of Hong Kong, Hong Kong
- Weiwei Men
- Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing, 100871, China
- Jia-Hong Gao
- Center for MRI Research and McGovern Institute for Brain Research, Peking University, Beijing, 100871, China
- Tatia M C Lee
- State Key Laboratory of Brain and Cognitive Sciences, The University of Hong Kong, Hong Kong; Laboratory of Neuropsychology, The University of Hong Kong, Hong Kong; Laboratory of Social Cognitive and Affective Neuroscience, The University of Hong Kong, Hong Kong; Institute of Clinical Neuropsychology, The University of Hong Kong, Hong Kong; Center for Brain Science and Brain-Inspired Intelligence, Guangdong-Hong Kong-Macao Greater Bay Area, China

35.
Thomas M, Gaudeau-Bosma C, Bouaziz N, Schenin-King Andrianisaina P, Benadhira R, Januel D, Moulier V. Gender-Related Brain Activations during an Emotional Task: An fMRI Study. Neuropsychobiology 2020; 78:128-135. PMID: 31117090; DOI: 10.1159/000499977.
Abstract
BACKGROUND Women have twice the rate of depression and anxiety disorders as men. Some studies suggest that this could be caused by women's greater sensitivity to negative emotions. Few brain imaging studies have compared the brain activity of men and women during the presentation of emotional stimuli. Our objective was to investigate brain activations in men and women during an emotional task. We hypothesized that the pattern of brain activations would differ by gender and by the valence of the stimuli. METHODS We conducted a functional magnetic resonance imaging study in 30 healthy participants (15 men and 15 women). Positive, negative, and neutral photographs were presented to the subjects. Participants subjectively rated the valence and intensity of the stimuli. RESULTS No significant gender-by-category interaction effect was observed for the intensity or valence ratings of the stimuli. During the presentation of negative photographs, activity in the right fusiform gyrus was higher in women than in men. CONCLUSION Given the involvement of the fusiform gyrus in anxiety disorders, this study yields promising findings for better understanding women's vulnerability to anxiety disorders.
Affiliation(s)
- Maxence Thomas
- Unité de Recherche Clinique, EPS Ville-Evrard, Neuilly-sur-Marne, France
- Noomane Bouaziz
- Unité de Recherche Clinique, EPS Ville-Evrard, Neuilly-sur-Marne, France
- René Benadhira
- Unité de Recherche Clinique, EPS Ville-Evrard, Neuilly-sur-Marne, France
- Dominique Januel
- Unité de Recherche Clinique, EPS Ville-Evrard, Neuilly-sur-Marne, France
- Virginie Moulier
- Unité de Recherche Clinique, EPS Ville-Evrard, Neuilly-sur-Marne, France; University Department of Psychiatry, Centre Hospitalier du Rouvray, Sotteville-lès-Rouen, France

36.
Liang Y, Liu B, Ji J, Li X. Network Representations of Facial and Bodily Expressions: Evidence From Multivariate Connectivity Pattern Classification. Front Neurosci 2019; 13:1111. PMID: 31736683; PMCID: PMC6828617; DOI: 10.3389/fnins.2019.01111.
Abstract
Emotions can be perceived from both facial and bodily expressions. A previous study of ours found that facial expressions could be successfully decoded from functional connectivity (FC) patterns. However, the role of FC patterns in the recognition of bodily expressions remained unclear, and no neuroimaging studies had adequately addressed whether emotions perceived from facial and bodily expressions rely upon common or different neural networks. To address this, the present study collected functional magnetic resonance imaging (fMRI) data in a block-design experiment with facial and bodily expression videos as stimuli (three emotions: anger, fear, and joy) and conducted multivariate pattern classification analysis based on the estimated FC patterns. We found that, in addition to facial expressions, bodily expressions could also be successfully decoded from the large-scale FC patterns. Emotion classification accuracies were higher for facial than for bodily expressions. Further analysis of contributive FCs showed that emotion-discriminative networks were widely distributed in both hemispheres, containing regions ranging from primary visual areas to higher-level cognitive areas. Moreover, for a particular emotion, the discriminative FCs for facial and bodily expressions were distinct. Together, our findings highlight the key role of FC patterns in emotion processing, indicating how large-scale FC patterns reconfigure during the processing of facial and bodily expressions, and suggest a distributed neural representation for emotion recognition. Furthermore, our results suggest that the human brain employs separate network representations for facial and bodily expressions of the same emotions. This study provides new evidence for network representations of emotion perception and may further our understanding of the mechanisms underlying body-language emotion recognition.
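Classification on FC patterns, as opposed to raw voxel patterns, means each sample is a vectorized ROI-to-ROI correlation matrix. A toy version under stated assumptions (synthetic two-condition time series, an arbitrary ROI coupling, and a simple nearest-centroid classifier; none of this is the authors' pipeline):

```python
import numpy as np

rng = np.random.default_rng(4)
n_rois, n_tp = 10, 120

def synth_block(condition):
    """Synthesize an ROI time series whose inter-ROI coupling depends on condition."""
    base = rng.normal(size=(n_tp, n_rois))
    if condition == 1:                              # couple ROIs 0 and 1 in condition 1
        base[:, 1] = 0.7 * base[:, 0] + 0.3 * base[:, 1]
    return base

def fc_vector(ts):
    """Vectorized upper triangle of the ROI-to-ROI correlation matrix."""
    c = np.corrcoef(ts.T)
    return c[np.triu_indices(n_rois, k=1)]

X = np.array([fc_vector(synth_block(c)) for c in [0, 1] * 40])
y = np.array([0, 1] * 40)

# Leave-one-out nearest-centroid classification on the FC patterns.
hits = 0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    cents = np.array([X[mask & (y == c)].mean(axis=0) for c in (0, 1)])
    hits += int(np.argmin(np.linalg.norm(cents - X[i], axis=1)) == y[i])
accuracy = hits / len(y)
print(f"FC-pattern decoding accuracy: {accuracy:.2f}")
```

The "contributive FC" analysis in the study corresponds to asking which entries of these vectors drive the classification, e.g. by inspecting classifier weights.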
Affiliation(s)
- Yin Liang
- Faculty of Information Technology, Beijing Artificial Intelligence Institute, Beijing University of Technology, Beijing, China
- Baolin Liu
- Tianjin Key Laboratory of Cognitive Computing and Application, School of Computer Science and Technology, Tianjin University, Tianjin, China; School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing, China; State Key Laboratory of Intelligent Technology and Systems, National Laboratory for Information Science and Technology, Tsinghua University, Beijing, China
- Junzhong Ji
- Faculty of Information Technology, Beijing Artificial Intelligence Institute, Beijing University of Technology, Beijing, China
- Xianglin Li
- Medical Imaging Research Institute, Binzhou Medical University, Yantai, China

37.
Nguyen T, Zhou T, Potter T, Zou L, Zhang Y. The Cortical Network of Emotion Regulation: Insights From Advanced EEG-fMRI Integration Analysis. IEEE Trans Med Imaging 2019; 38:2423-2433. PMID: 30802854; DOI: 10.1109/tmi.2019.2900978.
Abstract
The ability to perceive and regulate emotion is a key component of cognition that is often disrupted by disease. Current neuroimaging studies of emotion regulation have implicated a number of cortical regions and identified several EEG features of interest, including the late positive potential and frontal asymmetry. Unfortunately, currently applied methods generally lack the resolution necessary to capture focal cortical activity and explore the causal interactions between brain regions. In this paper, electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) data were simultaneously recorded from 20 subjects undergoing emotion processing and regulation tasks. Cortical activity was reconstructed with high spatiotemporal resolution and accuracy using a novel multimodal EEG/fMRI integration method. A detailed causal brain network associated with emotion processing and regulation was then identified, and the network changes that facilitate different emotion conditions were investigated. The ventrolateral prefrontal cortex (VLPFC) and posterior parietal cortices exhibited condition-sensitive spike-and-wave patterns evident in inter-regional communication. The VLPFC was found to behave as a main network source, with condition-specific interactions supporting emotional shifts. The results provide unique insight into the cortical activity that supports emotional perception and regulation, the origins of known EEG phenomena, and the manner in which brain regions coordinate to affect behavior.
38.
Fernandes O, Portugal LCL, Alves RDCS, Arruda-Sanchez T, Volchan E, Pereira MG, Mourão-Miranda J, Oliveira L. How do you perceive threat? It's all in your pattern of brain activity. Brain Imaging Behav 2019; 14:2251-2266. PMID: 31446554; PMCID: PMC7648008; DOI: 10.1007/s11682-019-00177-6.
Abstract
Whether subtle differences in the emotional context during threat perception can be detected by multi-voxel pattern analysis (MVPA) remains a topic of debate. To investigate this question, we compared the ability of pattern recognition analysis to discriminate between patterns of brain activity evoked by a threatening versus a physically paired neutral stimulus in two different emotional contexts (the stimulus being directed towards or away from the viewer). The directionality of the stimuli is known to be an important factor in activating different defensive responses. Using multiple kernel learning (MKL) classification models, we accurately discriminated patterns of brain activation to threat versus neutral stimuli in the directed-towards context but not in the directed-away context. Furthermore, we investigated whether an individual's subjective threat perception could be decoded from patterns of whole-brain activity to threatening stimuli in the different emotional contexts using MKL regression models. Interestingly, we were able to accurately predict the subjective threat perception index from the pattern of brain activation to threat only during the directed-away context. These results show that subtle differences in the emotional context during threat perception can be detected by MVPA. In the directed-towards context, threat perception was more intense, potentially producing more homogeneous patterns of brain activation across individuals. In the directed-away context, threat perception was relatively less intense and more variable across individuals, enabling the regression model to capture the individual differences and predict subjective threat perception.
Collapse
Affiliation(s)
- Orlando Fernandes: Laboratory of Neuroimaging and Psychophysiology, Department of Radiology, Faculty of Medicine, Federal University of Rio de Janeiro, 255 Rodolpho Paulo Rocco st., Ilha do Fundão, Rio de Janeiro, RJ, 21941-590, Brazil
- Liana Catrina Lima Portugal: Laboratory of Behavioral Neurophysiology, Department of Physiology and Pharmacology, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
- Rita de Cássia S Alves: Laboratory of Behavioral Neurophysiology, Department of Physiology and Pharmacology, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil; IBMR University Center, Rio de Janeiro, RJ, Brazil
- Tiago Arruda-Sanchez: Laboratory of Neuroimaging and Psychophysiology, Department of Radiology, Faculty of Medicine, Federal University of Rio de Janeiro, 255 Rodolpho Paulo Rocco st., Ilha do Fundão, Rio de Janeiro, RJ, 21941-590, Brazil
- Eliane Volchan: Laboratory of Neurobiology II, Institute of Biophysics Carlos Chagas Filho, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, Brazil
- Mirtes Garcia Pereira: Laboratory of Behavioral Neurophysiology, Department of Physiology and Pharmacology, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
- Janaina Mourão-Miranda: Centre for Medical Image Computing, Department of Computer Science, University College London, London, UK; Max Planck University College London Centre for Computational Psychiatry and Ageing Research, University College London, London, UK
- Letícia Oliveira: Laboratory of Behavioral Neurophysiology, Department of Physiology and Pharmacology, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
39
The neural representation of facial-emotion categories reflects conceptual structure. Proc Natl Acad Sci U S A 2019; 116:15861-15870. [PMID: 31332015] [DOI: 10.1073/pnas.1816408116]
Abstract
Humans reliably categorize configurations of facial actions into specific emotion categories, leading some to argue that this process is invariant across individuals and cultures. However, growing behavioral evidence suggests that factors such as emotion-concept knowledge may shape the way emotions are visually perceived, leading to variability, rather than universality, in facial-emotion perception. Understanding of this variability is only beginning to emerge, and the neural basis of any influence of the structure of emotion-concept knowledge remains unknown. In a neuroimaging study, we used a representational similarity analysis (RSA) approach to measure the correspondence between the conceptual, perceptual, and neural representational structures of six emotion categories: Anger, Disgust, Fear, Happiness, Sadness, and Surprise. We found that subjects exhibited individual differences in their conceptual structure of emotions, which predicted their own unique perceptual structure. When subjects viewed faces, the representational structure of multivoxel patterns in the right fusiform gyrus was significantly predicted by their unique conceptual structure, even when controlling for potential physical similarity among the faces themselves. Finally, we also observed cross-cultural differences in emotion perception, which could be explained by individual differences in conceptual structure. Our results suggest that the representational structure of emotion expressions in visual face-processing regions may be shaped by idiosyncratic conceptual understanding of emotion categories.
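The core RSA computation named above reduces to correlating the off-diagonal structure of two representational matrices. The 6x6 matrices below are synthetic stand-ins (not the study's conceptual ratings or fMRI data), built so that the "neural" structure is a noisy copy of the "conceptual" one:

```python
import numpy as np

rng = np.random.default_rng(1)
emotions = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]
n = len(emotions)

# Hypothetical symmetric similarity matrix from conceptual ratings.
conceptual = rng.uniform(0, 1, (n, n))
conceptual = (conceptual + conceptual.T) / 2
np.fill_diagonal(conceptual, 1.0)

# Hypothetical "neural" matrix: the conceptual structure plus noise,
# so the two structures should correlate strongly.
neural = conceptual + rng.normal(0, 0.05, (n, n))
neural = (neural + neural.T) / 2
np.fill_diagonal(neural, 1.0)

# RSA compares only the off-diagonal structure: vectorize the upper
# triangles (n*(n-1)/2 pairs) and correlate them.
iu = np.triu_indices(n, k=1)
r = np.corrcoef(conceptual[iu], neural[iu])[0, 1]
print(f"conceptual-neural similarity correlation: r = {r:.2f}")
```

Published RSA work typically uses rank (Spearman) correlation computed per subject; plain Pearson correlation is used here only to keep the sketch short.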
40
Grisendi T, Reynaud O, Clarke S, Da Costa S. Processing pathways for emotional vocalizations. Brain Struct Funct 2019; 224:2487-2504. [DOI: 10.1007/s00429-019-01912-x]
41
Kovarski K, Mennella R, Wong SM, Dunkley BT, Taylor MJ, Batty M. Enhanced Early Visual Responses During Implicit Emotional Faces Processing in Autism Spectrum Disorder. J Autism Dev Disord 2019; 49:871-886. [PMID: 30374763] [DOI: 10.1007/s10803-018-3787-3]
Abstract
Research on Autism Spectrum Disorder (ASD) has focused on the processing of socially relevant stimuli, such as faces. Nonetheless, before being 'social', faces are visual stimuli. The present magnetoencephalography study investigated the time course of brain activity during an implicit emotional task in visual emotion-related regions in 19 adults with ASD (mean age 26.3 ± 4.4 years) and 19 typically developed controls (26.4 ± 4 years). The results confirmed previously reported group differences in brain responses to emotion, including hypo-activation in the ASD group in the right fusiform gyrus around 150 ms. However, the ASD group also showed early enhanced activity in the occipital region. These results suggest that impaired face processing in ASD may be underpinned by atypical responses in primary visual areas.
Affiliation(s)
- Klara Kovarski: UMR 1253, iBrain, Université de Tours, Inserm, Centre Universitaire de PédoPsychiatrie, Tours, France; Department of Diagnostic Imaging, The Hospital for Sick Children, Toronto, Canada
- Rocco Mennella: Department of Diagnostic Imaging, The Hospital for Sick Children, Toronto, Canada; Laboratoire de neurosciences cognitives, INSERM U960, Département d'études cognitives, École Normale Supérieure, PSL Research University, Paris, France
- Simeon M Wong: Department of Diagnostic Imaging, The Hospital for Sick Children, Toronto, Canada; Neurosciences & Mental Health Program, The Hospital for Sick Children Research Institute, Toronto, Canada
- Benjamin T Dunkley: Department of Diagnostic Imaging, The Hospital for Sick Children, Toronto, Canada; Neurosciences & Mental Health Program, The Hospital for Sick Children Research Institute, Toronto, Canada; Department of Medical Imaging, University of Toronto, Toronto, Canada
- Margot J Taylor: Department of Diagnostic Imaging, The Hospital for Sick Children, Toronto, Canada; Neurosciences & Mental Health Program, The Hospital for Sick Children Research Institute, Toronto, Canada; Department of Medical Imaging, University of Toronto, Toronto, Canada; Department of Psychology, University of Toronto, Toronto, Canada
- Magali Batty: CERPPS, Université de Toulouse, Toulouse, France
42
Grupe DW, Schaefer SM, Lapate RC, Schoen AJ, Gresham LK, Mumford JA, Davidson RJ. Behavioral and neural indices of affective coloring for neutral social stimuli. Soc Cogn Affect Neurosci 2018; 13:310-320. [PMID: 29447377] [PMCID: PMC5836278] [DOI: 10.1093/scan/nsy011]
Abstract
Emotional processing often continues beyond the presentation of emotionally evocative stimuli, which can result in affective biasing or coloring of subsequently encountered events. Here, we describe neural correlates of affective coloring and examine how individual differences in affective style impact the magnitude of affective coloring. We conducted functional magnetic resonance imaging in 117 adults who passively viewed negative, neutral and positive pictures presented 2 s prior to neutral faces. Brain responses to neutral faces were modulated by the valence of preceding pictures, with greater activation for faces following negative (vs positive) pictures in the amygdala, dorsomedial and lateral prefrontal cortex, ventral visual cortices, posterior superior temporal sulcus, and angular gyrus. Three days after the magnetic resonance imaging scan, participants rated their memory and liking of previously encountered neutral faces. Individuals higher in trait positive affect and emotional reappraisal rated faces as more likable when preceded by emotionally arousing (negative or positive) pictures. In addition, greater amygdala responses to neutral faces preceded by positively valenced pictures were associated with greater memory for these faces 3 days later. Collectively, these results reveal individual differences in how emotions spill over onto the processing of unrelated social stimuli, resulting in persistent and affectively biased evaluations of such stimuli.
Affiliation(s)
- Daniel W Grupe, Stacey M Schaefer, Regina C Lapate, Andrew J Schoen, Lauren K Gresham, Jeanette A Mumford, Richard J Davidson: Center for Healthy Minds and Waisman Laboratory for Brain Imaging and Behavior, University of Wisconsin-Madison, Madison, WI 53706, USA
43
Rizzo G, Milardi D, Bertino S, Basile GA, Di Mauro D, Calamuneri A, Chillemi G, Silvestri G, Anastasi G, Bramanti A, Cacciola A. The Limbic and Sensorimotor Pathways of the Human Amygdala: A Structural Connectivity Study. Neuroscience 2018; 385:166-180. [DOI: 10.1016/j.neuroscience.2018.05.051]
44
Kesner L, Grygarová D, Fajnerová I, Lukavský J, Nekovářová T, Tintěra J, Zaytseva Y, Horáček J. Perception of direct vs. averted gaze in portrait paintings: An fMRI and eye-tracking study. Brain Cogn 2018; 125:88-99. [DOI: 10.1016/j.bandc.2018.06.004]
45
Kliemann D, Richardson H, Anzellotti S, Ayyash D, Haskins AJ, Gabrieli JDE, Saxe RR. Cortical responses to dynamic emotional facial expressions generalize across stimuli, and are sensitive to task-relevance, in adults with and without Autism. Cortex 2018; 103:24-43. [PMID: 29554540] [PMCID: PMC5988954] [DOI: 10.1016/j.cortex.2018.02.006]
Abstract
Individuals with Autism Spectrum Disorders (ASD) report difficulties extracting meaningful information from dynamic and complex social cues, like facial expressions. The nature and mechanisms of these difficulties remain unclear. Here we tested whether that difficulty can be traced to the pattern of activity in "social brain" regions, when viewing dynamic facial expressions. In two studies, adult participants (male and female) watched brief videos of a range of positive and negative facial expressions, while undergoing functional magnetic resonance imaging (Study 1: ASD n = 16, control n = 21; Study 2: ASD n = 22, control n = 30). Patterns of hemodynamic activity differentiated among facial emotional expressions in left and right superior temporal sulcus, fusiform gyrus, and parts of medial prefrontal cortex. In both control participants and high-functioning individuals with ASD, we observed (i) similar responses to emotional valence that generalized across facial expressions and animated social events; (ii) similar flexibility of responses to emotional valence, when manipulating the task-relevance of perceived emotions; and (iii) similar responses to a range of emotions within valence. Altogether, the data indicate that there was little or no group difference in cortical responses to isolated dynamic emotional facial expressions, as measured with fMRI. Difficulties with real-world social communication and social interaction in ASD may instead reflect differences in initiating and maintaining contingent interactions, or in integrating social information over time or context.
Affiliation(s)
- Dorit Kliemann: McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA
- Hilary Richardson: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
- Stefano Anzellotti: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
- Dima Ayyash: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
- Amanda J Haskins: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
- John D E Gabrieli: McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA
- Rebecca R Saxe: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
46
Greening SG, Mitchell DG, Smith FW. Spatially generalizable representations of facial expressions: Decoding across partial face samples. Cortex 2018; 101:31-43. [DOI: 10.1016/j.cortex.2017.11.016]
47
Influence of task instructions and stimuli on the neural network of face processing: An ALE meta-analysis. Cortex 2018; 103:240-255. [PMID: 29665467] [DOI: 10.1016/j.cortex.2018.03.011]
Abstract
Many neuroimaging studies have investigated the neural correlates of face processing. However, the location of face-preferential regions differs considerably between studies, possibly due to the use of different stimuli or tasks. Using activation likelihood estimation (ALE) meta-analyses, we aimed to (a) delineate regions consistently involved in face processing and (b) assess the influence of stimuli and task on the convergence of activation patterns. In total, we included 77 neuroimaging experiments in healthy subjects comparing face processing to a control condition. Results revealed a core face-processing network encompassing bilateral fusiform gyrus (FFG), inferior occipital gyrus (IOG), superior temporal sulcus/middle temporal gyrus (STS/MTG), amygdala, inferior frontal junction (IFJ) and inferior frontal gyrus (IFG), left anterior insula, and pre-supplementary motor area (pre-SMA). Furthermore, separate meta-analyses showed that while significant convergence across all task and stimulus conditions was found in bilateral amygdala, right IOG, right mid-FFG, and right IFG, convergence in IFJ, STS/MTG, right posterior FFG, left FFG, and pre-SMA differed between conditions. Thus, our results point to an occipito-frontal-amygdala system that is involved regardless of stimulus and attention, whereas the remaining regions of the face-processing network are influenced by the task-dependent focus on specific facial characteristics as well as the type of stimuli processed.
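The ALE procedure named in the title can be caricatured in one dimension: each experiment's reported foci are smoothed with a Gaussian kernel, combined into that experiment's modeled activation (MA) map, and the MA maps are merged across experiments as a union of probabilities, so the ALE score peaks where experiments converge. The coordinates, kernel width, and amplitude below are arbitrary illustration values, not the meta-analysis's actual parameters:

```python
import numpy as np

# 1D stand-in for brain space: coordinates 0..99.
x = np.arange(100.0)

def ma_map(foci, sigma=3.0, height=0.9):
    """Modeled activation map for one experiment: each reported focus is
    smoothed with a Gaussian (height < 1 so values stay proper
    probabilities), combined as a union across that experiment's foci."""
    p = np.zeros_like(x)
    for f in foci:
        g = height * np.exp(-((x - f) ** 2) / (2 * sigma ** 2))
        p = 1 - (1 - p) * (1 - g)   # union of independent probabilities
    return p

# Three hypothetical experiments; their foci converge near x = 50.
experiments = [[50, 20], [52], [48, 80]]
ma = [ma_map(f) for f in experiments]

# ALE value per coordinate: union of the MA maps across experiments.
ale = 1 - np.prod([1 - m for m in ma], axis=0)
print("peak ALE location:", int(np.argmax(ale)))
```

Real ALE additionally uses sample-size-dependent kernel widths and a permutation-based null distribution to threshold the resulting map; only the map-construction step is sketched here.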
48
Liang Y, Liu B, Li X, Wang P. Multivariate Pattern Classification of Facial Expressions Based on Large-Scale Functional Connectivity. Front Hum Neurosci 2018; 12:94. [PMID: 29615882] [PMCID: PMC5868121] [DOI: 10.3389/fnhum.2018.00094]
Abstract
How humans achieve efficient recognition of others' facial expressions is an important question in cognitive neuroscience, and previous studies have identified specific cortical regions that show preferential activation to facial expressions. However, the potential contributions of connectivity patterns to the processing of facial expressions remain unclear. The present functional magnetic resonance imaging (fMRI) study explored whether facial expressions could be decoded from functional connectivity (FC) patterns using multivariate pattern analysis combined with machine learning algorithms (fcMVPA). We employed a block-design experiment and collected neural activity while participants viewed facial expressions of six basic emotions (anger, disgust, fear, joy, sadness, and surprise). Both static and dynamic expression stimuli were included. A behavioral experiment after scanning confirmed the validity of the facial stimuli presented during the fMRI experiment in terms of classification accuracy and emotional intensity. We obtained whole-brain FC patterns for each facial expression and found that both static and dynamic facial expressions could be successfully decoded from them. Moreover, we identified expression-discriminative networks for static and dynamic facial expressions that extend beyond the conventional face-selective areas. Overall, these results reveal that large-scale FC patterns contain rich expression information, suggesting a novel mechanism for facial-expression recognition based on interactions between distributed brain regions.
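The fcMVPA feature construction described above (whole-brain FC patterns fed to a classifier) reduces, per condition, to computing an ROI-by-ROI correlation matrix and vectorizing its unique off-diagonal entries. A minimal sketch on synthetic time series follows; the ROI count and volume count are invented, not the study's parcellation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in: time series for 10 ROIs over 120 volumes,
# e.g. one block of one facial-expression condition.
n_rois, n_vols = 10, 120
ts = rng.normal(size=(n_rois, n_vols))

# Whole-brain FC pattern: ROI-by-ROI Pearson correlation matrix.
fc = np.corrcoef(ts)

# Feature vector for classification: the unique off-diagonal entries
# (upper triangle), i.e. n*(n-1)/2 connectivity values.
iu = np.triu_indices(n_rois, k=1)
features = fc[iu]
print("number of FC features:", features.size)  # 45 for 10 ROIs
```

In the actual study, such per-condition connectivity vectors would be passed to a cross-validated classifier; only the feature-extraction step is shown here.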
Affiliation(s)
- Yin Liang: School of Computer Science and Technology, Tianjin Key Laboratory of Cognitive Computing and Application, Tianjin University, Tianjin, China
- Baolin Liu: School of Computer Science and Technology, Tianjin Key Laboratory of Cognitive Computing and Application, Tianjin University, Tianjin, China; State Key Laboratory of Intelligent Technology and Systems, National Laboratory for Information Science and Technology, Tsinghua University, Beijing, China
- Xianglin Li: Medical Imaging Research Institute, Binzhou Medical University, Yantai, China
- Peiyuan Wang: Department of Radiology, Yantai Affiliated Hospital of Binzhou Medical University, Yantai, China
49
Dobs K, Schultz J, Bülthoff I, Gardner JL. Task-dependent enhancement of facial expression and identity representations in human cortex. Neuroimage 2018; 172:689-702. [PMID: 29432802] [DOI: 10.1016/j.neuroimage.2018.02.013]
Abstract
What cortical mechanisms allow humans to easily discern the expression or identity of a face? Subjects detected changes in expression or identity of a stream of dynamic faces while we measured BOLD responses from topographically and functionally defined areas throughout the visual hierarchy. Responses in dorsal areas increased during the expression task, whereas responses in ventral areas increased during the identity task, consistent with previous studies. Similar to ventral areas, early visual areas showed increased activity during the identity task. If visual responses are weighted by perceptual mechanisms according to their magnitude, these increased responses would lead to improved attentional selection of the task-appropriate facial aspect. Alternatively, increased responses could be a signature of a sensitivity enhancement mechanism that improves representations of the attended facial aspect. Consistent with the latter sensitivity enhancement mechanism, attending to expression led to enhanced decoding of exemplars of expression both in early visual and dorsal areas relative to attending identity. Similarly, decoding identity exemplars when attending to identity was improved in dorsal and ventral areas. We conclude that attending to expression or identity of dynamic faces is associated with increased selectivity in representations consistent with sensitivity enhancement.
Affiliation(s)
- Katharina Dobs: Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max-Planck-Ring 8, 72076 Tübingen, Germany; Laboratory for Human Systems Neuroscience, RIKEN Brain Science Institute, 2-1 Hirosawa, Wako, Saitama 351-0198, Japan; Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, 43 Vassar Street, Cambridge, MA 02139, USA
- Johannes Schultz: Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max-Planck-Ring 8, 72076 Tübingen, Germany; Division of Medical Psychology and Department of Psychiatry, University of Bonn, Sigmund Freud Str. 25, 53105 Bonn, Germany
- Isabelle Bülthoff: Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max-Planck-Ring 8, 72076 Tübingen, Germany
- Justin L Gardner: Laboratory for Human Systems Neuroscience, RIKEN Brain Science Institute, 2-1 Hirosawa, Wako, Saitama 351-0198, Japan; Department of Psychology, Stanford University, 450 Serra Mall, Stanford, CA 94305, USA
50
Yang X, Xu J, Cao L, Li X, Wang P, Wang B, Liu B. Linear Representation of Emotions in Whole Persons by Combining Facial and Bodily Expressions in the Extrastriate Body Area. Front Hum Neurosci 2018; 11:653. [PMID: 29375348] [PMCID: PMC5767685] [DOI: 10.3389/fnhum.2017.00653]
Abstract
The human brain can rapidly and effortlessly perceive a person's emotional state by integrating isolated emotional faces and bodies into a whole. Behavioral studies have suggested that the human brain encodes whole persons in a holistic rather than part-based manner. Neuroimaging studies have also shown that body-selective areas prefer whole persons to the sum of their parts and play a crucial role in representing the relationships between emotions expressed by different parts. However, it remains unclear in which regions the perception of whole persons is represented by a combination of faces and bodies, and to what extent this combination is influenced by the whole person's emotion. In the present study, functional magnetic resonance imaging data were collected while participants performed an emotion-distinction task. Multi-voxel pattern analysis was conducted to examine how the whole-person-evoked responses were associated with the face- and body-evoked responses in several specific brain areas. We found that in the extrastriate body area (EBA), the whole-person patterns were most closely correlated with weighted sums of face and body patterns, with different weights for happy expressions but equal weights for angry and fearful ones. These results were unique to the EBA. Our findings tentatively support the idea that whole-person patterns are represented in a part-based manner in the EBA and are modulated by emotion. These data will further our understanding of the neural mechanisms underlying the perception of emotional persons.
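The weighted-sum relationship tested in the EBA can be illustrated with a least-squares fit: given face, body, and whole-person multivoxel patterns, estimate the weights that best combine the part patterns into the whole. The data below are simulated with known weights (0.7/0.3) purely to show the computation, not the study's actual estimates:

```python
import numpy as np

rng = np.random.default_rng(3)
n_vox = 200

# Hypothetical multivoxel patterns in a body-selective ROI.
face = rng.normal(size=n_vox)
body = rng.normal(size=n_vox)

# Simulate a "whole person" pattern as an unequal weighted sum plus
# noise (the kind of relation reported for happy expressions).
whole = 0.7 * face + 0.3 * body + rng.normal(0, 0.1, n_vox)

# Recover the weights by least squares: whole ~ w_f*face + w_b*body.
A = np.column_stack([face, body])
w, *_ = np.linalg.lstsq(A, whole, rcond=None)
print(f"estimated weights: face={w[0]:.2f}, body={w[1]:.2f}")
```

Equal recovered weights would correspond to the pattern reported for angry and fearful expressions; unequal weights, as simulated here, to the pattern reported for happy ones.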
Affiliation(s)
- Xiaoli Yang: School of Computer Science and Technology, Tianjin Key Laboratory of Cognitive Computing and Applications, Tianjin University, Tianjin, China
- Junhai Xu: School of Computer Science and Technology, Tianjin Key Laboratory of Cognitive Computing and Applications, Tianjin University, Tianjin, China
- Linjing Cao: School of Computer Science and Technology, Tianjin Key Laboratory of Cognitive Computing and Applications, Tianjin University, Tianjin, China
- Xianglin Li: Medical Imaging Research Institute, Binzhou Medical University, Yantai, China
- Peiyuan Wang: Department of Radiology, Yantai Affiliated Hospital of Binzhou Medical University, Yantai, China
- Bin Wang: Medical Imaging Research Institute, Binzhou Medical University, Yantai, China
- Baolin Liu: School of Computer Science and Technology, Tianjin Key Laboratory of Cognitive Computing and Applications, Tianjin University, Tianjin, China; State Key Laboratory of Intelligent Technology and Systems, National Laboratory for Information Science and Technology, Tsinghua University, Beijing, China