1. Font-I-Furnols M, Guerrero L. An overview of drivers and emotions of meat consumption. Meat Sci 2025; 219:109619. PMID: 39181809. DOI: 10.1016/j.meatsci.2024.109619.
Abstract
Emotions are present in almost everything we do, including meat procurement, preparation and consumption. This paper examines the main drivers of meat consumption, including sensory and hedonic properties, physiological needs, historical reasons and habits, social influence, ethical motives, practical aspects and other determinants, exploring meat-related emotions both as an outcome of consumption and as consumption drivers. Emotions are affected by multiple factors relating to the context, the information provided, and the type of product. Positive emotions such as pleasure, satisfaction, pride and joy have been described in relation to meat, as well as some neutral or negative ones. To enhance positive emotions and increase meat liking, it is essential to improve animal welfare and promote more sustainable production, focusing on nutritional and sensory quality and providing consumers with reliable information.
Affiliation(s)
- Luis Guerrero
- IRTA-Food Quality and Technology, Finca Camps i Armet, Monells, Girona, Spain
2. Nordén F, Iravani B, Schaefer M, Winter AL, Lundqvist M, Arshamian A, Lundström JN. The human olfactory bulb communicates perceived odor valence to the piriform cortex in the gamma band and receives a refined representation back in the beta band. PLoS Biol 2024; 22:e3002849. PMID: 39401242. PMCID: PMC11501019. DOI: 10.1371/journal.pbio.3002849.
Abstract
A core function of the olfactory system is to determine the valence of odors. In humans, central processing of odor valence has been shown to begin as early as the olfactory bulb (OB), but the neural mechanisms by which this important information is communicated to, and from, the olfactory cortex (piriform cortex, PC) are not known. To assess communication between the two nodes, we simultaneously measured odor-dependent neural activity in the OB and PC from human participants while obtaining trial-by-trial valence ratings. By doing so, we could determine when subjective valence information was communicated, what kind of information was transferred, and how the information was transferred (i.e., in which frequency band). Support vector machine (SVM) learning was applied to the coherence spectrum and frequency-resolved Granger causality to identify valence-dependent differences in functional and effective connectivity between the OB and PC. We found that the OB communicates subjective odor valence to the PC in the gamma band shortly after odor onset, while the PC subsequently feeds broader valence-related information back to the OB in the beta band. Decoding accuracy was better for negative than positive valence, suggesting a focus on negative valence. Critically, we replicated these findings in an independent data set using additional odors across a larger perceived valence range. Combined, these results demonstrate that the OB and PC communicate levels of subjective odor pleasantness across multiple frequencies, at specific time points, in a direction-dependent pattern in accordance with a two-stage model of odor processing.
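The decoding idea described in this abstract (classifying condition from the coherence spectrum between two signals with an SVM) can be sketched as follows. This is a hypothetical illustration on simulated signals with an invented coupling manipulation, not the authors' pipeline:

```python
# Sketch: decode trial condition from the magnitude-squared coherence
# spectrum between two simultaneously recorded signals, using a linear SVM.
# All signals and parameters here are simulated for illustration.
import numpy as np
from scipy.signal import coherence
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
fs = 256                      # sampling rate (Hz), illustrative
n_trials, n_samp = 60, 512

X, y = [], []
for trial in range(n_trials):
    label = trial % 2
    shared = rng.standard_normal(n_samp)   # common drive -> coherence
    mix = 0.8 if label else 0.2            # condition changes coupling strength
    sig_a = shared + 0.5 * rng.standard_normal(n_samp)
    sig_b = mix * shared + 0.5 * rng.standard_normal(n_samp)
    f, coh = coherence(sig_a, sig_b, fs=fs, nperseg=128)
    X.append(coh)                          # coherence spectrum as feature vector
    y.append(label)

X, y = np.array(X), np.array(y)
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"decoding accuracy: {acc:.2f}")     # well above chance (0.5) here
```

In the study, feature selection would additionally be frequency-resolved, so that decodable bands (gamma, beta) can be identified separately.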
Affiliation(s)
- Frans Nordén
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Behzad Iravani
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Department of Neurology, Stanford School of Medicine, Stanford, California, United States of America
- Martin Schaefer
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Anja L. Winter
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Mikael Lundqvist
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Artin Arshamian
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Johan N. Lundström
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Monell Chemical Senses Center, Philadelphia, Pennsylvania, United States of America
- Stockholm University Brain Imaging Centre, Stockholm University, Stockholm, Sweden
3. Kiyokawa H, Hayashi R. Commonalities and variations in emotion representation across modalities and brain regions. Sci Rep 2024; 14:20992. PMID: 39251743. PMCID: PMC11385795. DOI: 10.1038/s41598-024-71690-y.
Abstract
Humans express emotions through various modalities such as facial expressions and natural language. However, the relationships between emotions expressed through different modalities and their correlations with neural activities remain uncertain. Here, we aimed to resolve some of these uncertainties by investigating the similarity of emotion representations across modalities and brain regions. First, we represented various emotion categories as multi-dimensional vectors derived from visual (face), linguistic, and visio-linguistic data, and used representational similarity analysis to compare these modalities. Second, we examined the linear transferability of emotion representation from other modalities to the visual modality. Third, we compared the representational structure derived in the first step with those from brain activities across 360 regions. Our findings revealed that emotion representations share commonalities across modalities with modality-type dependent variations, and that they can be linearly mapped from other modalities to the visual modality. Additionally, emotion representations in single modalities showed relatively higher similarity with specific brain regions, while the multi-modal emotion representation was most similar to representations across the brain as a whole. These findings suggest that emotional experiences are represented differently across various brain regions with varying degrees of similarity to different modality types, and that they may be multi-modally conveyable in visual and linguistic domains.
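Representational similarity analysis (RSA), the method named in this abstract, compares the geometry of two representations rather than the representations themselves. A minimal sketch, using invented emotion-category embeddings for two "modalities" (the second is a noisy copy of the first, so their geometries should agree):

```python
# Minimal RSA sketch: build a representational dissimilarity matrix (RDM)
# per modality, then correlate the RDMs (second-order similarity).
# Embeddings are random stand-ins, not data from the paper.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_categories, dim = 8, 20          # e.g. 8 emotion categories

emb_a = rng.standard_normal((n_categories, dim))          # "modality A"
emb_b = emb_a + 0.3 * rng.standard_normal((n_categories, dim))  # noisy "modality B"

# Upper-triangle pairwise dissimilarities between category vectors.
rdm_a = pdist(emb_a, metric="correlation")
rdm_b = pdist(emb_b, metric="correlation")

rho, _ = spearmanr(rdm_a, rdm_b)   # rank-correlate the two geometries
print(f"RSA correlation: {rho:.2f}")
```

The same comparison can then be run between a modality RDM and an RDM built from brain activity in each region, which is how the region-wise similarities in the abstract would be obtained.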
Affiliation(s)
- Hiroaki Kiyokawa
- Human Informatics and Interaction Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba, Ibaraki, Japan
- Graduate School of Science and Engineering, Saitama University, Saitama, Japan
- Ryusuke Hayashi
- Human Informatics and Interaction Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba, Ibaraki, Japan
4. Gao Y, Lin W, Liu J, Chen Y, Xiao C, Chen J, Mo L. Emotional contextual effects of face perception: a test of the affective realism hypothesis. J Gen Psychol 2024:1-28. PMID: 39023941. DOI: 10.1080/00221309.2024.2378326.
Abstract
Affective feelings naturally infuse individuals' perceptions, serving as valid windows onto the real world. The affective realism hypothesis further explains how these feelings work: as properties of individuals' perceptual experiences, they influence perception. Although this hypothesis has received support for affective feelings of different valences, the existing evidence is not compelling enough. Moreover, whether specific affective feelings can be experienced as properties of target perception remains unclear. Addressing these two issues deepens our understanding of the nature of emotional representation. Hence, we investigated the affective realism hypothesis for affective feelings with different valences and for specific emotions, comparing it with the affective misattribution hypothesis. In Experiment 1, we examined the effects of affective feelings of various valences on target perception through the affect misattribution (AM) paradigm (1a) and the continuous flash suppression (CFS) paradigm (1b). In Experiment 2, we investigated the effects of anger, sadness, and disgust using similar methods. Results from Experiments 1a and 1b consistently indicated significant differences in valence ratings of neutral faces under emotional contexts with varying valences. Experiment 2a revealed significant differences in specific-emotion ratings of neutral faces under different specific emotional contexts in the AM paradigm, whereas such differences were not observed in the CFS paradigm in Experiment 2b. We concluded that affective feelings with different valences, rather than specific emotions, can be experienced as inherent properties of target perception, validating the affective realism hypothesis. These findings support the view that the nature of emotional representation should be described in terms of affective dimensions.
Affiliation(s)
- Yuan Gao
- Center for Studies of Psychological Application, South China Normal University, Guangzhou, China
- Key Laboratory of Brain, Cognition and Education Sciences, South China Normal University, Ministry of Education, Guangzhou, China
- School of Psychology, South China Normal University, Guangzhou, China
- School of Psychology, Guangdong Key Laboratory of Mental Health and Cognitive Science, Guangzhou, China
- Wuji Lin
- Center for Studies of Psychological Application, South China Normal University, Guangzhou, China
- Key Laboratory of Brain, Cognition and Education Sciences, South China Normal University, Ministry of Education, Guangzhou, China
- School of Psychology, South China Normal University, Guangzhou, China
- School of Psychology, Guangdong Key Laboratory of Mental Health and Cognitive Science, Guangzhou, China
- Jiaxi Liu
- Center for Studies of Psychological Application, South China Normal University, Guangzhou, China
- Key Laboratory of Brain, Cognition and Education Sciences, South China Normal University, Ministry of Education, Guangzhou, China
- School of Psychology, South China Normal University, Guangzhou, China
- School of Psychology, Guangdong Key Laboratory of Mental Health and Cognitive Science, Guangzhou, China
- Yujie Chen
- Center for Studies of Psychological Application, South China Normal University, Guangzhou, China
- Key Laboratory of Brain, Cognition and Education Sciences, South China Normal University, Ministry of Education, Guangzhou, China
- School of Psychology, South China Normal University, Guangzhou, China
- School of Psychology, Guangdong Key Laboratory of Mental Health and Cognitive Science, Guangzhou, China
- Chunqian Xiao
- Center for Studies of Psychological Application, South China Normal University, Guangzhou, China
- Key Laboratory of Brain, Cognition and Education Sciences, South China Normal University, Ministry of Education, Guangzhou, China
- School of Psychology, South China Normal University, Guangzhou, China
- School of Psychology, Guangdong Key Laboratory of Mental Health and Cognitive Science, Guangzhou, China
- Jiexin Chen
- Center for Studies of Psychological Application, South China Normal University, Guangzhou, China
- Key Laboratory of Brain, Cognition and Education Sciences, South China Normal University, Ministry of Education, Guangzhou, China
- School of Psychology, South China Normal University, Guangzhou, China
- School of Psychology, Guangdong Key Laboratory of Mental Health and Cognitive Science, Guangzhou, China
- Lei Mo
- Center for Studies of Psychological Application, South China Normal University, Guangzhou, China
- Key Laboratory of Brain, Cognition and Education Sciences, South China Normal University, Ministry of Education, Guangzhou, China
- School of Psychology, South China Normal University, Guangzhou, China
- School of Psychology, Guangdong Key Laboratory of Mental Health and Cognitive Science, Guangzhou, China
5. Costa T, Ferraro M, Manuello J, Camasio A, Nani A, Mancuso L, Cauda F, Fox PT, Liloia D. Activation Likelihood Estimation Neuroimaging Meta-Analysis: a Powerful Tool for Emotion Research. Psychol Res Behav Manag 2024; 17:2331-2345. PMID: 38882233. PMCID: PMC11179639. DOI: 10.2147/prbm.s453035.
Abstract
Over the past two decades, functional magnetic resonance imaging (fMRI) has become the primary tool for exploring neural correlates of emotion. To enhance the reliability of results in understanding the complex nature of emotional experiences, researchers combine findings from multiple fMRI studies using coordinate-based meta-analysis (CBMA). As one of the most widely employed CBMA methods worldwide, activation likelihood estimation (ALE) is of great importance in affective neuroscience and neuropsychology. This comprehensive review provides an introductory guide for implementing the ALE method in emotion research, outlining the experimental steps involved. By presenting a case study about the emotion of disgust, with regard to both its core and social processing, we offer insightful commentary as to how ALE can enable researchers to produce consistent results and, consequently, fruitfully investigate the neural mechanisms underpinning emotions, facilitating further progress in this field.
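The core ALE computation behind this guide can be sketched compactly. The toy example below is a hypothetical 1-D illustration with invented peak coordinates and no null-distribution thresholding, not the authors' implementation:

```python
# Toy activation likelihood estimation (ALE) on a 1-D "brain": each study's
# reported foci are blurred with a Gaussian into a modeled activation (MA)
# map, and study maps are combined as a probabilistic union.
import numpy as np

voxels = np.arange(100)
foci_per_study = [[40, 40], [42], [41, 70]]   # hypothetical peak coordinates
sigma = 3.0                                   # spatial uncertainty (voxels)

ma_maps = []
for foci in foci_per_study:
    ma = np.zeros(voxels.shape, dtype=float)
    for focus in foci:
        g = 0.9 * np.exp(-0.5 * ((voxels - focus) / sigma) ** 2)
        ma = np.maximum(ma, g)        # one MA map per study (max over its foci)
    ma_maps.append(ma)

# ALE score: probability that at least one study activates each voxel.
ale = 1.0 - np.prod([1.0 - ma for ma in ma_maps], axis=0)
peak = int(np.argmax(ale))
print(f"ALE peak at voxel {peak}")    # convergence near the clustered foci
```

A real analysis (e.g. in GingerALE) works in 3-D MNI space, sets the Gaussian width from the study's sample size, and assesses the ALE map against a permutation-based null distribution.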
Affiliation(s)
- Tommaso Costa
- GCS-fMRI, Koelliker Hospital and Department of Psychology, University of Turin, Turin, Italy
- FOCUS Laboratory, Department of Psychology, University of Turin, Turin, Italy
- Mario Ferraro
- GCS-fMRI, Koelliker Hospital and Department of Psychology, University of Turin, Turin, Italy
- FOCUS Laboratory, Department of Psychology, University of Turin, Turin, Italy
- Department of Physics, University of Turin, Turin, Italy
- Jordi Manuello
- GCS-fMRI, Koelliker Hospital and Department of Psychology, University of Turin, Turin, Italy
- FOCUS Laboratory, Department of Psychology, University of Turin, Turin, Italy
- Alessia Camasio
- FOCUS Laboratory, Department of Psychology, University of Turin, Turin, Italy
- Department of Physics, University of Turin, Turin, Italy
- Andrea Nani
- FOCUS Laboratory, Department of Psychology, University of Turin, Turin, Italy
- Lorenzo Mancuso
- FOCUS Laboratory, Department of Psychology, University of Turin, Turin, Italy
- Franco Cauda
- GCS-fMRI, Koelliker Hospital and Department of Psychology, University of Turin, Turin, Italy
- FOCUS Laboratory, Department of Psychology, University of Turin, Turin, Italy
- Peter T Fox
- Research Imaging Institute, University of Texas Health Science Center at San Antonio, San Antonio, TX, USA
- Biggs Institute for Alzheimer's and Neurodegenerative Diseases, University of Texas Health Science Center at San Antonio, San Antonio, TX, USA
- Donato Liloia
- GCS-fMRI, Koelliker Hospital and Department of Psychology, University of Turin, Turin, Italy
- FOCUS Laboratory, Department of Psychology, University of Turin, Turin, Italy
6. Gonuguntla V, Adebisi AT, Veluvolu KC. Identification of Optimal and Most Significant Event Related Brain Functional Network. IEEE Trans Neural Syst Rehabil Eng 2024; 32:1906-1915. PMID: 38722721. DOI: 10.1109/tnsre.2024.3399308.
Abstract
Advancements in network science have facilitated the study of brain communication networks. Existing techniques for identifying event-related brain functional networks (BFNs) often result in fully connected networks. However, determining the optimal and most significant network representation for event-related BFNs is crucial for understanding complex brain networks. The presence of both false and genuine connections in the fully connected network requires network thresholding to eliminate false connections. However, a generalized framework for thresholding in network neuroscience is currently lacking. To address this, we propose four novel methods that leverage network properties, energy, and efficiency to select a generalized threshold level. This threshold serves as the basis for identifying the optimal and most significant event-related BFN. We validate our methods on an openly available emotion dataset and demonstrate their effectiveness in identifying multiple events. Our proposed approach can serve as a versatile thresholding technique to represent the fully connected network as an event-related BFN.
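The thresholding step this abstract addresses can be illustrated with a simple density criterion: keep only the strongest fraction of edges of the fully connected network. This stand-in criterion is for illustration only; the paper's own threshold-selection methods (based on network properties, energy, and efficiency) are not reproduced here.

```python
# Density-based thresholding of a fully connected functional network:
# zero out all but the strongest `density` fraction of edges.
# The weight matrix is random, standing in for e.g. a coherence matrix.
import numpy as np

rng = np.random.default_rng(2)
n = 8
W = rng.random((n, n))
W = (W + W.T) / 2            # symmetric weights, like correlation/coherence
np.fill_diagonal(W, 0.0)

def threshold_by_density(W, density):
    """Keep only the strongest `density` fraction of (undirected) edges."""
    triu = np.triu_indices_from(W, k=1)
    weights = W[triu]
    k = max(1, int(round(density * weights.size)))
    cut = np.sort(weights)[-k]            # k-th largest weight
    A = np.where(W >= cut, W, 0.0)
    np.fill_diagonal(A, 0.0)
    return A

A = threshold_by_density(W, density=0.25)
kept = np.count_nonzero(np.triu(A, k=1))
print(f"edges kept: {kept} of {n * (n - 1) // 2}")
```

Criteria like those in the paper would instead sweep the threshold and pick the level that optimizes a network property (e.g. efficiency) rather than fixing the density in advance.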
7. Liu J, Hu X, Shen X, Song S, Zhang D. Electrophysiological representations of multivariate human emotion experience. Cogn Emot 2024; 38:378-388. PMID: 38147431. DOI: 10.1080/02699931.2023.2297272.
Abstract
Despite the fact that human daily emotions are co-occurring by nature, most neuroscience studies have primarily adopted a univariate approach to identify the neural representation of emotion (emotion experience within a single emotion category) without adequate consideration of the co-occurrence of different emotions (emotion experience across different emotion categories simultaneously). To investigate the neural representations of multivariate emotion experience, this study employed the inter-situation representational similarity analysis (RSA) method, using an EEG dataset of 78 participants who watched 28 video clips and rated their experience on eight emotion categories. The EEG-based electrophysiological representation was extracted as the power spectral density (PSD) feature per channel in the five frequency bands. The inter-situation RSA method revealed significant correlations between the multivariate emotion experience ratings and PSD features in the Alpha and Beta bands, primarily over the frontal and parietal-occipital brain regions. The identified EEG representations proved reliable given sufficient situations and participants. Moreover, through a series of ablation analyses, the inter-situation RSA further demonstrated the stability and specificity of the EEG representations for multivariate emotion experience. These findings highlight the importance of adopting a multivariate perspective for a comprehensive understanding of the neural representation of human emotion experience.
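The PSD feature extraction step named in this abstract can be sketched as Welch spectra averaged within canonical frequency bands. The band edges and data dimensions below are conventional illustrative choices, not necessarily those used in the paper:

```python
# Sketch of per-channel band-power feature extraction: Welch power spectral
# density, averaged within canonical EEG bands. Data are simulated noise.
import numpy as np
from scipy.signal import welch

fs = 250                                  # sampling rate (Hz), illustrative
bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

rng = np.random.default_rng(3)
eeg = rng.standard_normal((32, fs * 4))   # 32 channels, 4 s of data

f, psd = welch(eeg, fs=fs, nperseg=fs)    # psd: (channels, frequencies)

features = np.column_stack([
    psd[:, (f >= lo) & (f < hi)].mean(axis=1)   # mean power per band
    for lo, hi in bands.values()
])
print(features.shape)                     # (32 channels, 5 bands)
```

Inter-situation RSA would then build one such feature vector per video clip (situation) and correlate the between-situation similarity structure of the features with that of the emotion ratings.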
Affiliation(s)
- Jin Liu
- Department of Biomedical Engineering, Tsinghua University, Beijing, China
- Tsinghua Laboratory of Brain and Intelligence, Beijing, China
- Xin Hu
- Department of Psychiatry, School of Medicine, University of Pittsburgh, Pittsburgh, USA
- Xinke Shen
- Department of Biomedical Engineering, Southern University of Science and Technology, Shenzhen, China
- Sen Song
- Department of Biomedical Engineering, Tsinghua University, Beijing, China
- Tsinghua Laboratory of Brain and Intelligence, Beijing, China
- Dan Zhang
- Tsinghua Laboratory of Brain and Intelligence, Beijing, China
- Department of Psychology, Tsinghua University, Beijing, China
8. Kuai C, Pu J, Wang D, Tan Z, Wang Y, Xue SW. The association between gray matter volume in the hippocampal subfield and antidepressant efficacy mediated by abnormal dynamic functional connectivity. Sci Rep 2024; 14:8940. PMID: 38637536. PMCID: PMC11026377. DOI: 10.1038/s41598-024-56866-w.
Abstract
Abnormalities of hippocampal structure and function may play a key role in the pathophysiology of major depressive disorder (MDD). However, it is unclear whether structural properties of the hippocampus influence antidepressant response via hippocampal functional activity in MDD patients. We collected longitudinal data from 36 MDD patients before and after a 3-month course of antidepressant pharmacotherapy. Additionally, we obtained baseline data from 43 healthy controls matched for sex and age. Using resting-state functional magnetic resonance imaging (rs-fMRI), we estimated the dynamic functional connectivity (dFC) of the hippocampal subregions using a sliding-window method. Gray matter volume was calculated using voxel-based morphometry (VBM). The results indicated that patients with MDD exhibited significantly lower dFC of the left rostral hippocampus (rHipp.L) with the right precentral gyrus, left superior temporal gyrus and left postcentral gyrus compared to healthy controls at baseline. In MDD patients, the dFC of the rHipp.L with the right precentral gyrus at baseline was correlated with both the rHipp.L volume and the HAMD remission rate, and also mediated the effects of the rHipp.L volume on antidepressant performance. Our findings suggest that the interaction between hippocampal structure and functional activity might affect antidepressant performance, providing a novel insight into the hippocampus-related neurobiological mechanism of MDD.
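The sliding-window dFC estimation mentioned here reduces to computing windowed correlations between ROI time series. A minimal sketch on simulated signals (window length, step, and coupling are illustrative choices, not the paper's parameters):

```python
# Minimal sliding-window dynamic functional connectivity (dFC) between two
# ROI time series: Pearson correlation within each overlapping window.
import numpy as np

rng = np.random.default_rng(4)
n_vols = 200                                        # fMRI volumes
roi_a = rng.standard_normal(n_vols)
roi_b = 0.5 * roi_a + rng.standard_normal(n_vols)   # coupled signals

win, step = 30, 5                                   # in volumes (e.g. ~60 s at TR = 2 s)
dfc = np.array([
    np.corrcoef(roi_a[s:s + win], roi_b[s:s + win])[0, 1]
    for s in range(0, n_vols - win + 1, step)
])
print(f"{dfc.size} windows, mean r = {dfc.mean():.2f}")
```

Summary statistics of the resulting dFC time course (mean, variability, or state occupancies) are what get compared between patients and controls or correlated with clinical measures.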
Affiliation(s)
- Changxiao Kuai
- Center for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, No. 2318, Yuhangtang Rd, Hangzhou, 311121, Zhejiang Province, People's Republic of China
- Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang Province, People's Republic of China
- Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, Zhejiang Province, People's Republic of China
- Jiayong Pu
- Center for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, No. 2318, Yuhangtang Rd, Hangzhou, 311121, Zhejiang Province, People's Republic of China
- Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang Province, People's Republic of China
- Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, Zhejiang Province, People's Republic of China
- Donglin Wang
- Center for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, No. 2318, Yuhangtang Rd, Hangzhou, 311121, Zhejiang Province, People's Republic of China
- Zhonglin Tan
- Affiliated Mental Health Center & Hangzhou Seventh People's Hospital, Zhejiang University School of Medicine, Hangzhou, Zhejiang Province, People's Republic of China
- Yan Wang
- Center for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, No. 2318, Yuhangtang Rd, Hangzhou, 311121, Zhejiang Province, People's Republic of China
- Shao-Wei Xue
- Center for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, No. 2318, Yuhangtang Rd, Hangzhou, 311121, Zhejiang Province, People's Republic of China
- Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang Province, People's Republic of China
- Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, Zhejiang Province, People's Republic of China
9. Aydın S, Onbaşı L. Graph theoretical brain connectivity measures to investigate neural correlates of music rhythms associated with fear and anger. Cogn Neurodyn 2024; 18:49-66. PMID: 38406195. PMCID: PMC10881947. DOI: 10.1007/s11571-023-09931-5.
Abstract
The present study tests the hypothesis that fear and anger are associated with distinct psychophysiological and neural circuitry, in line with the discrete emotion model and their contrasting neurotransmitter activities, even though the two are often grouped together because of their similar arousal-valence scores in dimensional emotion models. EEG data were downloaded from the OpenNeuro platform (accession number ds002721). Brain connectivity estimates were obtained using both functional and effective connectivity estimators on short (2 s) and long (6 s) EEG segments across the cortex. Discrete emotions and resting states were characterized by frequency-band-specific brain network measures, and contrasting emotional states were then classified with 5-fold cross-validated Long Short-Term Memory networks. Logistic regression modeling was also examined to provide robust performance criteria. Overall, the best results were obtained using Partial Directed Coherence (PDC) in the Gamma sub-band (31.5-60.5 Hz) of short EEG segments; in particular, fear and anger were classified with an accuracy of 91.79%, supporting our hypothesis. Compared with fear, anger was characterized by increased transitivity, decreased local efficiency, and lower modularity in the Gamma band. Local efficiency reflects functional brain segregation, arising from the ability of the brain to exchange information locally. Transitivity refers to the overall probability that adjacent neural populations are interconnected, revealing the existence of tightly connected cortical regions. Modularity quantifies how well the brain can be partitioned into functional cortical regions. In conclusion, PDC-based graph theoretical analysis of short EEG epochs is proposed as a robust way to derive emotional indicators sensitive to the perception of affective sounds.
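The three graph measures contrasted in this abstract (transitivity, local efficiency, modularity) are standard and easy to compute with networkx. The toy graph below (two tight clusters joined by a bridge) is invented for illustration; in the study, the graph would come from a thresholded PDC connectivity matrix:

```python
# Graph-theoretical measures on a toy network: transitivity, local
# efficiency, and modularity of a community partition.
import networkx as nx

# Two tightly connected clusters joined by a single bridge edge.
G = nx.Graph()
G.add_edges_from([(0, 1), (0, 2), (1, 2),    # cluster A (triangle)
                  (3, 4), (3, 5), (4, 5),    # cluster B (triangle)
                  (2, 3)])                   # bridge

transitivity = nx.transitivity(G)            # global clustering: 3*triangles/triads
local_eff = nx.local_efficiency(G)           # mean efficiency of neighborhoods
communities = nx.community.greedy_modularity_communities(G)
modularity = nx.community.modularity(G, communities)

print(f"transitivity={transitivity:.2f}, local efficiency={local_eff:.2f}, "
      f"modularity={modularity:.2f}")
```

On this toy graph the two triangles are recovered as communities, so modularity is clearly positive; the paper's finding is that these quantities differ between anger- and fear-evoking conditions in the Gamma band.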
Affiliation(s)
- Serap Aydın
- Department of Biophysics, Faculty of Medicine, Hacettepe University, Sıhhiye, Ankara, Turkey
- Lara Onbaşı
- School of Medicine, Hacettepe University, Sıhhiye, Ankara, Turkey
10. Hedaya A, Ver Hoef L. "Amity Seizures": A previously unreported semiology localizing to a circuit between the right hippocampus and orbitofrontal area. Epilepsy Behav Rep 2024; 25:100649. PMID: 38323089. PMCID: PMC10844940. DOI: 10.1016/j.ebr.2024.100649.
Abstract
We describe a case of focal epilepsy with a semiology consisting of behaviors indicating an enthusiastic desire for those around the patient to get along and engage in friendly relations, which we refer to as "amity seizures". The patient was a 41-year-old right-handed male with seizures since age 26. Semiology consisted of stereotyped enthusiastic behaviors such as exclaiming "Peace! Peace!… Come on, we all on the same team, right?!", and giving hugs, kisses, and high-fives to those around him. On SEEG evaluation, two independent areas of seizure onset were identified: the right hippocampus and the right posterior orbitofrontal area. Locally confined seizures had bland manifestations. However, spread from the right hippocampus to the right orbitofrontal area, or vice versa, elicited his typical amity seizure semiology. To our knowledge this is the first report of the seizure semiology we have coined "amity seizures". While emotions were once thought to localize to discrete brain regions, they are now accepted to arise from networks across multiple brain regions. The fact that this behavior only occurred when seizures spread from either of the two onset zones to the other suggests that this semiology results from network engagement between, and likely beyond, either onset zone.
Affiliation(s)
- Alexander Hedaya
- Department of Neurology, University of Alabama at Birmingham, Birmingham, AL 35233, USA
- Lawrence Ver Hoef
- Department of Neurology, University of Alabama at Birmingham, Birmingham, AL 35233, USA
- Birmingham VA Medical Center, Neurology Service, Birmingham, AL 35233, USA
11. Zhang Z, Chen T, Liu Y, Wang C, Zhao K, Liu CH, Fu X. Decoding the temporal representation of facial expression in face-selective regions. Neuroimage 2023; 283:120442. PMID: 37926217. DOI: 10.1016/j.neuroimage.2023.120442.
Abstract
The ability of humans to discern facial expressions in a timely manner typically relies on distributed face-selective regions for rapid neural computations. To study the time course of this process in regions of interest, we used magnetoencephalography (MEG) to measure neural responses while participants viewed facial expressions depicting seven types of emotions (happiness, sadness, anger, disgust, fear, surprise, and neutral). Analysis of the time-resolved decoding of neural responses in face-selective sources within the inferior parietal cortex (IP-faces), lateral occipital cortex (LO-faces), fusiform gyrus (FG-faces), and posterior superior temporal sulcus (pSTS-faces) revealed that facial expressions were successfully classified starting from ∼100 to 150 ms after stimulus onset. Interestingly, the LO-faces and IP-faces showed greater accuracy than FG-faces and pSTS-faces. To examine the nature of the information processed in these face-selective regions, we entered the facial expression stimuli into a convolutional neural network (CNN) and performed similarity analyses against the human neural responses. The results showed that neural responses in the LO-faces and IP-faces, starting ∼100 ms after the stimuli, were more strongly correlated with deep representations of emotional categories than with image-level information from the input images. Additionally, we observed a relationship between behavioral performance and the neural responses in the LO-faces and IP-faces, but not in the FG-faces and pSTS-faces. Together, these results provide a comprehensive picture of the time course and nature of information involved in facial expression discrimination across multiple face-selective regions, which advances our understanding of how the human brain processes facial expressions.
Collapse
Affiliation(s)
- Zhihao Zhang
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
| | - Tong Chen
- Chongqing Key Laboratory of Non-Linear Circuit and Intelligent Information Processing, Southwest University, Chongqing 400715, China; Chongqing Key Laboratory of Artificial Intelligence and Service Robot Control Technology, Chongqing 400715, China
| | - Ye Liu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
| | - Chongyang Wang
- Department of Computer Science and Technology, Tsinghua University, Beijing 100084, China
| | - Ke Zhao
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China.
| | - Chang Hong Liu
- Department of Psychology, Bournemouth University, Dorset, United Kingdom
| | - Xiaolan Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China.
| |
|
12
|
Morgenroth E, Vilaclara L, Muszynski M, Gaviria J, Vuilleumier P, Van De Ville D. Probing neurodynamics of experienced emotions-a Hitchhiker's guide to film fMRI. Soc Cogn Affect Neurosci 2023; 18:nsad063. [PMID: 37930850 PMCID: PMC10656947 DOI: 10.1093/scan/nsad063] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/16/2023] [Revised: 08/04/2023] [Accepted: 11/01/2023] [Indexed: 11/08/2023] Open
Abstract
Film functional magnetic resonance imaging (fMRI) has gained tremendous popularity in many areas of neuroscience. However, affective neuroscience remains somewhat behind in embracing this approach, even though films lend themselves to studying how brain function gives rise to complex, dynamic and multivariate emotions. Here, we discuss the unique capabilities of film fMRI for emotion research, while providing a general guide to conducting such research. We first give a brief overview of emotion theories, as these inform important design choices. Next, we discuss films as experimental paradigms for emotion elicitation and address the process of annotating them. We then situate film fMRI in the context of other fMRI approaches and present an overview of results from extant studies, with particular regard to the advantages of film fMRI. We also give an overview of state-of-the-art analysis techniques, including methods that probe neurodynamics. Finally, we discuss the limitations of using film fMRI to study emotion. In sum, this review offers a practitioners' guide to the emerging field of film fMRI and underscores how it can advance affective neuroscience.
Affiliation(s)
- Elenor Morgenroth
- Neuro-X Institute, École Polytechnique Fédérale de Lausanne, Geneva 1202, Switzerland
- Department of Radiology and Medical Informatics, University of Geneva, Geneva 1202, Switzerland
- Swiss Center for Affective Sciences, University of Geneva, Geneva 1202, Switzerland
| | - Laura Vilaclara
- Neuro-X Institute, École Polytechnique Fédérale de Lausanne, Geneva 1202, Switzerland
- Department of Radiology and Medical Informatics, University of Geneva, Geneva 1202, Switzerland
| | - Michal Muszynski
- Department of Basic Neurosciences, University of Geneva, Geneva 1202, Switzerland
| | - Julian Gaviria
- Swiss Center for Affective Sciences, University of Geneva, Geneva 1202, Switzerland
- Department of Basic Neurosciences, University of Geneva, Geneva 1202, Switzerland
- Department of Psychiatry, University of Geneva, Geneva 1202, Switzerland
| | - Patrik Vuilleumier
- Swiss Center for Affective Sciences, University of Geneva, Geneva 1202, Switzerland
- Department of Basic Neurosciences, University of Geneva, Geneva 1202, Switzerland
- CIBM Center for Biomedical Imaging, Geneva 1202, Switzerland
| | - Dimitri Van De Ville
- Neuro-X Institute, École Polytechnique Fédérale de Lausanne, Geneva 1202, Switzerland
- Department of Radiology and Medical Informatics, University of Geneva, Geneva 1202, Switzerland
- CIBM Center for Biomedical Imaging, Geneva 1202, Switzerland
| |
|
13
|
Talwar S, Barbero FM, Calce RP, Collignon O. Automatic Brain Categorization of Discrete Auditory Emotion Expressions. Brain Topogr 2023; 36:854-869. [PMID: 37639111 PMCID: PMC10522533 DOI: 10.1007/s10548-023-00983-8] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/08/2022] [Accepted: 06/21/2023] [Indexed: 08/29/2023]
Abstract
Seamlessly extracting emotional information from voices is crucial for efficient interpersonal communication. However, it remains unclear how the brain categorizes vocal expressions of emotion beyond the processing of their acoustic features. In our study, we developed a new approach combining electroencephalographic recordings (EEG) in humans with a frequency-tagging paradigm to 'tag' automatic neural responses to specific categories of emotion expressions. Participants were presented with a periodic stream of heterogeneous non-verbal emotional vocalizations belonging to five emotion categories: anger, disgust, fear, happiness and sadness at 2.5 Hz (stimuli length of 350 ms with a 50 ms silent gap between stimuli). Importantly, unknown to the participant, a specific emotion category appeared at a target presentation rate of 0.83 Hz that would elicit an additional response in the EEG spectrum only if the brain discriminates the target emotion category from other emotion categories and generalizes across heterogeneous exemplars of the target emotion category. Stimuli were matched across emotion categories for harmonicity-to-noise ratio, spectral center of gravity and pitch. Additionally, participants were presented with a scrambled version of the stimuli with identical spectral content and periodicity but disrupted intelligibility. Both types of sequences had comparable envelopes and early auditory peripheral processing computed via the simulation of the cochlear response. We observed that in addition to the responses at the general presentation frequency (2.5 Hz) in both intact and scrambled sequences, a greater peak in the EEG spectrum at the target emotion presentation rate (0.83 Hz) and its harmonics emerged in the intact sequence in comparison to the scrambled sequence. 
The greater response at the target frequency in the intact sequence, together with our stimulus-matching procedure, suggests that the categorical brain response elicited by a specific emotion is at least partially independent of the low-level acoustic features of the sounds. Moreover, responses at the presentation rates of fearful and happy vocalizations elicited different topographies and different temporal dynamics, suggesting that different discrete emotions are represented differently in the brain. Our paradigm revealed the brain's ability to categorize non-verbal vocal emotion expressions automatically, objectively (at a predefined frequency of interest), without a behavioral task, rapidly (within a few minutes of recording time) and robustly (with a high signal-to-noise ratio), making it a useful tool to study vocal emotion processing and auditory categorization, both in general and in populations where behavioral assessments are more challenging.
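The frequency-tagging logic described above — a categorical response shows up as a spectral peak at the target presentation rate and its harmonics — can be sketched with synthetic data. The sampling rate, amplitudes, and SNR definition here are illustrative assumptions, not the study's parameters:

```python
import numpy as np

fs, dur = 250.0, 60.0            # sampling rate (Hz) and duration (s), illustrative
t = np.arange(0, dur, 1 / fs)
f_base, f_target = 2.5, 2.5 / 3  # general rate and target (oddball) rate ≈ 0.83 Hz

rng = np.random.default_rng(1)
eeg = (1.0 * np.sin(2 * np.pi * f_base * t)      # response to every stimulus
       + 0.5 * np.sin(2 * np.pi * f_target * t)  # extra response to the target category
       + rng.normal(scale=2.0, size=t.size))     # broadband noise

spectrum = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def snr(f, k=20):
    """Amplitude at frequency f relative to the mean of neighbouring bins."""
    i = int(np.argmin(np.abs(freqs - f)))
    neigh = np.r_[spectrum[i - k:i - 1], spectrum[i + 2:i + k + 1]]
    return spectrum[i] / neigh.mean()
```

With the duration an integer number of target cycles, both tagged frequencies fall on exact FFT bins and stand out clearly against the noise floor.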
Affiliation(s)
- Siddharth Talwar
- Institute for Research in Psychology (IPSY) & Neuroscience (IoNS), Louvain Bionics, University of Louvain (UCLouvain), Louvain, Belgium.
| | - Francesca M Barbero
- Institute for Research in Psychology (IPSY) & Neuroscience (IoNS), Louvain Bionics, University of Louvain (UCLouvain), Louvain, Belgium
| | - Roberta P Calce
- Institute for Research in Psychology (IPSY) & Neuroscience (IoNS), Louvain Bionics, University of Louvain (UCLouvain), Louvain, Belgium
| | - Olivier Collignon
- Institute for Research in Psychology (IPSY) & Neuroscience (IoNS), Louvain Bionics, University of Louvain (UCLouvain), Louvain, Belgium.
- School of Health Sciences, HES-SO Valais-Wallis, The Sense Innovation and Research Center, Lausanne and Sion, Switzerland.
| |
|
14
|
Lee Y, Seo Y, Lee Y, Lee D. Dimensional emotions are represented by distinct topographical brain networks. Int J Clin Health Psychol 2023; 23:100408. [PMID: 37663040 PMCID: PMC10472247 DOI: 10.1016/j.ijchp.2023.100408] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/16/2023] [Accepted: 08/21/2023] [Indexed: 09/05/2023] Open
Abstract
The ability to recognize others' facial emotions has become increasingly important since the COVID-19 pandemic, which created stressful situations that challenge emotion regulation. Given the importance of emotion in maintaining a social life, the emotion knowledge needed to perceive and label emotions in oneself and others requires an understanding of affective dimensions, such as emotional valence and emotional arousal. However, limited information is available about whether the behavioral representation of affective dimensions resembles their neural representation. To explore the relationship between the brain and behavior in the representational geometries of affective dimensions, we constructed a behavioral paradigm in which emotional faces were categorized into geometric spaces along the valence, arousal, and combined valence-and-arousal dimensions. We then compared these representations to neural representations of the same faces acquired by functional magnetic resonance imaging. We found that affective dimensions were similarly represented in behavior and the brain. Specifically, behavioral and neural representations of valence were less similar to those of arousal. We also found that valence was represented in the dorsolateral prefrontal cortex, frontal eye fields, precuneus, and early visual cortex, whereas arousal was represented in the cingulate gyrus, middle frontal gyrus, orbitofrontal cortex, fusiform gyrus, and early visual cortex. In conclusion, the current study suggests that dimensional emotions are represented similarly in behavior and the brain, with distinct topographical organizations in the brain.
Affiliation(s)
| | | | - Youngju Lee
- Cognitive Science Research Group, Korea Brain Research Institute, 61 Cheomdan-ro, Dong-gu, Daegu 41062, Republic of Korea
| | - Dongha Lee
- Cognitive Science Research Group, Korea Brain Research Institute, 61 Cheomdan-ro, Dong-gu, Daegu 41062, Republic of Korea
| |
|
15
|
Czepiel A, Fink LK, Seibert C, Scharinger M, Kotz SA. Aesthetic and physiological effects of naturalistic multimodal music listening. Cognition 2023; 239:105537. [PMID: 37487303 DOI: 10.1016/j.cognition.2023.105537] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/04/2022] [Revised: 05/31/2023] [Accepted: 06/24/2023] [Indexed: 07/26/2023]
Abstract
Compared to audio only (AO) conditions, audiovisual (AV) information can enhance the aesthetic experience of a music performance. However, such beneficial multimodal effects have yet to be studied in naturalistic music performance settings. Further, peripheral physiological correlates of aesthetic experiences are not well understood. Here, participants were invited to a concert hall for piano performances of Bach, Messiaen, and Beethoven, which were presented in two conditions: AV and AO. They rated their aesthetic experience (AE) after each piece (Experiments 1 and 2), while peripheral signals (cardiorespiratory measures, skin conductance, and facial muscle activity) were continuously measured (Experiment 2). Factor scores of AE were significantly higher in the AV condition in both experiments. The LF/HF ratio, a heart rhythm measure that reflects activation of the sympathetic nervous system, was higher in the AO condition, suggesting increased arousal, likely caused by the less predictable sound onsets in the AO condition. We present partial evidence that breathing was faster and facial muscle activity was higher in the AV condition, suggesting that observing a performer's movements enhances motor mimicry in these more voluntary peripheral measures. Further, zygomaticus ('smiling') muscle activity was a significant predictor of AE. Thus, we suggest that physiological measures are related to AE, but at different levels: the more involuntary measures (i.e., heart rhythms) may reflect sensory aspects, while the more voluntary measures (i.e., muscular control of breathing and facial responses) may reflect the liking aspect of AE. In summary, we replicate and extend previous findings that AV information enhances AE in a naturalistic music performance setting. We further show that a combination of self-report and peripheral measures benefits a meaningful assessment of AE in naturalistic settings.
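The LF/HF ratio mentioned above is conventionally computed from the heart-period power spectrum (LF = 0.04–0.15 Hz, HF = 0.15–0.40 Hz); note that its interpretation as a purely sympathetic index is debated in the HRV literature. A sketch on synthetic, evenly resampled RR-interval series (all parameters illustrative):

```python
import numpy as np
from scipy.signal import welch

BANDS = {"lf": (0.04, 0.15), "hf": (0.15, 0.40)}   # standard HRV bands (Hz)

def lf_hf_ratio(rr_ms, fs=4.0):
    """LF/HF power ratio from an evenly resampled RR-interval series (ms)."""
    f, pxx = welch(rr_ms - np.mean(rr_ms), fs=fs, nperseg=256)
    power = {name: pxx[(f >= lo) & (f < hi)].sum()
             for name, (lo, hi) in BANDS.items()}
    return power["lf"] / power["hf"]

t = np.arange(0, 300, 0.25)                        # 5 min of RR data at 4 Hz
rng = np.random.default_rng(2)
noise = rng.normal(scale=5.0, size=t.size)
# Toy series: LF-dominant ("aroused") vs HF-dominant ("relaxed") oscillations
aroused = 800 + 40 * np.sin(2 * np.pi * 0.10 * t) + 10 * np.sin(2 * np.pi * 0.30 * t) + noise
relaxed = 800 + 10 * np.sin(2 * np.pi * 0.10 * t) + 40 * np.sin(2 * np.pi * 0.30 * t) + noise
```

Because only the ratio is needed, summed band power suffices; absolute units would require scaling by the frequency resolution.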
Affiliation(s)
- Anna Czepiel
- Department of Music, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany; Department of Neuropsychology and Psychopharmacology, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, the Netherlands.
| | - Lauren K Fink
- Department of Music, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany; Max Planck-NYU Center for Language, Music, and Emotion, Frankfurt am Main, Germany
| | - Christoph Seibert
- Institute for Music Informatics and Musicology, University of Music Karlsruhe, Karlsruhe, Germany
| | - Mathias Scharinger
- Research Group Phonetics, Department of German Linguistics, University of Marburg, Marburg, Germany; Department of Language and Literature, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany
| | - Sonja A Kotz
- Department of Neuropsychology and Psychopharmacology, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, the Netherlands; Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
| |
|
16
|
Du C, Fu K, Wen B, He H. Topographic representation of visually evoked emotional experiences in the human cerebral cortex. iScience 2023; 26:107571. [PMID: 37664621 PMCID: PMC10470388 DOI: 10.1016/j.isci.2023.107571] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/10/2023] [Revised: 07/03/2023] [Accepted: 08/07/2023] [Indexed: 09/05/2023] Open
Abstract
Affective neuroscience seeks to uncover the neural underpinnings of the emotions that humans experience. However, it remains unclear whether an affective space underlies the discrete emotion categories in the human brain, and how it relates to the hypothesized affective dimensions. To address this question, we developed a voxel-wise encoding model to investigate the cortical organization of human emotions. Results revealed that distributed emotion representations are constructed through a fundamental affective space. We further compared each dimension of this space to 14 hypothesized affective dimensions and found that many of them are captured by the fundamental affective space. Our results suggest that emotional experiences are represented by broadly overlapping cortical patterns that form smooth gradients across large areas of the cortex. This finding reveals the specific structure of the affective space and its relationship to hypothesized affective dimensions, while highlighting the distributed nature of emotional representations in the cortex.
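A voxel-wise encoding model of the general kind described is typically a regularized linear regression from stimulus features to each voxel's response, evaluated by held-out prediction accuracy. The feature set, regularization strength, and data below are toy assumptions, not the paper's pipeline:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(3)
n_stim, n_feat, n_vox = 200, 10, 30
X = rng.normal(size=(n_stim, n_feat))               # affective features per stimulus
W = rng.normal(size=(n_feat, n_vox))                # hidden voxel tuning (ground truth)
Y = X @ W + 0.5 * rng.normal(size=(n_stim, n_vox))  # simulated voxel responses

Xtr, Xte, Ytr, Yte = X[:150], X[150:], Y[:150], Y[150:]
enc = Ridge(alpha=1.0).fit(Xtr, Ytr)                # one linear model per voxel
pred = enc.predict(Xte)
# per-voxel prediction accuracy (Pearson r on held-out stimuli)
r = np.array([np.corrcoef(pred[:, v], Yte[:, v])[0, 1] for v in range(n_vox)])
```

The fitted weights per voxel play the role of that voxel's coordinates in the affective space; comparing them across voxels yields the cortical gradients the abstract describes.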
Affiliation(s)
- Changde Du
- Laboratory of Brain Atlas and Brain-Inspired Intelligence, State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Science, Beijing 100190, China
| | - Kaicheng Fu
- Laboratory of Brain Atlas and Brain-Inspired Intelligence, State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Science, Beijing 100190, China
- School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China
| | - Bincheng Wen
- Center for Excellence in Brain Science and Intelligence Technology, Key Laboratory of Primate Neurobiology, Institute of Neuroscience, Chinese Academy of Sciences, Shanghai 200031, China
- School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China
| | - Huiguang He
- Laboratory of Brain Atlas and Brain-Inspired Intelligence, State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Science, Beijing 100190, China
- School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China
| |
|
17
|
Ghomroudi PA, Scaltritti M, Grecucci A. Decoding reappraisal and suppression from neural circuits: A combined supervised and unsupervised machine learning approach. COGNITIVE, AFFECTIVE & BEHAVIORAL NEUROSCIENCE 2023; 23:1095-1112. [PMID: 36977965 PMCID: PMC10400700 DOI: 10.3758/s13415-023-01076-6] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Accepted: 02/06/2023] [Indexed: 03/30/2023]
Abstract
Emotion regulation is a core construct of mental health and deficits in emotion regulation abilities lead to psychological disorders. Reappraisal and suppression are two widely studied emotion regulation strategies but, possibly due to methodological limitations in previous studies, a consistent picture of the neural correlates related to the individual differences in their habitual use remains elusive. To address these issues, the present study applied a combination of unsupervised and supervised machine learning algorithms to the structural MRI scans of 128 individuals. First, unsupervised machine learning was used to separate the brain into naturally grouping grey matter circuits. Then, supervised machine learning was applied to predict individual differences in the use of different strategies of emotion regulation. Two predictive models, including structural brain features and psychological ones, were tested. Results showed that a temporo-parahippocampal-orbitofrontal network successfully predicted the individual differences in the use of reappraisal. Differently, insular and fronto-temporo-cerebellar networks successfully predicted suppression. In both predictive models, anxiety, the opposite strategy, and specific emotional intelligence factors played a role in predicting the use of reappraisal and suppression. This work provides new insights regarding the decoding of individual differences from structural features and other psychologically relevant variables while extending previous observations on the neural bases of emotion regulation strategies.
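The combined unsupervised-plus-supervised strategy can be sketched on synthetic data: an ICA stage separates covarying "networks" from structural data, then a supervised stage predicts a trait from the subjects' network loadings. FastICA and linear regression here are simplified stand-ins for the paper's actual algorithms, and all data are simulated:

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
n_sub, n_vox, n_comp = 128, 500, 4
S = rng.laplace(size=(n_sub, n_comp))               # true subject loadings (sources)
A = rng.normal(size=(n_comp, n_vox))                # spatial network maps (mixing)
gm = S @ A + 0.1 * rng.normal(size=(n_sub, n_vox))  # toy grey-matter data

# Stage 1 (unsupervised): recover covarying networks
ica = FastICA(n_components=n_comp, random_state=0, max_iter=1000)
loadings = ica.fit_transform(gm)                    # subject-wise network weights

# Stage 2 (supervised): predict a trait from the network loadings
trait = 2.0 * S[:, 0] + rng.normal(scale=0.5, size=n_sub)  # trait driven by one network
reg = LinearRegression().fit(loadings[:100], trait[:100])
pred = reg.predict(loadings[100:])
r = np.corrcoef(pred, trait[100:])[0, 1]
```

Because ICA recovers sources only up to permutation and scaling, the supervised stage absorbs that ambiguity: any invertible linear mix of the true loadings supports the same linear prediction.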
Affiliation(s)
- Parisa Ahmadi Ghomroudi
- Clinical and Affective Neuroscience Lab, Department of Psychology and Cognitive Sciences - DiPSCo, University of Trento, Rovereto, Italy.
| | - Michele Scaltritti
- Clinical and Affective Neuroscience Lab, Department of Psychology and Cognitive Sciences - DiPSCo, University of Trento, Rovereto, Italy
| | - Alessandro Grecucci
- Clinical and Affective Neuroscience Lab, Department of Psychology and Cognitive Sciences - DiPSCo, University of Trento, Rovereto, Italy
- Center for Medical Sciences - CISMed, University of Trento, Trento, Italy
| |
|
18
|
The EEG microstate representation of discrete emotions. Int J Psychophysiol 2023; 186:33-41. [PMID: 36773887 DOI: 10.1016/j.ijpsycho.2023.02.002] [Citation(s) in RCA: 7] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/05/2022] [Revised: 02/03/2023] [Accepted: 02/07/2023] [Indexed: 02/11/2023]
Abstract
Understanding how human emotions are represented in the brain is a central question in affective neuroscience. While previous studies have mainly adopted a modular and static perspective on the neural representation of emotions, emerging research suggests that emotions may rely on a distributed and dynamic representation. The present study explored the EEG microstate representations of nine discrete emotions (Anger, Disgust, Fear, Sadness, Neutral, Amusement, Inspiration, Joy and Tenderness). Seventy-eight participants watched emotion-eliciting videos while their EEG was recorded. Multivariate analysis revealed that different emotions had distinct EEG microstate features. Using the EEG microstate features in the Neutral condition as the reference, the coverage of microstate C, the duration of microstate C and the occurrence of microstate B were the top-contributing features for the discrete positive and negative emotions. The emotions of Disgust, Fear and Joy were most effectively represented by EEG microstates. The present study provides the first evidence of EEG microstate representations of discrete emotions, highlighting a whole-brain, dynamic representation of human emotions.
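Microstate analysis typically clusters scalp topographies at global-field-power (GFP) peaks into a few template maps and then back-fits those templates to the whole recording to derive features such as coverage, duration, and occurrence. The sketch below uses standard k-means and cosine similarity as simplified stand-ins for the polarity-invariant modified k-means usually employed, on simulated data:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
n_ch, n_t = 32, 2000
# Toy EEG: consecutive segments generated from 4 fixed topographies
templates = rng.normal(size=(4, n_ch))
labels_true = np.repeat(np.arange(4), n_t // 4)
eeg = (templates[labels_true] * rng.uniform(0.5, 1.5, size=(n_t, 1))
       + 0.2 * rng.normal(size=(n_t, n_ch)))

gfp = eeg.std(axis=1)                        # global field power
strong = gfp > np.median(gfp)                # crude stand-in for GFP peak picking
maps = eeg[strong] / np.linalg.norm(eeg[strong], axis=1, keepdims=True)

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(maps)
# Back-fit: assign every sample to its best-matching template
# (absolute cosine similarity approximates polarity-invariant spatial correlation)
centers = km.cluster_centers_ / np.linalg.norm(km.cluster_centers_, axis=1, keepdims=True)
sim = np.abs((eeg / np.linalg.norm(eeg, axis=1, keepdims=True)) @ centers.T)
states = sim.argmax(axis=1)
coverage = np.bincount(states, minlength=4) / n_t   # fraction of time per microstate
```

Duration and occurrence would follow from run-length statistics of `states`; comparing such features across emotion conditions is the analysis the abstract reports.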
|
19
|
Abstract
Frameworks of emotional development have tended to focus on how environmental factors shape children's emotion understanding. However, individual experiences of emotion represent a complex interplay between both external environmental inputs and internal somatovisceral signaling. Here, we discuss the importance of afferent signals and coordination between central and peripheral mechanisms in affective response processing. We propose that incorporating somatovisceral theories of emotions into frameworks of emotional development can inform how children understand emotions in themselves and others. We highlight promising directions for future research on emotional development incorporating this perspective, namely afferent cardiac processing and interoception, immune activation, physiological synchrony, and social touch.
Affiliation(s)
- Kelly E Faig
- Department of Psychology, Hamilton College, 198 College Hill Road, Clinton, NY 13502
| | - Karen E Smith
- Department of Psychology, the University of Wisconsin, 1500 Highland Blvd, Madison, WI, 53705
| | - Stephanie J Dimitroff
- Department of Psychology, Universität Konstanz, Universitätsstraße 10, 78464 Konstanz, Germany
| |
|
20
|
Baggio T, Grecucci A, Meconi F, Messina I. Anxious Brains: A Combined Data Fusion Machine Learning Approach to Predict Trait Anxiety from Morphometric Features. SENSORS (BASEL, SWITZERLAND) 2023; 23:610. [PMID: 36679404 PMCID: PMC9863274 DOI: 10.3390/s23020610] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 12/07/2022] [Revised: 12/30/2022] [Accepted: 01/01/2023] [Indexed: 06/17/2023]
Abstract
Trait anxiety relates to the steady propensity to experience and report negative emotions and thoughts such as fear and worries across different situations, along with a stable perception of the environment as characterized by threatening stimuli. Previous studies have tried to investigate neuroanatomical features related to anxiety mostly using univariate analyses and thus giving rise to contrasting results. The aim of this study is to build a predictive model of individual differences in trait anxiety from brain morphometric features, by taking advantage of a combined data fusion machine learning approach to allow generalization to new cases. Additionally, we aimed to perform a network analysis to test the hypothesis that anxiety-related networks have a central role in modulating other networks not strictly associated with anxiety. Finally, we wanted to test the hypothesis that trait anxiety was associated with specific cognitive emotion regulation strategies, and whether anxiety may decrease with ageing. Structural brain images of 158 participants were first decomposed into independent covarying gray and white matter networks with a data fusion unsupervised machine learning approach (Parallel ICA). Then, supervised machine learning (decision tree) and backward regression were used to extract and test the generalizability of a predictive model of trait anxiety. Two covarying gray and white matter independent networks successfully predicted trait anxiety. The first network included mainly parietal and temporal regions such as the postcentral gyrus, the precuneus, and the middle and superior temporal gyrus, while the second network included frontal and parietal regions such as the superior and middle temporal gyrus, the anterior cingulate, and the precuneus. We also found that trait anxiety was positively associated with catastrophizing, rumination, other- and self-blame, and negatively associated with positive refocusing and reappraisal. 
Moreover, trait anxiety was negatively associated with age. This paper provides new insights regarding the prediction of individual differences in trait anxiety from brain and psychological features and can pave the way for future diagnostic predictive models of anxiety.
Affiliation(s)
- Teresa Baggio
- Clinical and Affective Neuroscience Lab (CLI.A.N. Lab), Department of Psychology and Cognitive Sciences (DiPSCo), University of Trento, 38068 Rovereto, Italy
| | - Alessandro Grecucci
- Clinical and Affective Neuroscience Lab (CLI.A.N. Lab), Department of Psychology and Cognitive Sciences (DiPSCo), University of Trento, 38068 Rovereto, Italy
- Centre for Medical Sciences, CISMed, University of Trento, 38122 Trento, Italy
| | - Federica Meconi
- Clinical and Affective Neuroscience Lab (CLI.A.N. Lab), Department of Psychology and Cognitive Sciences (DiPSCo), University of Trento, 38068 Rovereto, Italy
| | - Irene Messina
- Clinical and Affective Neuroscience Lab (CLI.A.N. Lab), Department of Psychology and Cognitive Sciences (DiPSCo), University of Trento, 38068 Rovereto, Italy
- Department of Economics, Universitas Mercatorum, 00186 Rome, Italy
| |
|
21
|
Extracting a Novel Emotional EEG Topographic Map Based on a Stacked Autoencoder Network. JOURNAL OF HEALTHCARE ENGINEERING 2023; 2023:9223599. [PMID: 36714412 PMCID: PMC9879679 DOI: 10.1155/2023/9223599] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 04/09/2022] [Revised: 11/02/2022] [Accepted: 12/23/2022] [Indexed: 01/21/2023]
Abstract
Emotion recognition based on brain signals has become an increasingly attractive way to evaluate humans' internal emotional states. Conventional emotion recognition studies focus on developing machine learning pipelines and classifiers. However, most of these methods do not provide information on the involvement of different areas of the brain in emotions. Brain mapping is considered one of the most effective ways of showing the involvement of different brain areas in an activity. Most mapping techniques rely on projecting and visualizing only one of the electroencephalogram (EEG) subband features onto brain regions. The present study aims to develop a new EEG-based brain mapping that combines several features to provide more complete and useful information on a single map instead of several conventional maps. In this study, the optimal combination of EEG features for each channel was extracted using a stacked autoencoder (SAE) network and visualized as a topographic map. The research hypothesis was that autoencoders can extract optimal features for quantitative EEG (QEEG) brain mapping. The DEAP EEG database was employed to extract topographic maps. The accuracy of image classifiers using a convolutional neural network (CNN) served as the criterion for evaluating how well the maps obtained with the stacked autoencoder topographic map (SAETM) method distinguished different emotions. Average classification accuracies of 0.8173 and 0.8037 were obtained in the valence and arousal dimensions, respectively. The extracted maps were also ranked by a team of experts in comparison with conventional maps. The quantitative and qualitative evaluations showed that the maps obtained by SAETM carry more information than conventional maps.
|
22
|
Emanuel A, Eldar E. Emotions as computations. Neurosci Biobehav Rev 2023; 144:104977. [PMID: 36435390 PMCID: PMC9805532 DOI: 10.1016/j.neubiorev.2022.104977] [Citation(s) in RCA: 7] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/14/2022] [Revised: 10/26/2022] [Accepted: 11/22/2022] [Indexed: 11/26/2022]
Abstract
Emotions ubiquitously impact action, learning, and perception, yet their essence and role remain widely debated. Computational accounts of emotion aspire to answer these questions with greater conceptual precision informed by normative principles and neurobiological data. We examine recent progress in this regard and find that emotions may implement three classes of computations, which serve to evaluate states, actions, and uncertain prospects. For each of these, we use the formalism of reinforcement learning to offer a new formulation that better accounts for existing evidence. We then consider how these distinct computations may map onto distinct emotions and moods. Integrating extensive research on the causes and consequences of different emotions suggests a parsimonious one-to-one mapping, according to which emotions are integral to how we evaluate outcomes (pleasure & pain), learn to predict them (happiness & sadness), use them to inform our (frustration & content) and others' (anger & gratitude) actions, and plan in order to realize (desire & hope) or avoid (fear & anxiety) uncertain outcomes.
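One way to make the "happiness & sadness track learning" mapping concrete is a single-state temporal-difference (TD) learner whose mood variable averages recent reward prediction errors. This is an illustrative formalization in the spirit of the review, not the authors' exact model:

```python
import numpy as np

def td_mood(rewards, alpha=0.1, eta=0.2):
    """Single-state TD learning; 'mood' is an exponential running average
    of reward prediction errors (RPEs). Positive mood stands in for a
    happiness-like state, negative for a sadness-like state."""
    v, mood, trace = 0.0, 0.0, []
    for r in rewards:
        delta = r - v                 # RPE: outcome minus expectation
        v += alpha * delta            # value (expectation) update
        mood += eta * (delta - mood)  # mood tracks recent RPEs
        trace.append(mood)
    return np.array(trace)

# A good phase followed by a bad phase: mood rises, then dips, and in both
# phases decays back toward neutral as expectations catch up with outcomes.
rewards = np.concatenate([np.ones(50), -np.ones(50)])
moods = td_mood(rewards)
```

The decay back to neutral captures the adaptation property: once outcomes are fully predicted, prediction errors (and with them mood) vanish.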
Affiliation(s)
- Aviv Emanuel
- Department of Psychology, Hebrew University of Jerusalem, Jerusalem 9190501, Israel; Department of Cognitive and Brain Sciences, Hebrew University of Jerusalem, Jerusalem 9190501, Israel.
| | - Eran Eldar
- Department of Psychology, Hebrew University of Jerusalem, Jerusalem 9190501, Israel; Department of Cognitive and Brain Sciences, Hebrew University of Jerusalem, Jerusalem 9190501, Israel.
| |
|
23
|
FEDA: Fine-grained emotion difference analysis for facial expression recognition. Biomed Signal Process Control 2023. [DOI: 10.1016/j.bspc.2022.104209] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
|
24
|
Decoding six basic emotions from brain functional connectivity patterns. SCIENCE CHINA LIFE SCIENCES 2022; 66:835-847. [PMID: 36378473 DOI: 10.1007/s11427-022-2206-3] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/06/2022] [Accepted: 09/26/2022] [Indexed: 11/16/2022]
Abstract
Although distinctive neural and physiological states are suggested to underlie the six basic emotions, basic emotions are often indistinguishable from functional magnetic resonance imaging (fMRI) voxelwise activation (VA) patterns. Here, we hypothesize that functional connectivity (FC) patterns across brain regions may contain emotion-representation information beyond VA patterns. We collected whole-brain fMRI data while human participants viewed pictures of faces expressing one of the six basic emotions (i.e., anger, disgust, fear, happiness, sadness, and surprise) or showing neutral expressions. We obtained FC patterns for each emotion across brain regions over the whole brain and applied multivariate pattern decoding to decode emotions in the FC pattern representation space. Our results showed that the whole-brain FC patterns successfully classified not only the six basic emotions from neutral expressions but also each basic emotion from other emotions. An emotion-representation network for each basic emotion that spanned beyond the classical brain regions for emotion processing was identified. Finally, we demonstrated that within the same brain regions, FC-based decoding consistently performed better than VA-based decoding. Taken together, our findings revealed that FC patterns contained emotional information and advocated for paying further attention to the contribution of FCs to emotion processing.
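FC-based decoding of this kind can be sketched as: compute the vectorized region-by-region correlation matrix per trial, then classify the resulting FC patterns. The region counts, coupling structure, and linear SVM below are toy assumptions standing in for the study's whole-brain pipeline:

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n_trials, n_reg, n_tp = 120, 12, 80
iu = np.triu_indices(n_reg, k=1)

def fc_pattern(ts):
    """Vectorized upper triangle of the region-by-region correlation matrix."""
    return np.corrcoef(ts)[iu]

X, y = [], []
for trial in range(n_trials):
    label = trial % 2                         # two toy 'emotion' conditions
    ts = rng.normal(size=(n_reg, n_tp))       # region-by-time background activity
    shared = rng.normal(size=n_tp)
    if label == 0:                            # each condition couples a different pair
        ts[0] += shared; ts[1] += shared
    else:
        ts[2] += shared; ts[3] += shared
    X.append(fc_pattern(ts)); y.append(label)

acc = cross_val_score(LinearSVC(), np.array(X), np.array(y), cv=5).mean()
```

Note that the mean activation of every region is identical across conditions here, so any above-chance accuracy must come from the connectivity pattern — the contrast with VA-based decoding that the abstract emphasizes.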
Collapse
|
25
|
Peng S, Xu P, Jiang Y, Gong G. Activation network mapping for integration of heterogeneous fMRI findings. Nat Hum Behav 2022; 6:1417-1429. [PMID: 35654963 DOI: 10.1038/s41562-022-01371-1] [Citation(s) in RCA: 14] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/26/2021] [Accepted: 05/03/2022] [Indexed: 11/08/2022]
Abstract
Functional neuroimaging techniques have been widely used to probe the neural substrates of facial emotion processing in healthy people. However, findings are largely inconsistent across studies. Here, we introduce a new technique termed activation network mapping to examine whether heterogeneous functional magnetic resonance imaging findings localize to a common network for emotion processing. First, using the existing method of activation likelihood estimation meta-analysis, we showed that individual-brain-based reproducibility was low across studies. Second, using activation network mapping, we found that network-based reproducibility across these same studies was higher. Validation analysis indicated that the activation network mapping-localized network aligned with stimulation sites, structural abnormalities and brain lesions that disrupt facial emotion processing. Finally, we verified the generality of the activation network mapping technique by applying it to another cognitive process, that is, rumination. Activation network mapping may potentially be broadly applicable to localize brain networks of cognitive functions.
Collapse
Affiliation(s)
- Shaoling Peng
- State Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China
| | - Pengfei Xu
- Beijing Key Laboratory of Applied Experimental Psychology, National Demonstration Center for Experimental Psychology Education (BNU), Faculty of Psychology, Beijing Normal University, Beijing, China
- Center for Emotion and Brain, Shenzhen Institute of Neuroscience, Shenzhen, China
| | - Yaya Jiang
- State Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China
| | - Gaolang Gong
- State Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China.
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China.
- Chinese Institute for Brain Research, Beijing, China.
| |
Collapse
|
26
|
Floreani ED, Orlandi S, Chau T. A pediatric near-infrared spectroscopy brain-computer interface based on the detection of emotional valence. Front Hum Neurosci 2022; 16:938708. [PMID: 36211121 PMCID: PMC9540519 DOI: 10.3389/fnhum.2022.938708] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/07/2022] [Accepted: 09/05/2022] [Indexed: 11/27/2022] Open
Abstract
Brain-computer interfaces (BCIs) are being investigated as an access pathway to communication for individuals with physical disabilities, as the technology obviates the need for voluntary motor control. However, to date, minimal research has investigated the use of BCIs for children. Traditional BCI communication paradigms may be suboptimal given that children with physical disabilities may face delays in cognitive development and acquisition of literacy skills. Instead, in this study we explored emotional state as an alternative access pathway to communication. We developed a pediatric BCI to identify positive and negative emotional states from changes in hemodynamic activity of the prefrontal cortex (PFC). To train and test the BCI, 10 neurotypical children aged 8–14 underwent a series of emotion-induction trials over four experimental sessions (one offline, three online) while their brain activity was measured with functional near-infrared spectroscopy (fNIRS). Visual neurofeedback was used to assist participants in regulating their emotional states and modulating their hemodynamic activity in response to the affective stimuli. Child-specific linear discriminant classifiers were trained on cumulatively available data from previous sessions and adaptively updated throughout each session. Average online valence classification exceeded chance across participants by the last two online sessions (with 7 and 8 of the 10 participants performing better than chance, respectively, in Sessions 3 and 4). There was a small significant positive correlation between online BCI performance and age, suggesting older participants were more successful at regulating their emotional state and/or brain activity. Variability was seen across participants with regard to BCI performance, hemodynamic response, and discriminatory features and channels. Retrospective offline analyses yielded accuracies comparable to those reported in adult affective BCI studies using fNIRS. Affective fNIRS-BCIs appear to be feasible for school-aged children, but to further gauge the practical potential of this type of BCI, replication with more training sessions, larger sample sizes, and end-users with disabilities is necessary.
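The cumulative, session-wise retraining scheme described above can be sketched as follows. This is a hedged illustration on simulated features (the study used prefrontal fNIRS hemodynamics and child-specific classifiers); the feature dimensionality and effect size are arbitrary assumptions.

```python
# Illustrative sketch (simulated features, not the study's data) of an LDA
# classifier retrained on cumulatively available data before each session.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)

def session(n=30):
    """Simulate prefrontal features for positive (1) / negative (0) trials."""
    y = rng.integers(0, 2, n)
    X = rng.standard_normal((n, 8)) + y[:, None] * 0.8  # valence shifts features
    return X, y

X_hist, y_hist = session()          # offline calibration session
for s in range(2, 5):               # three "online" sessions
    clf = LinearDiscriminantAnalysis().fit(X_hist, y_hist)
    X_new, y_new = session()
    acc = clf.score(X_new, y_new)
    print(f"session {s}: online accuracy {acc:.2f}")
    # Fold the new session into the training set (cumulative updating).
    X_hist = np.vstack([X_hist, X_new])
    y_hist = np.concatenate([y_hist, y_new])
```

Retraining on all accumulated sessions, rather than only the most recent one, is what lets later sessions benefit from earlier calibration data.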
Collapse
Affiliation(s)
- Erica D. Floreani
- Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, ON, Canada
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON, Canada
- *Correspondence: Erica D. Floreani
| | - Silvia Orlandi
- Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, ON, Canada
- Department of Biomedical Engineering, University of Bologna, Bologna, Italy
| | - Tom Chau
- Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, ON, Canada
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON, Canada
| |
Collapse
|
27
|
Zhao R, Zhang T, Zhou S, Huang L. Emotional Brain Network Community Division Study Based on an Improved Immunogenetic Algorithm. Brain Sci 2022; 12:brainsci12091159. [PMID: 36138897 PMCID: PMC9496822 DOI: 10.3390/brainsci12091159] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/20/2022] [Revised: 08/19/2022] [Accepted: 08/26/2022] [Indexed: 11/26/2022] Open
Abstract
Emotion analysis has emerged as one of the most prominent study areas in the field of brain-computer interfaces (BCIs) due to the critical role that the human brain plays in the creation of human emotions. In this study, a Multi-objective Immunogenetic Community Division Algorithm Based on a Memetic Framework (MFMICD) was proposed to study different emotions from the perspective of brain networks. To improve convergence and accuracy, MFMICD incorporates a unique immunity operator into the traditional genetic algorithm and combines it with the tabu search algorithm. Based on this approach, we examined how the structure of people's brain networks alters in response to different emotions using an electroencephalographic emotion database. The findings revealed that, in positive emotional states, more brain regions are engaged in emotion dominance, information exchange between local modules is more frequent, and different emotions evoke more varied patterns of brain-area interaction than in negative emotional states. A brief analysis of the connections between different emotions and brain regions shows that MFMICD is reliable in dividing emotional brain functional networks into communities.
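The MFMICD optimizer is bespoke to this paper, but the underlying task it solves, dividing a brain functional network into communities, can be illustrated with a standard alternative: Newman's spectral modularity bisection on a toy two-module network. The edge weights below are simulated, not EEG-derived.

```python
# Hedged stand-in for the paper's community-division step: Newman's spectral
# bisection of a toy weighted network with two planted modules.
import numpy as np

rng = np.random.default_rng(5)
n = 12  # regions/electrodes; nodes 0-5 and 6-11 form two planted modules

A = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        same = (i < 6) == (j < 6)
        # Strong within-module weights, weak between-module weights.
        A[i, j] = A[j, i] = rng.uniform(0.6, 1.0) if same else rng.uniform(0.0, 0.2)

k = A.sum(axis=1)
B = A - np.outer(k, k) / k.sum()     # modularity matrix
vals, vecs = np.linalg.eigh(B)
lead = vecs[:, np.argmax(vals)]      # leading eigenvector
labels = (lead > 0).astype(int)      # sign pattern -> two communities
print("community labels:", labels)
```

Maximizing modularity is the same objective family the immunogenetic approach searches; the spectral method is simply a fast deterministic heuristic for the two-community case.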
Collapse
Affiliation(s)
- Renjie Zhao
- Bell Honors School, Nanjing University of Posts and Telecommunications, Nanjing 210023, China
| | - Tao Zhang
- School of Materials Science and Engineering, Nanjing University of Posts and Telecommunications, Nanjing 210023, China
| | - Shichao Zhou
- School of Computer Science, Nanjing University of Posts and Telecommunications, Nanjing 210023, China
| | - Liya Huang
- Bell Honors School, Nanjing University of Posts and Telecommunications, Nanjing 210023, China
| |
Collapse
|
28
|
Presti P, Ruzzon D, Avanzini P, Caruana F, Rizzolatti G, Vecchiato G. Measuring arousal and valence generated by the dynamic experience of architectural forms in virtual environments. Sci Rep 2022; 12:13376. [PMID: 35927322 PMCID: PMC9352685 DOI: 10.1038/s41598-022-17689-9] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/16/2021] [Accepted: 07/29/2022] [Indexed: 11/13/2022] Open
Abstract
The built environment represents the stage surrounding our everyday life activities. To investigate how architectural design impacts individuals' affective states, we measured subjective judgments of perceived valence (pleasant and unpleasant) and arousal after the dynamic experience of a progressive change of macro visuospatial dimensions of virtual spaces. To this aim, we developed a parametric model that allowed us to create 54 virtual architectural designs characterized by a progressive change of sidewall distance, ceiling and window height, and color of the environment. Decreasing sidewall distance, varying ceiling height, and increasing window height significantly affected the participants' emotional state within virtual environments. Indeed, such architectural designs generated highly arousing and unpleasant states according to subjective judgment. Overall, we observed that valence and arousal scores were affected by all the dynamic form factors which modulated the spaciousness of the surroundings. By showing that the dynamic experience of virtual environments makes it possible to measure the emotional impact of macro-spatial architectural features, the present findings may lay the groundwork for future experiments investigating the effects of architectural design on individuals' mental states as a fundamental factor in the creation of future spaces.
Collapse
Affiliation(s)
- Paolo Presti
- Institute of Neuroscience, National Research Council of Italy, 43125, Parma, Italy; Department of Medicine and Surgery, University of Parma, 43125, Parma, Italy
| | - Davide Ruzzon
- TUNED, Lombardini22, 20143, Milan, Italy; Dipartimento Culture del Progetto, IUAV, 30125, Venice, Italy
| | - Pietro Avanzini
- Institute of Neuroscience, National Research Council of Italy, 43125, Parma, Italy
| | - Fausto Caruana
- Institute of Neuroscience, National Research Council of Italy, 43125, Parma, Italy
| | - Giacomo Rizzolatti
- Institute of Neuroscience, National Research Council of Italy, 43125, Parma, Italy
| | - Giovanni Vecchiato
- Institute of Neuroscience, National Research Council of Italy, 43125, Parma, Italy.
| |
Collapse
|
29
|
Dong F, Zhang Z, Chu T, Che K, Li Y, Gai Q, Shi Y, Ma H, Zhao F, Mao N, Xie H. Altered dynamic amplitude of low-frequency fluctuations in patients with postpartum depression. Behav Brain Res 2022; 433:113980. [PMID: 35809693 DOI: 10.1016/j.bbr.2022.113980] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/19/2022] [Revised: 06/21/2022] [Accepted: 06/22/2022] [Indexed: 11/17/2022]
Abstract
BACKGROUND Postpartum depression (PPD) is a common mood disorder with increasing incidence year by year. However, the dynamic changes in local neural activity of patients with PPD remain unclear. In this study, we utilized the dynamic amplitude of low-frequency fluctuation (dALFF) method to investigate the abnormal temporal variability of local neural activity and its potential correlation with clinical severity in PPD. METHODS Twenty-four patients with PPD and nineteen healthy primiparous mother controls (HCs) matched for age, education level and body mass index were examined by resting-state functional magnetic resonance imaging (rs-fMRI). A sliding-window method was used to assess the dALFF, and a k-means clustering method was used to identify dALFF states. A two-sample t-test was used to compare the differences in dALFF variability and state metrics between PPD and HCs. Pearson correlation analysis was used to analyze the relationship between dALFF variability, state metrics and clinical severity. RESULTS (1) Patients with PPD had lower variance of dALFF than HCs in the cognitive control network, cerebellar network and sensorimotor network. (2) Four dALFF states were identified, and patients with PPD spent more time in state 2 than the other three states. The number of transitions between the four dALFF states increased in the patients compared with that in HCs. (3) Multiple dALFF states were found to be correlated with the severity of depression. The variance of dALFF in the right middle frontal gyrus was negatively correlated with the Edinburgh postnatal depression scale score. CONCLUSION This study provides new insights into the brain dysfunction of PPD from the perspective of dynamic local brain activity, highlighting the important role of dALFF variability in understanding the neurophysiological mechanisms of PPD.
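The sliding-window dALFF and k-means state analysis described in the Methods might be schematized as below. Toy signals only: windowed standard deviation stands in for the spectral low-frequency amplitude, and the window length, step, and cluster count (the study identified four states; two are used here) are arbitrary assumptions.

```python
# Schematic (toy signals) of dynamic-ALFF analysis: windowed amplitude,
# dALFF variability as its variance, and k-means clustering into "states".
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
n_regions, n_tp, win, step = 6, 240, 50, 5

ts = rng.standard_normal((n_regions, n_tp))
ts[0] *= 1 + 0.5 * np.sin(np.arange(n_tp) / 30)  # region with fluctuating amplitude

# Windowed amplitude (std as a simple stand-in for spectral ALFF).
starts = range(0, n_tp - win + 1, step)
dalff = np.array([[ts[r, s:s + win].std() for r in range(n_regions)] for s in starts])

dalff_var = dalff.var(axis=0)                 # dALFF variability per region
states = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(dalff)
dwell = np.bincount(states)                   # windows spent in each state
print("dALFF variance per region:", dalff_var.round(3))
print("state occupancy:", dwell)
```

Group comparisons like those in the paper would then contrast `dalff_var` maps and state occupancy/transition counts between patients and controls.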
Collapse
Affiliation(s)
- Fanghui Dong
- School of Medical Imaging, Binzhou Medical University, No. 346 Guanhai Road, Yantai, Shandong 264003, PR China; Department of Radiology, Yantai Yuhuangding Hospital, Qingdao University, Yantai, Shandong 264000, PR China
| | - Zhongsheng Zhang
- Department of Radiology, Yantai Yuhuangding Hospital, Qingdao University, Yantai, Shandong 264000, PR China
| | - Tongpeng Chu
- Department of Radiology, Yantai Yuhuangding Hospital, Qingdao University, Yantai, Shandong 264000, PR China
| | - Kaili Che
- Department of Radiology, Yantai Yuhuangding Hospital, Qingdao University, Yantai, Shandong 264000, PR China
| | - Yuna Li
- Department of Radiology, Yantai Yuhuangding Hospital, Qingdao University, Yantai, Shandong 264000, PR China
| | - Qun Gai
- Department of Radiology, Yantai Yuhuangding Hospital, Qingdao University, Yantai, Shandong 264000, PR China
| | - Yinghong Shi
- Department of Radiology, Yantai Yuhuangding Hospital, Qingdao University, Yantai, Shandong 264000, PR China
| | - Heng Ma
- Department of Radiology, Yantai Yuhuangding Hospital, Qingdao University, Yantai, Shandong 264000, PR China
| | - Feng Zhao
- School of Computer Science and Technology, Shandong Technology and Business University, Yantai, Shandong 264000, PR China
| | - Ning Mao
- Department of Radiology, Yantai Yuhuangding Hospital, Qingdao University, Yantai, Shandong 264000, PR China.
| | - Haizhu Xie
- School of Medical Imaging, Binzhou Medical University, No. 346 Guanhai Road, Yantai, Shandong 264003, PR China; Department of Radiology, Yantai Yuhuangding Hospital, Qingdao University, Yantai, Shandong 264000, PR China.
| |
Collapse
|
30
|
Revers H, Van Deun K, Strijbosch W, Vroomen J, Bastiaansen M. Decoding the neural responses to experiencing disgust and sadness. Brain Res 2022; 1793:148034. [DOI: 10.1016/j.brainres.2022.148034] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/08/2022] [Revised: 06/20/2022] [Accepted: 07/26/2022] [Indexed: 11/02/2022]
|
31
|
Karni-Visel Y, Hershkowitz I, Lamb ME, Blasbalg U. Emotional valence and the types of information provided by children in forensic interviews. CHILD ABUSE & NEGLECT 2022; 129:105639. [PMID: 35468317 DOI: 10.1016/j.chiabu.2022.105639] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/17/2021] [Revised: 03/01/2022] [Accepted: 04/15/2022] [Indexed: 06/14/2023]
Abstract
BACKGROUND Emotions can powerfully affect memory retrieval although this effect has seldom been studied in everyday contexts. OBJECTIVE This study aimed to explore the association between children's verbal emotional expressions and the type of information reported during forensic interviews. PARTICIPANTS AND SETTING The sample included 198 interviews with 4- to 14-year-old (M = 9.36, SD = 2.37) alleged victims of repeated physical abuse perpetrated by family members, conducted using the Revised NICHD Protocol which emphasizes a supportive interviewing style. METHODS Interview videos were transcribed and each conversational turn was coded to reflect the amount and type of children's verbal emotional expressions, forensic information provided, interviewers' demeanor, and type of question asked. RESULTS The verbal expression of negative emotions was positively associated with the production of more central details (β = 0.29, SE = 0.05, p < 0.001) and peripheral details (β = 0.66, SE = 0.07, p < 0.001), while the verbal expression of positive emotions was correlated with peripheral details (β = 0.29, SE = 0.15, p = 0.047). The verbal expression of negative emotions was associated with the production of more specific details (β = 0.73, SE = 0.06, p < 0.001) and less generic information (β = -0.39, SE = 0.18, p = 0.029) whereas positive emotions were associated only with increased specific information (β = 0.28, SE = 0.12, p = 0.025). CONCLUSIONS These findings highlight how emotional expression, especially of negative emotions, enhances the quantity and quality of children's reports in forensic contexts.
Collapse
Affiliation(s)
- Yael Karni-Visel
- The Louis and Gabi Weisfeld School of Social Work, Bar Ilan University, Israel.
| | | | | | | |
Collapse
|
32
|
Sikka P, Stenberg J, Vorobyev V, Gross JJ. The neural bases of expressive suppression: A systematic review of functional neuroimaging studies. Neurosci Biobehav Rev 2022; 138:104708. [PMID: 35636561 DOI: 10.1016/j.neubiorev.2022.104708] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/07/2021] [Revised: 05/01/2022] [Accepted: 05/19/2022] [Indexed: 11/18/2022]
Abstract
Expressive suppression refers to the inhibition of emotion-expressive behavior (e.g., facial expressions of emotion). Although it is a commonly used emotion regulation strategy with well-documented consequences for well-being, little is known about its underlying mechanisms. In this systematic review, we for the first time synthesize functional neuroimaging studies on the neural bases of expressive suppression in non-clinical populations. The 12 studies included in this review contrasted the use of expressive suppression to simply watching emotional stimuli. Results showed that expressive suppression consistently increased activation of frontoparietal regions, especially the dorsolateral and ventrolateral prefrontal cortices and inferior parietal cortex, but decreased activation in temporo-occipital areas. Results regarding the involvement of the insula and amygdala were inconsistent with studies showing increased, decreased, or no changes in activation. These mixed findings underscore the importance of distinguishing expressive suppression from other forms of suppression and highlight the need to pay more attention to experimental design and neuroimaging data analysis procedures. We discuss these conceptual and methodological issues and provide suggestions for future research.
Collapse
Affiliation(s)
- Pilleriin Sikka
- Department of Psychology, Stanford University, 94305, USA; Department of Psychology, University of Turku, 20014, Finland; Department of Cognitive Neuroscience and Philosophy, University of Skövde, 541 28, Sweden.
| | - Jonathan Stenberg
- Department of Cognitive Neuroscience and Philosophy, University of Skövde, 541 28, Sweden
| | - Victor Vorobyev
- Turku University Hospital, 20521, Finland; Department of Radiology, University of Turku, 20520, Finland
| | - James J Gross
- Department of Psychology, Stanford University, 94305, USA
| |
Collapse
|
33
|
Li Y, Zhang M, Liu S, Luo W. EEG decoding of multidimensional information from emotional faces. Neuroimage 2022; 258:119374. [PMID: 35700944 DOI: 10.1016/j.neuroimage.2022.119374] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/07/2022] [Revised: 06/03/2022] [Accepted: 06/10/2022] [Indexed: 10/18/2022] Open
Abstract
Humans can detect and recognize faces quickly, but there has been little research on the temporal dynamics with which different dimensions of face information are extracted. The present study aimed to investigate the time course of neural responses to the representation of different dimensions of face information, such as age, gender, emotion, and identity. We used support vector machine decoding to obtain representational dissimilarity matrices of event-related potential responses to different faces for each subject over time. In addition, we performed representational similarity analysis with the model representational dissimilarity matrices that contained different dimensions of face information. Three significant findings were observed. First, the extraction process of facial emotion occurred before that of facial identity and lasted for a long time, which was specific to the right frontal region. Second, arousal was preferentially extracted before valence during the processing of facial emotional information. Third, different dimensions of face information exhibited representational stability during different periods. In conclusion, these findings reveal the precise temporal dynamics of multidimensional information processing in faces and provide powerful support for computational models of emotional face perception.
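The decoding-plus-RSA pipeline the abstract describes, pairwise classifier accuracies forming a neural representational dissimilarity matrix (RDM) that is then compared against a model RDM, can be illustrated on toy data. The conditions, feature dimensionality, and effect sizes below are invented for illustration; the study used time-resolved ERP features.

```python
# Condensed sketch (toy ERP-like features) of decoding-based RSA: pairwise
# SVM accuracies form a neural RDM, correlated with a model RDM for emotion.
import numpy as np
from itertools import combinations
from scipy.stats import spearmanr
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
conds = ["angry_young", "angry_old", "happy_young", "happy_old"]
emotion = [0, 0, 1, 1]
# Condition prototypes: the emotion dimension drives most of the signal.
protos = {c: 0.3 * rng.standard_normal(16) + 2.0 * e
          for c, e in zip(conds, emotion)}

def trials(c, n=20):
    return protos[c] + rng.standard_normal((n, 16))

neural_rdm, model_rdm = [], []
for a, b in combinations(range(4), 2):
    X = np.vstack([trials(conds[a]), trials(conds[b])])
    y = np.array([0] * 20 + [1] * 20)
    neural_rdm.append(cross_val_score(SVC(kernel="linear"), X, y, cv=4).mean())
    model_rdm.append(int(emotion[a] != emotion[b]))  # emotion model RDM

rho, _ = spearmanr(neural_rdm, model_rdm)
print(f"RSA correlation with emotion model: rho = {rho:.2f}")
```

Decoding accuracy serves directly as the dissimilarity measure: condition pairs the classifier separates well are, by assumption, represented more distinctly.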
Collapse
Affiliation(s)
- Yiwen Li
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China; Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, China
| | - Mingming Zhang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China; Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, China
| | - Shuaicheng Liu
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China; Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, China
| | - Wenbo Luo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China; Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, China.
| |
Collapse
|
34
|
Rainer LJ, Kronbichler M, Kuchukhidze G, Trinka E, Langthaler PB, Kronbichler L, Said-Yuerekli S, Kirschner M, Zimmermann G, Höfler J, Schmid E, Braun M. Emotional Word Processing in Patients With Juvenile Myoclonic Epilepsy. Front Neurol 2022; 13:875950. [PMID: 35720080 PMCID: PMC9201996 DOI: 10.3389/fneur.2022.875950] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/14/2022] [Accepted: 05/10/2022] [Indexed: 11/13/2022] Open
Abstract
Objective According to Panksepp's hierarchical emotion model, emotion processing relies on three functionally and neuroanatomically distinct levels. These levels comprise subcortical networks (primary level), the limbic system (secondary level), and the neocortex (tertiary level) and are suggested to serve differential emotional processing. We aimed to validate and extend previous evidence of discrete and dimensional emotion processing in patients with juvenile myoclonic epilepsy (JME). Methods We recorded brain activity of patients with JME and healthy controls during lexical decisions to words reflecting the discrete emotion fear and the affective dimension negativity, previously suggested to rely on different brain regions and to reflect different levels of processing. In all study participants, we tested verbal cognitive functions as well as the relationship of psychiatric conditions, seizure types, and duration of epilepsy to emotional word processing. Results In support of the hierarchical emotion model, we found an interaction of discrete emotion and affective dimensional processing in the right amygdala, likely reflecting secondary level processing. Brain activity related to affective dimensional processing was found in the right inferior frontal gyrus and is suggested to reflect tertiary level processing. Neither psychiatric conditions, seizure type, mono- vs. polytherapy, nor duration of epilepsy had any effect on the processing of emotional words within patients. In addition, no differences in brain activity or response times between patients and controls were observed, even though neuropsychological testing revealed slightly decreased verbal intelligence, verbal fluency and reading speed in patients with JME. Significance These results were interpreted to be in line with the hierarchical emotion model and to highlight the amygdala's role in processing biologically relevant stimuli, as well as to suggest a semantic foundation of affective dimensional processing in the prefrontal cortex. A lack of differences in brain activity between patients with JME and healthy controls in response to the emotional content of words could point to unaffected implicit emotion processing in patients with JME.
Collapse
Affiliation(s)
- Lucas Johannes Rainer
- Department of Neurology, Christian-Doppler Medical Centre, Paracelsus Medical University, Centre for Cognitive Neuroscience Salzburg, Member of the European Reference Network, Epicare, Salzburg, Austria
- Neuroscience Institute, Christian-Doppler Medical Centre, Paracelsus Medical University, Centre for Cognitive Neuroscience, Salzburg, Austria
- Department of Psychiatry, Psychotherapy & Psychosomatics, Christian-Doppler Medical Centre, Paracelsus Medical University, Salzburg, Austria
| | - Martin Kronbichler
- Neuroscience Institute, Christian-Doppler Medical Centre, Paracelsus Medical University, Centre for Cognitive Neuroscience, Salzburg, Austria
- Department of Psychology, Naturwissenschaftliche Fakultaet, Centre for Cognitive Neuroscience, Paris-Lodron University, Salzburg, Austria
| | - Giorgi Kuchukhidze
- Department of Neurology, Christian-Doppler Medical Centre, Paracelsus Medical University, Centre for Cognitive Neuroscience Salzburg, Member of the European Reference Network, Epicare, Salzburg, Austria
| | - Eugen Trinka
- Department of Neurology, Christian-Doppler Medical Centre, Paracelsus Medical University, Centre for Cognitive Neuroscience Salzburg, Member of the European Reference Network, Epicare, Salzburg, Austria
- Neuroscience Institute, Christian-Doppler Medical Centre, Paracelsus Medical University, Centre for Cognitive Neuroscience, Salzburg, Austria
- Department of Public Health, Health Services Research and Health Technology Assessment, UMIT–University for Health Sciences, Medical Informatics and Technology, Hall in Tirol, Austria
- Karl-Landsteiner Institute for Neurorehabilitation and Space Neurology, Salzburg, Austria
| | - Patrick Benjamin Langthaler
- Department of Neurology, Christian-Doppler Medical Centre, Paracelsus Medical University, Centre for Cognitive Neuroscience Salzburg, Member of the European Reference Network, Epicare, Salzburg, Austria
- Department of Mathematics, Paris-Lodron University, Naturwissenschaftliche Fakultaet, Salzburg, Austria
| | - Lisa Kronbichler
- Neuroscience Institute, Christian-Doppler Medical Centre, Paracelsus Medical University, Centre for Cognitive Neuroscience, Salzburg, Austria
- Department of Psychiatry, Psychotherapy & Psychosomatics, Christian-Doppler Medical Centre, Paracelsus Medical University, Salzburg, Austria
| | - Sarah Said-Yuerekli
- Department of Neurology, Christian-Doppler Medical Centre, Paracelsus Medical University, Centre for Cognitive Neuroscience Salzburg, Member of the European Reference Network, Epicare, Salzburg, Austria
- Department of Psychology, Naturwissenschaftliche Fakultaet, Centre for Cognitive Neuroscience, Paris-Lodron University, Salzburg, Austria
| | - Margarita Kirschner
- Department of Neurology, Christian-Doppler Medical Centre, Paracelsus Medical University, Centre for Cognitive Neuroscience Salzburg, Member of the European Reference Network, Epicare, Salzburg, Austria
| | - Georg Zimmermann
- Team Biostatistics and Big Medical Data, IDA Lab Salzburg, Paracelsus Medical University, Salzburg, Austria
- Research and Innovation Management, Paracelsus Medical University, Salzburg, Austria
| | - Julia Höfler
- Department of Neurology, Christian-Doppler Medical Centre, Paracelsus Medical University, Centre for Cognitive Neuroscience Salzburg, Member of the European Reference Network, Epicare, Salzburg, Austria
| | - Elisabeth Schmid
- Department of Neurology, Christian-Doppler Medical Centre, Paracelsus Medical University, Centre for Cognitive Neuroscience Salzburg, Member of the European Reference Network, Epicare, Salzburg, Austria
| | - Mario Braun
- Department of Psychology, Naturwissenschaftliche Fakultaet, Centre for Cognitive Neuroscience, Paris-Lodron University, Salzburg, Austria
| |
Collapse
|
35
|
Fear Detection in Multimodal Affective Computing: Physiological Signals versus Catecholamine Concentration. SENSORS 2022; 22:s22114023. [PMID: 35684644 PMCID: PMC9183081 DOI: 10.3390/s22114023] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 04/30/2022] [Revised: 05/16/2022] [Accepted: 05/18/2022] [Indexed: 12/04/2022]
Abstract
Affective computing through physiological-signal monitoring is currently a hot topic in both the scientific literature and industry. Many wearable devices are being developed for health or wellness tracking during daily life or sports activity. Likewise, other applications are being proposed for the early detection of risk situations involving sexual or violent aggression, through the identification of panic or fear emotions. The use of other sources of information, such as video or audio signals, will make multimodal affective computing a more powerful tool for emotion classification, improving detection capability. There are other biological elements that have not yet been explored and that could provide additional information to better disentangle negative emotions such as fear or panic. Catecholamines are hormones produced by the adrenal glands, two small glands located above the kidneys; they are released in the body in response to physical or emotional stress. The main catecholamines, namely adrenaline, noradrenaline and dopamine, were analysed, as well as four physiological variables: skin temperature, electrodermal activity, blood volume pulse (to calculate heart rate activity, i.e., beats per minute) and respiration rate. This work presents a comparison of the results provided by the analysis of physiological signals against catecholamine concentrations, from an experimental task with 21 female volunteers receiving audiovisual stimuli in an immersive virtual-reality environment. Artificial intelligence algorithms for fear classification using physiological variables and plasma catecholamine concentration levels were proposed and tested. The best results were obtained with the features extracted from the physiological variables; adding the catecholamines' maximum variation during the five minutes after the video-clip visualization, or adding the five measurements (1-min interval) of these levels, did not improve classifier performance.
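The fear-classification pipeline the abstract describes (features extracted from physiological signals, fed to an AI classifier and evaluated by cross-validation) can be sketched roughly as follows. This is a minimal illustration on synthetic data; the random-forest choice, feature layout, and trial counts are assumptions, not the authors' exact setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for per-trial features extracted from the four
# physiological signals: skin temperature, electrodermal activity,
# heart rate (from blood volume pulse), and respiration rate.
n_trials = 84  # illustrative, e.g., 21 participants x 4 clips
X = rng.normal(size=(n_trials, 4))
y = rng.integers(0, 2, size=n_trials)  # 1 = fear-eliciting clip, 0 = neutral

# Cross-validated accuracy of a generic classifier on these features.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

With real data, the catecholamine concentrations would simply be appended as extra columns of `X`; the abstract reports that doing so did not improve performance.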
Collapse
|
36
|
Tu G, Wen J, Liu H, Chen S, Zheng L, Jiang D. Exploration meets exploitation: Multitask learning for emotion recognition based on discrete and dimensional models. Knowl Based Syst 2022. [DOI: 10.1016/j.knosys.2021.107598] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/01/2023]
|
37
|
Peng-Li D, Andersen T, Finlayson G, Byrne DV, Wang QJ. The impact of environmental sounds on food reward. Physiol Behav 2021; 245:113689. [PMID: 34954199 DOI: 10.1016/j.physbeh.2021.113689] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/11/2021] [Revised: 10/01/2021] [Accepted: 12/22/2021] [Indexed: 12/19/2022]
Abstract
Wanting and liking are both components of food reward, but they manifest in fundamentally different neural substrates. While wanting denotes anticipatory and motivational behaviors, liking is associated with consummatory and hedonic experiences. These distinct constructs have also been quantitatively dissociated in behavioral paradigms. Indeed, internal, physiological, and interoceptive states affect the degree to which the food presented is valued. However, how contextual sensory cues might impact these appetitive and rewarding responses to food remains unexplored. In light of the increasing empirical focus on sound in food research, we investigated the influence of environmental soundscapes on explicit liking, explicit wanting, implicit wanting, choice frequency, and reaction time for healthy/unhealthy food using an online version of the Leeds Food Preference Questionnaire (LFPQ). Soft nature sounds and loud restaurant noises were employed to induce emotional relaxation and arousal, respectively. One hundred and one healthy university students completed a repeated-measures design of the LFPQ, once with each soundscape playing in the background. Generalized linear mixed model analyses detected a significant interaction effect between soundscape and food type on choice frequency, yet the post hoc analyses did not reach significance. No interaction effects between soundscape and food type on wanting or liking were discovered. However, hypothesis-driven analyses found that nature sounds increased explicit liking of healthy (vs. unhealthy) foods, while no effect of soundscape on any wanting measure (explicit or implicit) was observed. Finally, exploratory analyses indicated that restaurant noise (vs. nature sound) induced faster response times for both healthy and unhealthy foods. The study exemplifies that, in an online setting, contextual auditory manipulation of certain food reward measures and decision processes is feasible.
Collapse
Affiliation(s)
- Danni Peng-Li
- Food Quality Perception & Society Team, iSENSE Lab, Department of Food Science, Aarhus University, Agro Food Park 48, Aarhus, 8200 Denmark; Sino-Danish College (SDC), University of Chinese Academy of Sciences, Beijing, China; Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China.
| | - Tjark Andersen
- Food Quality Perception & Society Team, iSENSE Lab, Department of Food Science, Aarhus University, Agro Food Park 48, Aarhus, 8200 Denmark; Sino-Danish College (SDC), University of Chinese Academy of Sciences, Beijing, China
| | - Graham Finlayson
- Appetite & Energy Balance Research Group, University of Leeds, Leeds, United Kingdom
| | - Derek Victor Byrne
- Food Quality Perception & Society Team, iSENSE Lab, Department of Food Science, Aarhus University, Agro Food Park 48, Aarhus, 8200 Denmark; Sino-Danish College (SDC), University of Chinese Academy of Sciences, Beijing, China
| | - Qian Janice Wang
- Food Quality Perception & Society Team, iSENSE Lab, Department of Food Science, Aarhus University, Agro Food Park 48, Aarhus, 8200 Denmark; Sino-Danish College (SDC), University of Chinese Academy of Sciences, Beijing, China
| |
Collapse
|
38
|
Saarimäki H, Glerean E, Smirnov D, Mynttinen H, Jääskeläinen IP, Sams M, Nummenmaa L. Classification of emotion categories based on functional connectivity patterns of the human brain. Neuroimage 2021; 247:118800. [PMID: 34896586 PMCID: PMC8803541 DOI: 10.1016/j.neuroimage.2021.118800] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/20/2021] [Revised: 12/05/2021] [Accepted: 12/08/2021] [Indexed: 12/01/2022] Open
Abstract
Neurophysiological and psychological models posit that emotions depend on connections across widespread corticolimbic circuits. While previous studies using pattern recognition on neuroimaging data have shown differences between various discrete emotions in brain activity patterns, less is known about the differences in functional connectivity. Thus, we employed multivariate pattern analysis on functional magnetic resonance imaging data (i) to develop a pipeline for applying pattern recognition to functional connectivity data, and (ii) to test whether connectivity patterns differ across emotion categories. Six emotions (anger, fear, disgust, happiness, sadness, and surprise) and a neutral state were induced in 16 participants using one-minute-long emotional narratives with natural prosody while brain activity was measured with functional magnetic resonance imaging (fMRI). We computed emotion-wise connectivity matrices both for whole-brain connections and for 10 previously defined functionally connected brain subnetworks, and trained an across-participant classifier to categorize the emotional states based on whole-brain data and for each subnetwork separately. The whole-brain classifier performed above chance level for all emotions except sadness, suggesting that different emotions are characterized by differences in large-scale connectivity patterns. When focusing on the connectivity within the 10 subnetworks, classification was successful within the default mode system for all emotions. We thus show preliminary evidence for consistently different sustained functional connectivity patterns for instances of emotion categories, particularly within the default mode system.
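The across-participant connectivity classification described above can be sketched as follows: vectorize each symmetric connectivity matrix (upper triangle only), then train and test across subjects with leave-one-subject-out cross-validation. The synthetic data, ROI count, and linear-SVM choice are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_subjects, n_states, n_rois = 16, 7, 20  # 6 emotions + neutral; ROI count assumed

# One connectivity matrix per subject and emotional state; since the matrix
# is symmetric, the upper triangle is the feature vector.
iu = np.triu_indices(n_rois, k=1)
X, y, groups = [], [], []
for subject in range(n_subjects):
    for state in range(n_states):
        conn = rng.normal(size=(n_rois, n_rois))
        conn = (conn + conn.T) / 2  # symmetrize, as a real connectivity matrix would be
        X.append(conn[iu])
        y.append(state)
        groups.append(subject)
X, y, groups = np.array(X), np.array(y), np.array(groups)

# Across-participant classification: train on all subjects but one, test on
# the held-out subject, so the classifier must generalize across people.
clf = LinearSVC(max_iter=5000)
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(scores.mean())  # chance level here is 1/7
```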
Collapse
Affiliation(s)
- Heini Saarimäki
- Faculty of Social Sciences, Tampere University, FI-33014 Tampere University, Tampere, Finland; Department of Neuroscience and Biomedical Engineering, School of Science, Aalto University, Espoo, Finland.
| | - Enrico Glerean
- Department of Neuroscience and Biomedical Engineering, School of Science, Aalto University, Espoo, Finland; Advanced Magnetic Imaging (AMI) Centre, Aalto NeuroImaging, School of Science, Aalto University, Espoo, Finland; Turku PET Centre and Department of Psychology, University of Turku, Turku, Finland; Department of Computer Science, School of Science, Aalto University, Espoo, Finland; International Laboratory of Social Neurobiology, Institute for Cognitive Neuroscience, HSE University, Moscow, Russian Federation
| | - Dmitry Smirnov
- Department of Neuroscience and Biomedical Engineering, School of Science, Aalto University, Espoo, Finland
| | - Henri Mynttinen
- Department of Neuroscience and Biomedical Engineering, School of Science, Aalto University, Espoo, Finland
| | - Iiro P Jääskeläinen
- Department of Neuroscience and Biomedical Engineering, School of Science, Aalto University, Espoo, Finland; International Laboratory of Social Neurobiology, Institute for Cognitive Neuroscience, HSE University, Moscow, Russian Federation
| | - Mikko Sams
- Department of Neuroscience and Biomedical Engineering, School of Science, Aalto University, Espoo, Finland; Department of Computer Science, School of Science, Aalto University, Espoo, Finland
| | - Lauri Nummenmaa
- Turku PET Centre and Department of Psychology, University of Turku, Turku, Finland
| |
Collapse
|
39
|
The representational dynamics of perceived voice emotions evolve from categories to dimensions. Nat Hum Behav 2021; 5:1203-1213. [PMID: 33707658 PMCID: PMC7611700 DOI: 10.1038/s41562-021-01073-0] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/06/2020] [Accepted: 02/08/2021] [Indexed: 01/31/2023]
Abstract
Long-standing theories in affective science conceive of the perception of emotional stimuli either as discrete categories (for example, an angry voice) or as continuous dimensional attributes (for example, an intense and negative vocal emotion). Which position provides a better account is still widely debated. Here we contrast the positions to account for acoustics-independent perceptual and cerebral representational geometry of perceived voice emotions. We combined multimodal imaging of the cerebral response to heard vocal stimuli (using functional magnetic resonance imaging and magneto-encephalography) with post-scanning behavioural assessment of voice emotion perception. By using representational similarity analysis, we find that categories prevail in perceptual and early (less than 200 ms) frontotemporal cerebral representational geometries and that dimensions impinge predominantly on a later limbic-temporal network (at 240 ms and after 500 ms). These results reconcile the two opposing views by reframing the perception of emotions as the interplay of cerebral networks with different representational dynamics that emphasize either categories or dimensions.
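Representational similarity analysis, the method named above, compares the geometry of two representations by correlating their representational dissimilarity matrices (RDMs). A minimal sketch on made-up data (the pattern sizes and the correlation-distance/Spearman choices are common defaults, assumed here rather than taken from the paper):

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_stimuli = 10

# Hypothetical response patterns to the same stimuli in two "systems",
# e.g., behavioural ratings vs. the cerebral response at one latency.
patterns_a = rng.normal(size=(n_stimuli, 50))
patterns_b = patterns_a + rng.normal(scale=2.0, size=(n_stimuli, 50))

# RDMs: pairwise dissimilarity (1 - correlation) between stimulus patterns.
# pdist returns the vectorized upper triangle directly.
rdm_a = pdist(patterns_a, metric="correlation")
rdm_b = pdist(patterns_b, metric="correlation")

# RSA compares the two geometries via rank correlation of the RDMs;
# a high rho means the two systems order stimulus pairs similarly.
rho, p = spearmanr(rdm_a, rdm_b)
print(rho)
```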
Collapse
|
40
|
Tozzi L, Tuzhilina E, Glasser MF, Hastie TJ, Williams LM. Relating whole-brain functional connectivity to self-reported negative emotion in a large sample of young adults using group regularized canonical correlation analysis. Neuroimage 2021; 237:118137. [PMID: 33951512 PMCID: PMC8536403 DOI: 10.1016/j.neuroimage.2021.118137] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/23/2020] [Revised: 04/20/2021] [Accepted: 04/21/2021] [Indexed: 12/15/2022] Open
Abstract
The goal of our study was to use functional connectivity to map brain function to self-reports of negative emotion. In a large dataset of healthy individuals derived from the Human Connectome Project (N = 652), we first quantified functional connectivity during a negative face-matching task to isolate patterns induced by emotional stimuli. Then, we did the same in a complementary task-free resting state condition. To identify the relationship between functional connectivity in these two conditions and self-reports of negative emotion, we introduce group regularized canonical correlation analysis (GRCCA), a novel algorithm extending canonical correlation analysis to model the shared common properties of functional connectivity within established brain networks. To minimize overfitting, we optimized the regularization parameters of GRCCA using cross-validation and tested the significance of our results in a held-out portion of the dataset using permutations. GRCCA consistently outperformed plain regularized canonical correlation analysis. The only canonical correlation that generalized to the held-out test set was based on resting state data (r = 0.175, permutation test p = 0.021). This canonical correlation loaded primarily on Anger-aggression. It showed high loadings in the cingulate, orbitofrontal, superior parietal, auditory and visual cortices, as well as in the insula. Subcortically, we observed high loadings in the globus pallidus. Regarding brain networks, it loaded primarily on the primary visual, orbito-affective and ventral multimodal networks. Here, we present the first neuroimaging application of GRCCA, a novel algorithm for regularized canonical correlation analysis that takes into account grouping of the variables during the regularization scheme. Using GRCCA, we demonstrate that functional connections involving the visual, orbito-affective and multimodal networks are promising targets for investigating functional correlates of subjective anger and aggression. Crucially, our approach and findings also highlight the need for cross-validation, regularization, and testing on held-out data in correlational neuroimaging studies to avoid inflated effects.
Collapse
Affiliation(s)
- Leonardo Tozzi
- Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford, USA
| | - Elena Tuzhilina
- Department of Statistics, Stanford University, Stanford, USA
| | - Matthew F Glasser
- Departments of Radiology and Neuroscience, Washington University, St. Louis, USA
| | - Trevor J Hastie
- Department of Statistics, Stanford University, Stanford, USA
| | - Leanne M Williams
- Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford, USA; Sierra-Pacific Mental Illness Research, Education and Clinical Center (MIRECC) Veterans Affairs Palo Alto Health Care System, Palo Alto, CA, USA.
| |
Collapse
|
41
|
Satpute AB, Lindquist KA. At the Neural Intersection Between Language and Emotion. AFFECTIVE SCIENCE 2021; 2:207-220. [PMID: 36043170 PMCID: PMC9382959 DOI: 10.1007/s42761-021-00032-2] [Citation(s) in RCA: 19] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/07/2020] [Accepted: 01/25/2021] [Indexed: 10/21/2022]
Abstract
What role does language play in emotion? Behavioral research shows that emotion words such as "anger" and "fear" alter emotion experience, but questions still remain about mechanism. Here, we review the neuroscience literature to examine whether neural processes associated with semantics are also involved in emotion. Our review suggests that brain regions involved in the semantic processing of words: (i) are engaged during experiences of emotion, (ii) coordinate with brain regions involved in affect to create emotions, (iii) hold representational content for emotion, and (iv) may be necessary for constructing emotional experience. We relate these findings with respect to four theoretical relationships between language and emotion, which we refer to as "non-interactive," "interactive," "constitutive," and "deterministic." We conclude that findings are most consistent with the interactive and constitutive views with initial evidence suggestive of a constitutive view, in particular. We close with several future directions that may help test hypotheses of the constitutive view.
Collapse
Affiliation(s)
- Ajay B. Satpute
- Department of Psychology, Northeastern University, 360 Huntington Ave, 125 NI, Boston, MA 02115 USA
| | - Kristen A. Lindquist
- Department of Psychology and Neuroscience, University of North Carolina, Chapel Hill, NC USA
| |
Collapse
|
42
|
Facial expression recognition: A meta-analytic review of theoretical models and neuroimaging evidence. Neurosci Biobehav Rev 2021; 127:820-836. [PMID: 34052280 DOI: 10.1016/j.neubiorev.2021.05.023] [Citation(s) in RCA: 25] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/01/2020] [Revised: 04/03/2021] [Accepted: 05/24/2021] [Indexed: 11/23/2022]
Abstract
Discrimination of facial expressions is an elementary function of the human brain. While the way emotions are represented in the brain has long been debated, common and specific neural representations in recognition of facial expressions are also complicated. To examine brain organization and asymmetry for discrete and dimensional facial emotions, we conducted an activation likelihood estimation meta-analysis and meta-analytic connectivity modelling on 141 studies with a total of 3138 participants. We found consistent engagement of the amygdala and a common set of brain networks across discrete and dimensional emotions. We also found left-hemisphere dominance of the amygdala and anterior insula (AI) across categories of facial expression, but category-specific lateralization of the vmPFC, suggesting flexibly asymmetrical neural representations of facial expression recognition. These results converge on characteristic activation and connectivity patterns across discrete and dimensional emotion categories in recognition of facial expressions. Our findings provide the first quantitative meta-analytic, brain network-based evidence supporting the psychological constructionist hypothesis in facial expression recognition.
Collapse
|
43
|
Trakas M. No trace beyond their name? Affective Memories, a forgotten concept. ANNEE PSYCHOLOGIQUE 2021. [DOI: 10.3917/anpsy1.212.0129] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022]
|
44
|
Contactless differentiation of pleasant and unpleasant valence: Assessment of the acoustic startle eyeblink response with infrared reflectance oculography. Behav Res Methods 2021; 53:2092-2104. [PMID: 33754323 DOI: 10.3758/s13428-021-01555-z] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 01/27/2021] [Indexed: 11/08/2022]
Abstract
The ability to distinguish between discrete emotions by monitoring autonomic or facial features has been an elusive "holy grail" for fields such as psychophysiology, affective computing, and human-computer interface design. However, cross-validated models are lacking, and contemporary theory suggests that emotions may lack distinct physiological or facial "signatures." Therefore, in this study, we propose a reorientation toward distinguishing between pleasant and unpleasant affective valence. We focus on the acoustic eyeblink response, which exhibits affective modulation but remains underutilized. The movement of the eyelid was monitored in a contactless manner via infrared reflectance oculography at 1 kHz while 36 participants viewed normatively pleasant, neutral, and unpleasant images, and 50-ms bursts of white noise were presented binaurally via headphones. Startle responses while viewing pleasant images exhibited significantly smaller amplitudes than those while viewing unpleasant images, with a large effect size (d = 1.56). The affective modulation of the eyeblink startle response is a robust phenomenon that can be assessed in a contactless manner. As research continues on whether systems based on psychophysiological or facial features can distinguish between discrete emotions, the eyeblink startle response offers a relatively simple way to distinguish between pleasant and unpleasant affective valence.
Collapse
|
45
|
Aberrant state-related dynamic amplitude of low-frequency fluctuations of the emotion network in major depressive disorder. J Psychiatr Res 2021; 133:23-31. [PMID: 33307351 DOI: 10.1016/j.jpsychires.2020.12.003] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/12/2020] [Revised: 11/25/2020] [Accepted: 12/01/2020] [Indexed: 12/17/2022]
Abstract
Major depressive disorder (MDD) is a highly prevalent mental disorder that is typically characterized by pervasive and persistent low mood. This durable emotional disturbance may represent a key aspect of the neuropathology of MDD, typified by the wide-ranging distribution of brain alterations involved in emotion processing. However, little is known about whether these alterations are represented as the state properties of dynamic amplitude of low-frequency fluctuation (dALFF) variability in the emotion network. To address this question, we investigated the time-varying intrinsic brain activity derived from resting-state functional magnetic resonance imaging (R-fMRI). Data were obtained from 50 MDD patients and 37 sex- and age-matched healthy controls; a sliding-window method was used to assess dALFF in the emotion network, and two reoccurring dALFF states throughout the entire R-fMRI scan were then identified using a k-means clustering method. The results showed that MDD patients had a significant decrease in dALFF variability in the emotion network and its three modules located in the lateral paralimbic, media posterior, and visual association regions. Altered state-wise dALFF was also observed in MDD patients. Specifically, we found that these altered dALFF measurements in the emotion network were related to scores on the Hamilton Rating Scale for Depression (HAMD) among patients with MDD. The detection and estimation of these temporal dynamic alterations could advance our knowledge about the brain mechanisms underlying emotional dysfunction in MDD.
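The sliding-window-plus-k-means approach used in this study (and in the next entry) is a standard recipe for identifying recurring brain states: compute a measure within each window of the time series, then cluster the windowed measures. A minimal sketch on synthetic data, using the per-region windowed signal amplitude as a crude stand-in for the dALFF measure; window length, step, and k are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_timepoints, n_regions = 200, 10
window, step = 30, 5  # in TRs; illustrative values

ts = rng.normal(size=(n_timepoints, n_regions))  # stand-in for R-fMRI signals

# Sliding window: one feature vector per window. Here each feature is the
# per-region signal amplitude within the window (a rough dALFF-like proxy;
# the real measure is computed in the low-frequency band).
windows = []
for start in range(0, n_timepoints - window + 1, step):
    seg = ts[start:start + window]
    windows.append(seg.std(axis=0))
windows = np.array(windows)

# k-means over all windows yields recurring "states"; each window gets a
# state label, from which per-state occurrence proportions follow.
k = 2
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(windows)
proportions = np.bincount(labels, minlength=k) / len(labels)
print(proportions)
```

In the study, the clustering is run over windows pooled across all participants, so the same states can be compared between patients and controls.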
Collapse
|
46
|
Zhao L, Wang D, Xue SW, Tan Z, Luo H, Wang Y, Li H, Pan C, Fu S, Hu X, Lan Z, Xiao Y, Kuai C. Antidepressant Treatment-Induced State-Dependent Reconfiguration of Emotion Regulation Networks in Major Depressive Disorder. Front Psychiatry 2021; 12:771147. [PMID: 35069281 PMCID: PMC8770425 DOI: 10.3389/fpsyt.2021.771147] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 09/05/2021] [Accepted: 11/23/2021] [Indexed: 11/13/2022] Open
Abstract
Deficits in emotion regulation are among the main clinical features, common risk factors, and treatment-related targets for major depressive disorder (MDD). The neural bases of emotion regulation are moving beyond specific functions and emphasizing instead the integrative functions of spatially distributed brain areas that work together as large-scale brain networks, but it is still unclear whether the dynamic interactions among these emotion networks would be the target of clinical intervention for MDD. Data were collected from 70 MDD patients and 43 sex- and age-matched healthy controls. The dynamic functional connectivity (dFC) between emotion regions was estimated via a sliding-window method based on resting-state functional magnetic resonance imaging (R-fMRI). A k-means clustering method was applied to classify all time windows across all participants into several dFC states reflecting recurring functional interaction patterns among emotion regions over time. The results showed that four dFC states were identified in the emotion networks. Alterations in their state-related occurrence proportions were found in MDD and subsequently normalized following 12-week antidepressant treatment. Strong baseline dFC predicted the reduction rate of Hamilton Depression Rating Scale (HAMD) scores. These findings highlight the state-dependent reconfiguration of emotion regulation networks in MDD patients following antidepressant treatment.
Collapse
Affiliation(s)
- Lei Zhao
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, China.,Institute of Psychological Science, Hangzhou Normal University, Hangzhou, China.,Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, China
| | - Donglin Wang
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, China.,Institute of Psychological Science, Hangzhou Normal University, Hangzhou, China.,Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, China
| | - Shao-Wei Xue
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, China.,Institute of Psychological Science, Hangzhou Normal University, Hangzhou, China.,Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, China
| | - Zhonglin Tan
- Affiliated Mental Health Center & Hangzhou Seventh People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
| | - Hong Luo
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, China.,Institute of Psychological Science, Hangzhou Normal University, Hangzhou, China.,Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, China
| | - Yan Wang
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, China.,Institute of Psychological Science, Hangzhou Normal University, Hangzhou, China.,Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, China
| | - Hanxiaoran Li
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, China.,Institute of Psychological Science, Hangzhou Normal University, Hangzhou, China.,Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, China
| | - Chenyuan Pan
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, China.,Institute of Psychological Science, Hangzhou Normal University, Hangzhou, China.,Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, China
| | - Sufen Fu
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, China.,Institute of Psychological Science, Hangzhou Normal University, Hangzhou, China.,Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, China
| | - Xiwen Hu
- Affiliated Mental Health Center & Hangzhou Seventh People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
| | - Zhihui Lan
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, China.,Institute of Psychological Science, Hangzhou Normal University, Hangzhou, China.,Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, China
| | - Yang Xiao
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, China.,Institute of Psychological Science, Hangzhou Normal University, Hangzhou, China.,Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, China
| | - Changxiao Kuai
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, China.,Institute of Psychological Science, Hangzhou Normal University, Hangzhou, China.,Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairments, Hangzhou, China
| |
Collapse
|
47
|
Jeong JW, Kim HT, Lee SH, Lee H. Effects of an Audiovisual Emotion Perception Training for Schizophrenia: A Preliminary Study. Front Psychiatry 2021; 12:522094. [PMID: 34025462 PMCID: PMC8131526 DOI: 10.3389/fpsyt.2021.522094] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 12/20/2019] [Accepted: 03/18/2021] [Indexed: 11/13/2022] Open
Abstract
Individuals with schizophrenia show a reduced ability to integrate facial and vocal information in emotion perception. Although emotion perception has been a target for treatment, no study has yet examined the effect of multimodal training on emotion perception in schizophrenia. In the present study, we developed an audiovisual emotion perception training and test in which a voice and a face were simultaneously presented, and subjects were asked to judge whether the emotions of the voice and the face matched. The voices were either angry or happy, and the faces were morphed on a continuum ranging from angry to happy. Sixteen patients with schizophrenia participated in six training sessions and three test sessions (i.e., pre-training, post-training, and generalization). Eighteen healthy controls participated only in the pre-training test session. Prior to training, the patients with schizophrenia performed significantly worse than the controls in the recognition of anger; however, following the training, the patients showed a significant improvement in recognizing anger, which was maintained and generalized to a new set of stimuli. The patients also improved in the recognition of happiness following the training, but this effect was neither maintained nor generalized. These results provide preliminary evidence that multimodal, audiovisual training may yield improvements in anger perception for patients with schizophrenia.
Collapse
Affiliation(s)
- Ji Woon Jeong
- Department of Psychology, Korea University, Seoul, South Korea
| | - Hyun Taek Kim
- Department of Psychology, Korea University, Seoul, South Korea
| | - Seung-Hwan Lee
- Department of Psychiatry, Ilsan-Paik Hospital, Inje University, Goyang, South Korea
| | - Hyejeen Lee
- Department of Psychology, Chonnam National University, Gwangju, South Korea
| |
Collapse
|
48
|
Bagheri E, Esteban PG, Cao HL, Beir AD, Lefeber D, Vanderborght B. An Autonomous Cognitive Empathy Model Responsive to Users’ Facial Emotion Expressions. ACM T INTERACT INTEL 2020. [DOI: 10.1145/3341198] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/23/2022]
Abstract
Successful social robot services depend on how robots can interact with users. Effective service can be obtained through smooth, engaged, and humanoid interactions in which robots react properly to a user's affective state. This article proposes a novel Automatic Cognitive Empathy Model, ACEM, for humanoid robots to achieve longer and more engaged human-robot interactions (HRI) by considering humans' emotions and responding to them appropriately. The proposed model continuously detects the affective state of a user based on facial expressions and generates desired empathic behaviors, either parallel or reactive, that are adapted to the user's personality. Users' affective states are detected using a stacked autoencoder network that is trained and tested on the RAVDESS dataset. The overall proposed empathic model is verified through an experiment in which different emotions are triggered in participants and empathic behaviors are then applied based on the proposed hypotheses. The results confirm the effectiveness of the proposed model in terms of the social and friendship concepts that participants perceived during interaction with the robot.
Collapse
Affiliation(s)
- Elahe Bagheri
- Robotics and Multibody Mechanics Research Group, Vrije Universiteit Brussel and Flanders Make, Brussels, Belgium
| | - Pablo G. Esteban
- Robotics and Multibody Mechanics Research Group, Vrije Universiteit Brussel and Flanders Make, Brussels, Belgium
| | - Hoang-Long Cao
- Robotics and Multibody Mechanics Research Group, Vrije Universiteit Brussel and Flanders Make, Brussels, Belgium
| | - Albert De Beir
- Robotics and Multibody Mechanics Research Group, Vrije Universiteit Brussel and Flanders Make, Brussels, Belgium
| | - Dirk Lefeber
- Robotics and Multibody Mechanics Research Group, Vrije Universiteit Brussel and Flanders Make, Brussels, Belgium
| | - Bram Vanderborght
- Robotics and Multibody Mechanics Research Group, Vrije Universiteit Brussel and Flanders Make, Brussels, Belgium
| |
Collapse
|
49
|
Leitão J, Meuleman B, Van De Ville D, Vuilleumier P. Computational imaging during video game playing shows dynamic synchronization of cortical and subcortical networks of emotions. PLoS Biol 2020; 18:e3000900. [PMID: 33180768 PMCID: PMC7685507 DOI: 10.1371/journal.pbio.3000900] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/21/2020] [Revised: 11/24/2020] [Accepted: 10/06/2020] [Indexed: 01/09/2023] Open
Abstract
Emotions are multifaceted phenomena affecting mind, body, and behavior. Previous studies sought to link particular emotion categories (e.g., fear) or dimensions (e.g., valence) to specific brain substrates but generally found distributed and overlapping activation patterns across various emotions. In contrast, distributed patterns accord with multi-componential theories whereby emotions emerge from appraisal processes triggered by current events, combined with motivational, expressive, and physiological mechanisms orchestrating behavioral responses. According to this framework, components are recruited in parallel and dynamically synchronized during emotion episodes. Here, we use functional MRI (fMRI) to investigate brain-wide systems engaged by theoretically defined components and measure their synchronization during an interactive emotion-eliciting video game. We show that each emotion component recruits large-scale cortico-subcortical networks, and that moments of dynamic synchronization between components selectively engage basal ganglia, sensory-motor structures, and midline brain areas. These neural results support theoretical accounts grounding emotions onto embodied and action-oriented functions triggered by synchronized component processes.
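As a rough illustration of "moments of dynamic synchronization" between component time series, a sliding-window correlation index could look like the following; the window length and the synthetic signals are assumptions for demonstration, not the authors’ actual fMRI measure:

```python
import numpy as np

def sliding_window_sync(x, y, win):
    """Pearson correlation between two component time series in
    overlapping windows -- a simple index of when the components
    are momentarily synchronized."""
    n = len(x) - win + 1
    out = np.empty(n)
    for t in range(n):
        out[t] = np.corrcoef(x[t:t + win], y[t:t + win])[0, 1]
    return out
```

Peaks of this index mark candidate synchronization episodes; in the study’s framework, such episodes would be the moments expected to engage basal ganglia and sensory-motor structures.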
Collapse
Affiliation(s)
- Joana Leitão
- Laboratory for Behavioral Neurology and Imaging of Cognition, Department of Fundamental Neuroscience, University of Geneva, Geneva, Switzerland.,Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
| | - Ben Meuleman
- Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
| | - Dimitri Van De Ville
- Institute of Bioengineering, Center for Neuroprosthetics, École Polytechnique Fédérale de Lausanne (EPFL), Geneva, Switzerland.,Department of Radiology and Medical Informatics, University of Geneva, Geneva, Switzerland
| | - Patrik Vuilleumier
- Laboratory for Behavioral Neurology and Imaging of Cognition, Department of Fundamental Neuroscience, University of Geneva, Geneva, Switzerland.,Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
| |
Collapse
|
50
|
Song Y, Zhong J, Jia Z, Liang D. Emotional prosody recognition in children with high-functioning autism under the influence of emotional intensity: Based on the perspective of emotional dimension theory. JOURNAL OF COMMUNICATION DISORDERS 2020; 88:106032. [PMID: 32937183 DOI: 10.1016/j.jcomdis.2020.106032] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/28/2018] [Revised: 07/08/2020] [Accepted: 07/09/2020] [Indexed: 06/11/2023]
Abstract
This paper investigated the ability of Mandarin-speaking children with high-functioning autism (HFA) to recognize four categories of emotional prosody (happiness, anger, sadness, and fear) under moderate- and high-intensity emotional conditions, using auditory discrimination tasks. Thirty-four children with HFA between 5 and 7 years of age and 34 typically developing (TD) controls participated in this study. In moderate-intensity conditions, children with HFA scored lower than TD children on all four categories of emotional prosody, indicating an overall impairment. As emotional intensity increased, children with HFA showed improved accuracy for anger, decreased accuracy for happiness, and no change in accuracy for either sadness or fear. An analysis of error patterns demonstrated that, unlike TD children, children with HFA tended to mistake happiness for anger, two categories that differ in valence, and this tendency deepened as intensity increased. In discriminating between sadness and fear, which differ only slightly in arousal, both groups showed difficulty in moderate-intensity conditions. In high-intensity conditions, TD children tended to perceive stimuli as fear, the category with comparatively high arousal, and were therefore more accurate for fear, whereas children with HFA were not sensitive to the increase in arousal and showed no comparable effect. These findings indicate that children with HFA recognize emotional prosody through a mechanism distinct from that of TD children, exhibiting various degrees of impairment in this regard.
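The error-pattern analysis described above amounts to a row-normalized confusion matrix over the four prosody categories. A minimal sketch follows; the toy stimulus/response data are illustrative, not the study’s materials:

```python
import numpy as np

EMOTIONS = ["happiness", "anger", "sadness", "fear"]

def confusion_matrix(stimuli, responses):
    """Row-normalized confusion matrix: rows are the presented emotion,
    columns the perceived emotion, entries the proportion of responses.
    Rows with no trials are left as zeros."""
    idx = {e: i for i, e in enumerate(EMOTIONS)}
    m = np.zeros((4, 4))
    for s, r in zip(stimuli, responses):
        m[idx[s], idx[r]] += 1
    totals = m.sum(axis=1, keepdims=True)
    return np.divide(m, totals, out=np.zeros_like(m), where=totals > 0)
```

With such a matrix per group and intensity condition, the happiness-to-anger confusion reported for the HFA group is simply the `[happiness, anger]` entry, and its growth with intensity can be read off directly.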
Collapse
Affiliation(s)
- Yiqi Song
- School of Chinese Language and Culture, Nanjing Normal University, PR China
| | - Jianxiu Zhong
- School of Chinese Language and Culture, Nanjing Normal University, PR China
| | - Zhongheng Jia
- Foreign Languages Department, Tongji Zhejiang College, PR China
| | - Dandan Liang
- School of Chinese Language and Culture, Nanjing Normal University, PR China; Collaborative Innovation Center for Language Ability, Jiangsu Normal University, PR China.
| |
Collapse
|