1. Deming P, Griffiths S, Jalava J, Koenigs M, Larsen RR. Psychopathy and medial frontal cortex: A systematic review reveals predominantly null relationships. Neurosci Biobehav Rev 2024; 167:105904. [PMID: 39343080; DOI: 10.1016/j.neubiorev.2024.105904]
Abstract
Theories have posited that psychopathy is caused by dysfunction in the medial frontal cortex, including ventromedial prefrontal cortex (vmPFC), anterior cingulate cortex (ACC), and dorsomedial prefrontal cortex (dmPFC). Recent reviews have questioned the reproducibility of neuroimaging findings within this field. We conducted a systematic review to describe the consistency of magnetic resonance imaging (MRI) findings according to anatomical subregion (vmPFC, ACC, dmPFC), experimental task, psychopathy assessment, study power, and peak coordinates of significant effects. Searches of PsycInfo and MEDLINE databases produced 77 functional and 24 structural MRI studies that analyzed the medial frontal cortex in relation to psychopathy in adult samples. Findings were predominantly null (85.4% of 1573 tests across the three medial frontal regions). Studies with higher power observed null effects at marginally lower rates. Finally, peak coordinates of significant effects were widely dispersed. The evidence failed to support theories positing the medial frontal cortex as a consistent neural correlate of psychopathy. Theory and methods in the field should be revised to account for predominantly null neuroimaging findings.
Affiliation(s)
- Philip Deming: Department of Psychology, Northeastern University, Boston, MA, United States
- Stephanie Griffiths: Department of Psychology, Okanagan College, Penticton, BC, Canada; Werklund School of Education, University of Calgary, Calgary, AB, Canada
- Jarkko Jalava: Department of Interdisciplinary Studies, Okanagan College, Penticton, BC, Canada
- Michael Koenigs: Department of Psychiatry, University of Wisconsin-Madison, Madison, WI, United States
- Rasmus Rosenberg Larsen: Forensic Science Program and Department of Philosophy, University of Toronto Mississauga, Mississauga, ON, Canada
2. Lee KM, Satpute AB. More than labels: neural representations of emotion words are widely distributed across the brain. Soc Cogn Affect Neurosci 2024; 19:nsae043. [PMID: 38903026; PMCID: PMC11259136; DOI: 10.1093/scan/nsae043]
Abstract
Although emotion words such as "anger," "disgust," "happiness," or "pride" are often thought of as mere labels, increasing evidence points to language as being important for emotion perception and experience. Emotion words may be particularly important for facilitating access to emotion concepts. Indeed, deficits in semantic processing or impaired access to emotion words interfere with emotion perception. Yet, it is unclear what these behavioral findings mean for affective neuroscience. Thus, we examined the brain areas that support processing of emotion words using representational similarity analysis of functional magnetic resonance imaging data (N = 25). In the task, participants saw 10 emotion words (e.g. "anger," "happiness") while in the scanner. Participants rated each word based on its valence on a continuous scale ranging from 0 (Pleasant/Good) to 1 (Unpleasant/Bad) to ensure they were processing the words. Our results revealed that a diverse range of brain areas including prefrontal, midline cortical, and sensorimotor regions contained information about emotion words. Notably, our results overlapped with many regions implicated in decoding emotion experience by prior studies. Our results raise questions about what processes are being supported by these regions during emotion experience.
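The core computation in a representational similarity analysis of this kind can be illustrated with a minimal numpy sketch: build a representational dissimilarity matrix (RDM) from condition-by-voxel activity patterns, then correlate the upper triangles of two RDMs. The function names and random data below are illustrative, not taken from the study.

```python
import numpy as np

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between each pair of condition patterns (conditions x voxels)."""
    return 1.0 - np.corrcoef(patterns)

def rsa_score(rdm_a, rdm_b):
    """Compare two RDMs by correlating their upper triangles
    (the unique condition pairs)."""
    iu = np.triu_indices_from(rdm_a, k=1)
    return float(np.corrcoef(rdm_a[iu], rdm_b[iu])[0, 1])

# Simulated patterns: 10 emotion words x 200 voxels (illustrative sizes).
rng = np.random.default_rng(0)
patterns = rng.standard_normal((10, 200))
neural = rdm(patterns)
print(rsa_score(neural, neural))  # identical RDMs -> ~1.0
```

In practice the second RDM would come from a candidate model (e.g. valence ratings) rather than the neural data itself, and a rank correlation is often preferred over Pearson.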
Affiliation(s)
- Kent M Lee: Department of Psychology, Northeastern University, 125 Nightingale Hall, Boston, MA 02115, USA
- Ajay B Satpute: Department of Psychology, Northeastern University, 125 Nightingale Hall, Boston, MA 02115, USA
3. Epp S, Walker A, Boudes E, Bray S, Noel M, Rayner L, Rasic N, Miller JV. Brain Function and Pain Interference After Pediatric Intensive Interdisciplinary Pain Treatment. Clin J Pain 2024; 40:393-399. [PMID: 38606879; DOI: 10.1097/ajp.0000000000001216]
Abstract
OBJECTIVES Intensive interdisciplinary pain treatments (IIPTs) are programs that aim to improve functioning in youth with severe chronic pain. Little is known about how the brain changes after IIPT; however, decreased brain responses to emotional stimuli have been identified previously in pediatric chronic pain relative to healthy controls. We examined whether IIPT increased brain responses to emotional stimuli, and whether this change was associated with a reduction in pain interference. PATIENTS AND METHODS Twenty youths with chronic pain aged 14 to 18 years were scanned using functional magnetic resonance imaging (fMRI), pre- and post-IIPT. During the fMRI, patients were presented with emotional stimuli (ie, faces expressing happiness/fear), neutral expressions, and control (ie, scrambled) images. Patients completed a measure of pain interference pre- and post-IIPT. Paired t tests were used to examine differences in brain activation in response to emotional versus neutral stimuli, from pre- to post-IIPT. Data from significant brain clusters were entered into linear mixed models to examine the relationships between brain activation and impairment pre- and post-IIPT. RESULTS Patients demonstrated a decrease in middle frontal gyrus (MFG) activation in response to emotional stimuli (happy + fear) relative to scrambled images, between pre- and post-IIPT (P < 0.05). Lower MFG activation was associated with lower pain interference, pre- and post-IIPT (P < 0.05). CONCLUSION Contrary to our hypothesis, IIPT was associated with a reduction in MFG activation to emotional stimuli, and this change was associated with reduced pain interference. The MFG is a highly interconnected brain area involved in both pain chronification and antinociception. With further validation of these results, the MFG may represent an important biomarker for evaluating patient treatment response and a target for future pain interventions.
Affiliation(s)
- Spencer Epp: Department of Anesthesiology, Perioperative and Pain Medicine
- Andrew Walker: Department of Anesthesiology, Perioperative and Pain Medicine
- Signe Bray: Department of Radiology, Cumming School of Medicine; Hotchkiss Brain Institute; Owerko Centre, Alberta Children's Hospital Research Institute; Alberta Children's Hospital Research Institute
- Melanie Noel: Department of Radiology, Psychology; Hotchkiss Brain Institute; Owerko Centre, Alberta Children's Hospital Research Institute; Alberta Children's Hospital Research Institute; Vi Riddell Children's Pain and Rehabilitation Centre, Alberta Children's Hospital, Calgary, AB, Canada
- Laura Rayner: Department of Anesthesiology, Perioperative and Pain Medicine
- Nivez Rasic: Department of Anesthesiology, Perioperative and Pain Medicine; Alberta Children's Hospital Research Institute; Vi Riddell Children's Pain and Rehabilitation Centre, Alberta Children's Hospital, Calgary, AB, Canada
- Jillian Vinall Miller: Department of Anesthesiology, Perioperative and Pain Medicine; Department of Radiology, Psychology; O'Brien Institute for Public Health, University of Calgary; Hotchkiss Brain Institute; Owerko Centre, Alberta Children's Hospital Research Institute; Alberta Children's Hospital Research Institute; Vi Riddell Children's Pain and Rehabilitation Centre, Alberta Children's Hospital, Calgary, AB, Canada
4. Alcalá-López D, Mei N, Margolles P, Soto D. Brain-wide representation of social knowledge. Soc Cogn Affect Neurosci 2024; 19:nsae032. [PMID: 38804694; PMCID: PMC11173195; DOI: 10.1093/scan/nsae032]
Abstract
Understanding how the human brain maps different dimensions of social conceptualizations remains a key unresolved issue. We performed a functional magnetic resonance imaging (fMRI) study in which participants were exposed to audio definitions of personality traits and asked to simulate experiences associated with the concepts. Half of the concepts were affective (e.g. empathetic), and the other half were non-affective (e.g. intelligent). Orthogonally, half of the concepts were highly likable (e.g. sincere) and half were socially undesirable (e.g. liar). Behaviourally, we observed that the dimension of social desirability reflected the participants' subjective ratings better than affect. fMRI decoding results showed that both social desirability and affect could be decoded from local patterns of activity across distributed brain regions, including the superior temporal and inferior frontal cortices, the precuneus, and key nodes of the default mode network in posterior/anterior cingulate and ventromedial prefrontal cortex. Decoding accuracy was better for social desirability than affect. A representational similarity analysis further demonstrated that a deep language model significantly predicted brain activity associated with the concepts in bilateral regions of the superior and anterior temporal lobes. The results demonstrate a brain-wide representation of social knowledge, involving default mode network systems that support the multimodal simulation of social experience, with a further reliance on language-related preprocessing.
Affiliation(s)
- Daniel Alcalá-López: Consciousness group, Basque Center on Cognition, Brain and Language, San Sebastian 20009, Spain
- Ning Mei: Psychology Department, Shenzhen University, Nanshan district, Guangdong province 3688, China
- Pedro Margolles: Consciousness group, Basque Center on Cognition, Brain and Language, San Sebastian 20009, Spain
- David Soto: Consciousness group, Basque Center on Cognition, Brain and Language, San Sebastian 20009, Spain
5. Mochalski LN, Friedrich P, Li X, Kröll JP, Eickhoff SB, Weis S. Inter- and intra-subject similarity in network functional connectivity across a full narrative movie. bioRxiv [Preprint] 2024:2024.05.14.594107. [PMID: 38798405; PMCID: PMC11118367; DOI: 10.1101/2024.05.14.594107]
Abstract
Naturalistic paradigms, such as watching movies during functional magnetic resonance imaging (fMRI), are thought to prompt the emotional and cognitive processes typically elicited in real-life situations. Therefore, naturalistic viewing (NV) holds great potential for studying individual differences. However, the extent to which NV elicits similarity within and between subjects on a network level, particularly depending on the emotions portrayed in movies, is currently unknown. We used the studyforrest dataset to investigate the inter- and intra-subject similarity in network functional connectivity (NFC) of 14 meta-analytically defined networks across a full narrative, audio-visual movie split into 8 consecutive movie segments. We characterized the movie segments by valence and arousal portrayed within the sequences, before utilizing a linear mixed model to analyze which factors explain inter- and intra-subject similarity. Our results showed that the model best explaining inter-subject similarity comprised network, movie segment, valence and a movie segment by valence interaction. Intra-subject similarity was influenced significantly by the same factors and an additional three-way interaction between movie segment, valence and arousal. Overall, inter- and intra-subject similarity in NFC were sensitive to the ongoing narrative and emotions in the movie. Lowest similarity both within and between subjects was seen in the emotional regulation network and networks associated with long-term memory processing, which might be explained by specific features and content of the movie. We conclude that detailed characterization of movie features is crucial for NV research.
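Inter-subject similarity in network functional connectivity of this general kind is typically quantified by correlating the vectorized connectivity matrices of two subjects. A minimal numpy sketch, with simulated time series standing in for fMRI data (all sizes and noise levels are illustrative, not the study's):

```python
import numpy as np

def network_fc(ts):
    """Functional connectivity: region-by-region correlation matrix
    computed from a (time x regions) array."""
    return np.corrcoef(ts.T)

def fc_similarity(fc_a, fc_b):
    """Inter-subject similarity: correlation of the two matrices'
    upper triangles (the unique region pairs)."""
    iu = np.triu_indices_from(fc_a, k=1)
    return float(np.corrcoef(fc_a[iu], fc_b[iu])[0, 1])

# Movie-driven signal shared across two subjects, plus subject-specific noise.
rng = np.random.default_rng(2)
latent = rng.standard_normal((200, 2))   # shared "narrative" drivers
mixing = rng.standard_normal((2, 6))     # 6 regions in one network
shared = latent @ mixing
subj1 = shared + 0.3 * rng.standard_normal((200, 6))
subj2 = shared + 0.3 * rng.standard_normal((200, 6))
print(fc_similarity(network_fc(subj1), network_fc(subj2)))  # high when drive is shared
```

Computing this per network and per movie segment, then modeling the similarity values with a linear mixed model, mirrors the analysis structure described in the abstract.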
6. Reddan M, Ong D, Wager T, Mattek S, Kahhale I, Zaki J. Neural signatures of emotional inference and experience align during social consensus. Research Square [Preprint] 2023:rs.3.rs-3487248. [PMID: 38014230; PMCID: PMC10680919; DOI: 10.21203/rs.3.rs-3487248/v1]
Abstract
Humans seamlessly transform dynamic social signals into inferences about the internal states of the people around them. To understand the neural processes that sustain this transformation, we collected fMRI data from participants (N = 100) while they rated the emotional intensity of people (targets) describing significant life events. Targets rated themselves on the same scale to indicate the intended "ground truth" emotional intensity of their videos. Next, we developed two multivariate models of observer brain activity: the first predicted the "ground truth" (r = 0.50, p < 0.0001), and the second predicted observer inferences (r = 0.53, p < 0.0001). When individuals make more accurate inferences, there is greater moment-by-moment concordance between these two models, suggesting that an observer's brain activity contains latent representations of other people's emotional states. Using naturalistic socioemotional stimuli and machine learning, we developed reliable brain signatures that predict what an observer thinks about a target, what the target thinks about themselves, and the correspondence between them. These signatures can be applied to clinical data to better our understanding of socioemotional dysfunction.
7. Saxena A, Shovestul BJ, Dudek EM, Reda S, Venkataraman A, Lamberti JS, Dodell-Feder D. Training volitional control of the theory of mind network with real-time fMRI neurofeedback. Neuroimage 2023; 279:120334. [PMID: 37591479; DOI: 10.1016/j.neuroimage.2023.120334]
Abstract
Is there a way to improve our ability to understand the minds of others? Towards addressing this question, we conducted a single-arm, proof-of-concept study to evaluate whether real-time fMRI neurofeedback (rtfMRI-NF) from the temporo-parietal junction (TPJ) leads to volitional control of the neural network subserving theory of mind (ToM; the process by which we attribute and reason about the mental states of others). As additional aims, we evaluated the strategies used to self-regulate the network and whether volitional control of the ToM network was moderated by participant characteristics and associated with improved performance on behavioral measures. Sixteen participants underwent fMRI while completing a task designed to individually localize the TPJ, and then three separate rtfMRI-NF scans during which they completed multiple runs of a training task while receiving intermittent, activation-based feedback from the TPJ, and one run of a transfer task in which no neurofeedback was provided. Region-of-interest analyses demonstrated volitional control in most regions during the training tasks and during the transfer task, although the effects were smaller in magnitude and not observed in one of the neurofeedback targets for the transfer task. Text analysis demonstrated that volitional control was most strongly associated with thinking about prior social experiences when up-regulating the neural signal. Analysis of behavioral performance and brain-behavior associations largely did not reveal behavior changes except for a positive association between volitional control in RTPJ and changes in performance on one ToM task. Exploratory analysis suggested neurofeedback-related learning occurred, although some degree of volitional control appeared to be conferred with the initial self-regulation strategy provided to participants (i.e., without the neurofeedback signal).
Critical study limitations include the lack of a control group and pre-rtfMRI transfer scan, which prevents a more direct assessment of neurofeedback-induced volitional control, and a small sample size, which may have led to an overestimate and/or unreliable estimate of study effects. Nonetheless, together, this study demonstrates the feasibility of training volitional control of a social cognitive brain network, which may have important clinical applications. Given the study's limitations, findings from this study should be replicated with more robust experimental designs.
Affiliation(s)
- Abhishek Saxena: Department of Psychology, University of Rochester, 500 Wilson Blvd, Rochester, NY 14627, USA
- Bridget J Shovestul: Department of Psychology, University of Rochester, 500 Wilson Blvd, Rochester, NY 14627, USA
- Emily M Dudek: Department of Psychology, University of Houston, 3695 Cullen Boulevard, Houston, TX 77204, USA
- Stephanie Reda: Department of Psychology, University of Rochester, 500 Wilson Blvd, Rochester, NY 14627, USA
- Arun Venkataraman: School of Medicine and Dentistry, University of Rochester Medical Center, 601 Elmwood Avenue, Rochester, NY 14642, USA
- J Steven Lamberti: Department of Psychiatry, University of Rochester Medical Center, 601 Elmwood Avenue, Rochester, NY 14642, USA
- David Dodell-Feder: Department of Psychology, University of Rochester, 500 Wilson Blvd, Rochester, NY 14627, USA; Department of Neuroscience, University of Rochester Medical Center, 601 Elmwood Avenue, Rochester, NY 14642, USA
8. Richardson H, Saxe R, Bedny M. Neural correlates of theory of mind reasoning in congenitally blind children. Dev Cogn Neurosci 2023; 63:101285. [PMID: 37591011; PMCID: PMC10450415; DOI: 10.1016/j.dcn.2023.101285]
Abstract
Vision is an important source of information about other minds for sighted children, especially prior to the onset of language. Visually observed actions, eye gaze, and facial expressions of others provide information about mental states, such as beliefs, desires, and emotions. Does such experience contribute causally to the development of cortical networks supporting social cognition? To address this question, we compared functional development of brain regions supporting theory of mind (ToM), as well as behavioral ToM reasoning, across congenitally blind (n=17) and sighted (n=114) children and adolescents (4-17 years old). We find that blind children in this age range show slightly lower ToM behavioral performance relative to sighted children. Likewise, the functional profile of ToM brain regions is qualitatively similar, but quantitatively weaker in blind relative to sighted children. Alongside prior research, these data suggest that vision facilitates, but is not necessary for, ToM development.
Affiliation(s)
- H Richardson: School of Philosophy, Psychology, and Language Sciences, The University of Edinburgh, United Kingdom
- R Saxe: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
- M Bedny: Department of Psychological and Brain Sciences, Johns Hopkins University, Baltimore, MD, USA
9. Du C, Fu K, Wen B, He H. Topographic representation of visually evoked emotional experiences in the human cerebral cortex. iScience 2023; 26:107571. [PMID: 37664621; PMCID: PMC10470388; DOI: 10.1016/j.isci.2023.107571]
Abstract
Affective neuroscience seeks to uncover the neural underpinnings of emotions that humans experience. However, it remains unclear whether an affective space underlies the discrete emotion categories in the human brain, and how it relates to the hypothesized affective dimensions. To address this question, we developed a voxel-wise encoding model to investigate the cortical organization of human emotions. Results revealed that the distributed emotion representations are constructed through a fundamental affective space. We further compared each dimension of this space to 14 hypothesized affective dimensions, and found that many affective dimensions are captured by the fundamental affective space. Our results suggest that emotional experiences are represented by broadly spatially overlapping cortical patterns and form smooth gradients across large areas of the cortex. This finding reveals the specific structure of the affective space and its relationship to hypothesized affective dimensions, while highlighting the distributed nature of emotional representations in the cortex.
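A voxel-wise encoding model of this general kind maps stimulus features (for example, hypothesized affective dimensions) to each voxel's response with regularized linear regression, then scores each voxel by prediction accuracy. The sketch below uses simulated data; the dimensions, ridge penalty, and variable names are illustrative choices, not the study's.

```python
import numpy as np

def fit_encoding_model(features, responses, alpha=1.0):
    """Ridge regression mapping stimulus features to every voxel at once.
    features: (n_stimuli, n_features); responses: (n_stimuli, n_voxels).
    Returns weights of shape (n_features, n_voxels)."""
    X, Y = features, responses
    n_feat = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_feat), X.T @ Y)

# Simulated data: 100 stimuli rated on 8 affective dimensions, 50 voxels.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 8))
W_true = rng.standard_normal((8, 50))
Y = X @ W_true + 0.1 * rng.standard_normal((100, 50))

W = fit_encoding_model(X, Y, alpha=0.1)
pred = X @ W
# Voxel-wise prediction accuracy: correlation of predicted vs. actual response.
r = [np.corrcoef(pred[:, v], Y[:, v])[0, 1] for v in range(Y.shape[1])]
print(np.mean(r))
```

In a real analysis the accuracy would be computed on held-out stimuli, and the fitted weight maps across voxels would define the cortical gradients the abstract describes.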
Affiliation(s)
- Changde Du: Laboratory of Brain Atlas and Brain-Inspired Intelligence, State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China
- Kaicheng Fu: Laboratory of Brain Atlas and Brain-Inspired Intelligence, State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China; School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China
- Bincheng Wen: Center for Excellence in Brain Science and Intelligence Technology, Key Laboratory of Primate Neurobiology, Institute of Neuroscience, Chinese Academy of Sciences, Shanghai 200031, China; School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China
- Huiguang He: Laboratory of Brain Atlas and Brain-Inspired Intelligence, State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China; School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China
10. Kuhnke P, Kiefer M, Hartwigsen G. Conceptual representations in the default, control and attention networks are task-dependent and cross-modal. Brain Lang 2023; 244:105313. [PMID: 37595340; DOI: 10.1016/j.bandl.2023.105313]
Abstract
Conceptual knowledge is central to human cognition. Neuroimaging studies suggest that conceptual processing involves modality-specific and multimodal brain regions in a task-dependent fashion. However, it remains unclear (1) to what extent conceptual feature representations are also modulated by the task, (2) whether conceptual representations in multimodal regions are indeed cross-modal, and (3) how the conceptual system relates to the large-scale functional brain networks. To address these issues, we conducted multivariate pattern analyses on fMRI data. Forty participants performed three tasks (lexical decision, sound judgment, and action judgment) on written words. We found that (1) conceptual feature representations are strongly modulated by the task, (2) conceptual representations in several multimodal regions are cross-modal, and (3) conceptual feature retrieval involves the default, frontoparietal control, and dorsal attention networks. Conceptual representations in these large-scale networks are task-dependent and cross-modal. Our findings support theories that assume conceptual processing to rely on a flexible, multi-level architecture.
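The logic of a cross-modal (here, cross-task) multivariate pattern analysis can be sketched with a simple classifier: train on activity patterns from one task and test on patterns from another; above-chance transfer implies a task-general representation. Everything below (the nearest-centroid decoder, the simulated patterns) is an illustrative stand-in, not the study's pipeline.

```python
import numpy as np

def nearest_centroid_fit(X, y):
    """Store one mean pattern (centroid) per class."""
    classes = np.unique(y)
    return classes, np.stack([X[y == c].mean(axis=0) for c in classes])

def nearest_centroid_predict(X, classes, centroids):
    """Assign each pattern to its closest class centroid."""
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[d.argmin(axis=1)]

# Two conceptual features share a neural code across tasks; each task adds
# its own noise (sizes and noise level are illustrative).
rng = np.random.default_rng(3)
proto = rng.standard_normal((2, 40))          # one pattern per feature

def make_task(n, noise):
    y = rng.integers(0, 2, n)
    return proto[y] + noise * rng.standard_normal((n, 40)), y

X_sound, y_sound = make_task(60, 1.0)         # "sound judgment" patterns
X_action, y_action = make_task(60, 1.0)       # "action judgment" patterns

classes, cents = nearest_centroid_fit(X_sound, y_sound)
acc = (nearest_centroid_predict(X_action, classes, cents) == y_action).mean()
print(acc)  # cross-task accuracy well above chance (0.5) if the code is shared
```

If the representation were purely task-specific, the centroids learned from one task would generalize to the other no better than chance.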
Affiliation(s)
- Philipp Kuhnke: Wilhelm Wundt Institute for Psychology, Leipzig University, Germany; Lise Meitner Research Group Cognition and Plasticity, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Gesa Hartwigsen: Wilhelm Wundt Institute for Psychology, Leipzig University, Germany; Lise Meitner Research Group Cognition and Plasticity, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
11. Kragel PA, Treadway MT, Admon R, Pizzagalli DA, Hahn EC. A mesocorticolimbic signature of pleasure in the human brain. Nat Hum Behav 2023; 7:1332-1343. [PMID: 37386105; DOI: 10.1038/s41562-023-01639-0]
Abstract
Pleasure is a fundamental driver of human behaviour, yet its neural basis remains largely unknown. Rodent studies highlight opioidergic neural circuits connecting the nucleus accumbens, ventral pallidum, insula and orbitofrontal cortex as critical for the initiation and regulation of pleasure, and human neuroimaging studies exhibit some translational parity. However, whether activation in these regions conveys a generalizable representation of pleasure regulated by opioidergic mechanisms remains unclear. Here we use pattern recognition techniques to develop a human functional magnetic resonance imaging signature of mesocorticolimbic activity unique to states of pleasure. In independent validation tests, this signature is sensitive to pleasant tastes and affect evoked by humour. The signature is spatially co-extensive with mu-opioid receptor gene expression, and its response is attenuated by the opioid antagonist naloxone. These findings provide evidence for a basis of pleasure in humans that is distributed across brain systems.
Affiliation(s)
- Philip A Kragel: Department of Psychology, Emory University, Atlanta, GA, USA; Department of Psychiatry and Behavioral Sciences, Emory University, Atlanta, GA, USA
- Michael T Treadway: Department of Psychology, Emory University, Atlanta, GA, USA; Department of Psychiatry and Behavioral Sciences, Emory University, Atlanta, GA, USA
- Roee Admon: Department of Psychiatry, Harvard Medical School and McLean Hospital, Belmont, MA, USA; School of Psychological Sciences, University of Haifa, Haifa, Israel
- Diego A Pizzagalli: Department of Psychiatry, Harvard Medical School and McLean Hospital, Belmont, MA, USA
- Evan C Hahn: Department of Psychology, Emory University, Atlanta, GA, USA
12. Schwartz E, Alreja A, Richardson RM, Ghuman A, Anzellotti S. Intracranial Electroencephalography and Deep Neural Networks Reveal Shared Substrates for Representations of Face Identity and Expressions. J Neurosci 2023; 43:4291-4303. [PMID: 37142430; PMCID: PMC10255163; DOI: 10.1523/jneurosci.1277-22.2023]
Abstract
According to a classical view of face perception (Bruce and Young, 1986; Haxby et al., 2000), face identity and facial expression recognition are performed by separate neural substrates (ventral and lateral temporal face-selective regions, respectively). However, recent studies challenge this view, showing that expression valence can also be decoded from ventral regions (Skerry and Saxe, 2014; Li et al., 2019), and identity from lateral regions (Anzellotti and Caramazza, 2017). These findings could be reconciled with the classical view if regions specialized for one task (either identity or expression) contain a small amount of information for the other task (enough to enable above-chance decoding). In this case, we would expect representations in lateral regions to be more similar to representations in deep convolutional neural networks (DCNNs) trained to recognize facial expression than to representations in DCNNs trained to recognize face identity (the converse should hold for ventral regions). We tested this hypothesis by analyzing neural responses to faces varying in identity and expression. Representational dissimilarity matrices (RDMs) computed from human intracranial recordings (n = 11 adults; 7 females) were compared with RDMs from DCNNs trained to label either identity or expression. We found that RDMs from DCNNs trained to recognize identity correlated with intracranial recordings more strongly in all regions tested, even in regions classically hypothesized to be specialized for expression. These results deviate from the classical view, suggesting that face-selective ventral and lateral regions contribute to the representation of both identity and expression.

SIGNIFICANCE STATEMENT Previous work proposed that separate brain regions are specialized for the recognition of face identity and facial expression. However, identity and expression recognition mechanisms might share common brain regions instead.
We tested these alternatives using deep neural networks and intracranial recordings from face-selective brain regions. Deep neural networks trained to recognize identity and networks trained to recognize expression learned representations that correlate with neural recordings. Identity-trained representations correlated with intracranial recordings more strongly in all regions tested, including regions hypothesized to be expression specialized in the classical hypothesis. These findings support the view that identity and expression recognition rely on common brain regions. This discovery may require reevaluation of the roles that the ventral and lateral neural pathways play in processing socially relevant stimuli.
Affiliation(s)
- Emily Schwartz: Department of Psychology and Neuroscience, Boston College, Chestnut Hill, Massachusetts 02467
- Arish Alreja: Center for the Neural Basis of Cognition, Carnegie Mellon University/University of Pittsburgh, Pittsburgh, Pennsylvania 15213; Neuroscience Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania 15213; Machine Learning Department, Carnegie Mellon University, Pittsburgh, Pennsylvania 15213; Department of Neurological Surgery, University of Pittsburgh Medical Center Presbyterian, Pittsburgh, Pennsylvania 15213
- R Mark Richardson: Department of Neurosurgery, Massachusetts General Hospital, Boston, Massachusetts 02114; Harvard Medical School, Boston, Massachusetts 02115
- Avniel Ghuman: Center for the Neural Basis of Cognition, Carnegie Mellon University/University of Pittsburgh, Pittsburgh, Pennsylvania 15213; Department of Neurological Surgery, University of Pittsburgh Medical Center Presbyterian, Pittsburgh, Pennsylvania 15213; Center for Neuroscience, University of Pittsburgh, Pittsburgh, Pennsylvania 15260
- Stefano Anzellotti: Department of Psychology and Neuroscience, Boston College, Chestnut Hill, Massachusetts 02467
Collapse
|
13
|
Schwartz E, O’Nell K, Saxe R, Anzellotti S. Challenging the Classical View: Recognition of Identity and Expression as Integrated Processes. Brain Sci 2023; 13:296. [PMID: 36831839 PMCID: PMC9954353 DOI: 10.3390/brainsci13020296] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/21/2022] [Revised: 02/01/2023] [Accepted: 02/02/2023] [Indexed: 02/12/2023] Open
Abstract
Recent neuroimaging evidence challenges the classical view that face identity and facial expression are processed by segregated neural pathways, showing that information about identity and expression are encoded within common brain regions. This article tests the hypothesis that integrated representations of identity and expression arise spontaneously within deep neural networks. A subset of the CelebA dataset is used to train a deep convolutional neural network (DCNN) to label face identity (chance = 0.06%, accuracy = 26.5%), and the FER2013 dataset is used to train a DCNN to label facial expression (chance = 14.2%, accuracy = 63.5%). The identity-trained and expression-trained networks each successfully transfer to labeling both face identity and facial expression on the Karolinska Directed Emotional Faces dataset. This study demonstrates that DCNNs trained to recognize face identity and DCNNs trained to recognize facial expression spontaneously develop representations of facial expression and face identity, respectively. Furthermore, a congruence coefficient analysis reveals that features distinguishing between identities and features distinguishing between expressions become increasingly orthogonal from layer to layer, suggesting that deep neural networks disentangle representational subspaces corresponding to different sources.
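The congruence coefficient analysis mentioned at the end is, at bottom, an uncentered cosine similarity between feature-weight vectors. A minimal sketch of the idea, with made-up axes rather than the paper's trained network weights: a "shallow layer" axis that still mixes identity information shows high congruence with the identity axis, while a "deep layer" axis built to be independent shows congruence near zero, i.e. the subspaces are more orthogonal.

```python
import numpy as np

def congruence(a, b):
    # Tucker's congruence coefficient: uncentered cosine similarity in [-1, 1].
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(1)
d = 256  # hypothetical layer width
identity_axis = rng.standard_normal(d)
# Shallow-layer expression axis: partly aligned with the identity axis.
expr_shallow = 0.8 * identity_axis + 0.6 * rng.standard_normal(d)
# Deep-layer expression axis: drawn independently, hence near-orthogonal.
expr_deep = rng.standard_normal(d)

print(abs(congruence(identity_axis, expr_shallow)))  # substantial overlap
print(abs(congruence(identity_axis, expr_deep)))     # near zero
```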
Affiliation(s)
- Emily Schwartz
- Department of Psychology and Neuroscience, Boston College, Boston, MA 02467, USA
- Kathryn O’Nell
- Department of Psychological and Brain Sciences, Dartmouth College, Hanover, NH 03755, USA
- Rebecca Saxe
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
- Stefano Anzellotti
- Department of Psychology and Neuroscience, Boston College, Boston, MA 02467, USA

14
Chiou R, Cox CR, Lambon Ralph MA. Bipartite functional fractionation within the neural system for social cognition supports the psychological continuity of self versus other. Cereb Cortex 2023; 33:1277-1299. [PMID: 35394005 PMCID: PMC9930627 DOI: 10.1093/cercor/bhac135] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/14/2021] [Revised: 02/10/2022] [Accepted: 03/14/2022] [Indexed: 11/12/2022] Open
Abstract
Research in social neuroscience establishes that regions in the brain's default-mode network (DN) and semantic network (SN) are engaged by socio-cognitive tasks. Research on the human connectome shows that DN and SN regions are both situated at the transmodal end of a cortical gradient but differ in their loci along this gradient. Here we integrated these 2 bodies of research, used the psychological continuity of self versus other as a "test-case," and used functional magnetic resonance imaging to investigate whether these 2 networks would encode social concepts differently. We found a robust dissociation between the DN and SN: while both networks contained sufficient information for decoding broad-stroke distinctions between social categories, the DN carried more generalizable information for cross-classifying across social distance and emotive valence than did the SN. We also found that the overarching distinction of self versus other was a principal divider of the representational space while social distance was an auxiliary factor (a subdivision nested within the principal dimension), and this representational landscape was more clearly manifested in the DN than in the SN. Taken together, our findings demonstrate how insights from connectome research can benefit social neuroscience and have implications for clarifying the 2 networks' differential contributions to social cognition.
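Cross-classification of the kind tested here trains a decoder in one condition and evaluates it in another; above-chance transfer indicates that the representation generalizes. A toy sketch with simulated patterns (all sizes and signal structure are invented), where a self/other signal is built to survive a change of social-distance context, roughly the property the authors probe in the DN:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n, v = 200, 40  # hypothetical trials x voxels
self_other = rng.integers(0, 2, n)   # 0 = self, 1 = other (decoded label)
distance = rng.integers(0, 2, n)     # 0 = close, 1 = distant (context)

# Simulated patterns: the self/other signal is identical in both contexts.
signal = rng.standard_normal(v)
X = 2.0 * np.outer(self_other - 0.5, signal) + rng.standard_normal((n, v))

# Cross-classification: train on "close" trials, test on "distant" trials.
train, test = distance == 0, distance == 1
clf = LogisticRegression(max_iter=1000).fit(X[train], self_other[train])
acc = clf.score(X[test], self_other[test])
print(acc)  # above the 0.5 chance level when the information generalizes
```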
Affiliation(s)
- Christopher R Cox
- Department of Psychology, Louisiana State University, Baton Rouge, LA 70803, United States
- Matthew A Lambon Ralph
- MRC Cognition & Brain Science Unit, University of Cambridge, Cambridge, CB2 7EF, United Kingdom

15
Tanaka T, Okamoto N, Kida I, Haruno M. The initial decrease in 7T-BOLD signals detected by hyperalignment contains information to decode facial expressions. Neuroimage 2022; 262:119537. [DOI: 10.1016/j.neuroimage.2022.119537] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/11/2022] [Revised: 07/11/2022] [Accepted: 08/02/2022] [Indexed: 10/31/2022] Open
16
Fang M, Poskanzer C, Anzellotti S. PyMVPD: A Toolbox for Multivariate Pattern Dependence. Front Neuroinform 2022; 16:835772. [PMID: 35811995 PMCID: PMC9262406 DOI: 10.3389/fninf.2022.835772] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/14/2021] [Accepted: 05/17/2022] [Indexed: 11/13/2022] Open
Abstract
Cognitive tasks engage multiple brain regions. Studying how these regions interact is key to understanding the neural bases of cognition. Standard approaches to modeling the interactions between brain regions rely on univariate statistical dependence. However, newly developed methods can capture multivariate dependence. Multivariate pattern dependence (MVPD) is a powerful and flexible approach that trains and tests multivariate models of the interactions between brain regions using independent data. In this article, we introduce PyMVPD: an open source toolbox for multivariate pattern dependence. The toolbox includes linear regression models and artificial neural network models of the interactions between regions. It is designed to be easily customizable. We demonstrate example applications of PyMVPD using well-studied seed regions such as the fusiform face area (FFA) and the parahippocampal place area (PPA). Next, we compare the performance of different model architectures. Overall, artificial neural networks outperform linear regression. Importantly, the best performing architecture is region-dependent: MVPD subdivides cortex into distinct, contiguous regions whose interactions with FFA and PPA are best captured by different models.
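MVPD's core move (fit a multivariate mapping from one region's patterns to another's on one run, then score it on independent data) fits in a few lines. This is a generic sketch with simulated timecourses, not the PyMVPD API; region sizes and the linear ground-truth coupling are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
t, seed_dim, target_dim = 300, 30, 25  # hypothetical timepoints x voxels

# Ground truth for the simulation: the target region is a fixed
# linear function of the seed region, plus noise.
W = 0.3 * rng.standard_normal((seed_dim, target_dim))

def make_run():
    seed = rng.standard_normal((t, seed_dim))
    target = seed @ W + 0.5 * rng.standard_normal((t, target_dim))
    return seed, target

seed_train, target_train = make_run()  # run 1: fit the mapping
seed_test, target_test = make_run()    # run 2: independent evaluation

model = LinearRegression().fit(seed_train, target_train)
r2 = model.score(seed_test, target_test)  # held-out variance explained
print(r2)  # > 0 indicates multivariate dependence between the regions
```

The neural-network variants the toolbox offers replace `LinearRegression` with a nonlinear regressor; the train-on-independent-data logic stays the same.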
17
Vaccaro AG, Heydari P, Christov-Moore L, Damasio A, Kaplan JT. Perspective-taking is associated with increased discriminability of affective states in the ventromedial prefrontal cortex. Soc Cogn Affect Neurosci 2022; 17:1082-1090. [PMID: 35579186 PMCID: PMC9714424 DOI: 10.1093/scan/nsac035] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/04/2021] [Revised: 04/05/2022] [Accepted: 05/16/2022] [Indexed: 01/12/2023] Open
Abstract
Recent work using multivariate-pattern analysis (MVPA) on functional magnetic resonance imaging (fMRI) data has found that distinct affective states produce correspondingly distinct patterns of neural activity in the cerebral cortex. However, it is unclear whether individual differences in the distinctiveness of neural patterns evoked by affective stimuli underlie empathic abilities such as perspective-taking (PT). Accordingly, we examined whether we could predict PT tendency from the classification of blood-oxygen-level-dependent (BOLD) fMRI activation patterns while participants (n = 57) imagined themselves in affectively charged scenarios. We used an MVPA searchlight analysis to map where in the brain activity patterns permitted the classification of four affective states: happiness, sadness, fear and disgust. Classification accuracy was significantly above chance levels in most of the prefrontal cortex and in the posterior medial cortices. Furthermore, participants' self-reported PT was positively associated with classification accuracy in the ventromedial prefrontal cortex and insula. This finding has implications for understanding affective processing in the prefrontal cortex and for interpreting the cognitive significance of classifiable affective brain states. Our multivariate approach suggests that PT ability may rely on the grain of internally simulated affective representations rather than simply on their global strength.
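The statistic computed at each searchlight location in an analysis like this is simply a cross-validated multi-class decoding accuracy, which is then mapped across the brain (and, here, correlated with trait PT across participants). A toy version for a single searchlight sphere, with simulated voxel patterns; the state names are from the abstract, but all sizes and signal strengths are invented:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
states = ["happiness", "sadness", "fear", "disgust"]
n_per, v = 40, 30  # hypothetical trials per state x voxels in one sphere

# Each affective state gets its own mean pattern within the sphere.
means = rng.standard_normal((len(states), v))
X = np.vstack([means[k] + rng.standard_normal((n_per, v))
               for k in range(len(states))])
y = np.repeat(np.arange(len(states)), n_per)

# One searchlight's statistic: cross-validated 4-way decoding accuracy.
acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
print(acc)  # compared against the 0.25 chance level for four states
```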
Affiliation(s)
- Anthony G Vaccaro
- Brain and Creativity Institute, Department of Psychology, University of Southern California, Los Angeles, CA 90089-0001, USA
- Panthea Heydari
- Brain and Creativity Institute, Department of Psychology, University of Southern California, Los Angeles, CA 90089-0001, USA
- Leonardo Christov-Moore
- Brain and Creativity Institute, Department of Psychology, University of Southern California, Los Angeles, CA 90089-0001, USA
- Antonio Damasio
- Brain and Creativity Institute, Department of Psychology, University of Southern California, Los Angeles, CA 90089-0001, USA
- Jonas T Kaplan
- Correspondence should be addressed to Jonas T. Kaplan, Brain and Creativity Institute, 3620A McClintock Ave, Los Angeles, CA 90089, USA.

18
Izumika R, Cabeza R, Tsukiura T. Neural Mechanisms of Perceiving and Subsequently Recollecting Emotional Facial Expressions in Young and Older Adults. J Cogn Neurosci 2022; 34:1183-1204. [PMID: 35468212 DOI: 10.1162/jocn_a_01851] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
It is known that emotional facial expressions modulate the perception and subsequent recollection of faces and that aging alters these modulatory effects. Yet, the underlying neural mechanisms are not well understood, and they were the focus of the current fMRI study. We scanned healthy young and older adults while they perceived happy, neutral, or angry faces paired with names. Participants were then provided with the names of the faces and asked to recall the facial expression of each face. fMRI analyses focused on the fusiform face area (FFA), the posterior superior temporal sulcus (pSTS), the orbitofrontal cortex (OFC), the amygdala, and the hippocampus (HC). Univariate activity, multivariate pattern analysis (MVPA), and functional connectivity analyses were performed. The study yielded two main sets of findings. First, in pSTS and the amygdala, univariate activity and MVPA discrimination during the processing of facial expressions were similar in young and older adults, whereas in FFA and OFC, MVPA discriminated facial expressions less accurately in older than young adults. These findings suggest that facial expression representations in FFA and OFC reflect age-related dedifferentiation and the positivity effect. Second, HC-OFC connectivity showed subsequent memory effects (SMEs) for happy expressions in both age groups, HC-FFA connectivity exhibited SMEs for happy and neutral expressions in young adults, and HC-pSTS interactions displayed SMEs for happy expressions in older adults. These results could be related to compensatory mechanisms and positivity effects in older adults. Taken together, the results clarify the effects of aging on the neural mechanisms of perceiving and encoding facial expressions.
19
Takatsuru Y, Motegi S, Nishikata T, Sato H, Yonemochi K. Frontal medial cortex and angular gyrus functional connectivity is related to sex and age differences in odor sensitivity. J Neuroimaging 2022; 32:611-616. [PMID: 35355361 DOI: 10.1111/jon.12994] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/17/2021] [Revised: 02/23/2022] [Accepted: 03/13/2022] [Indexed: 11/30/2022] Open
Abstract
BACKGROUND AND PURPOSE Odor preference is one of the key factors for the rehabilitation of the swallowing function. On the other hand, sensitivity to odor differs between sexes and decreases with age. These factors rely on brain neuronal circuits. However, it remains unclear which neuronal circuits determine the sex and age differences in odor sensitivity. In this study, we carried out both an odor sensitivity test and functional MRI (fMRI) to find the key neuronal circuits determining sex and age differences in odor sensitivity. METHODS Healthy volunteers (28 males, aged 27-62 years, and 30 females, aged 21-59 years) participated in this study. Some of them (seven males and seven females) underwent fMRI. We prepared five odorous test substances and presented each substance at 1-minute intervals. After 5 minutes of questioning about food intake, the subjects were asked to recall each of the test substances presented from the list. In the fMRI study, all the subjects underwent 15 minutes of prestimulation, stimulation with peppermint odor, and poststimulation sessions. RESULTS The odor test score was significantly higher in females than in males and showed an age-dependent decrease. We found four functional connectivities whose strengths differed significantly between males and females. One of them, the functional connectivity between the frontal medial cortex (MedFC) and the left angular gyrus (AG.l), showed an age-dependent change. CONCLUSIONS The functional MedFC-AG.l connectivity is one of the important neuronal circuits affecting sex- and age-dependent odor sensitivity.
Affiliation(s)
- Yusuke Takatsuru
- Division of Multidimensional Clinical Medicine, Department of Nutrition and Health Sciences, Toyo University, Itakura, Japan
- Department of Medicine, Johmoh Hospital, Maebashi, Japan
- Shunichi Motegi
- Department of Radiological Sciences, International University of Health and Welfare, Otawara, Japan
- Hideyasu Sato
- Department of Food Life Sciences, Toyo University, Itakura, Japan
- Keita Yonemochi
- Department of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan

20
Lee KM, Lee S, Satpute AB. Sinful pleasures and pious woes? Using fMRI to examine evaluative and hedonic emotion knowledge. Soc Cogn Affect Neurosci 2022; 17:986-994. [PMID: 35348768 PMCID: PMC9629474 DOI: 10.1093/scan/nsac024] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2021] [Revised: 02/24/2022] [Accepted: 03/22/2022] [Indexed: 01/12/2023] Open
Abstract
Traditionally, lust and pride have been considered pleasurable, yet sinful in the West. Conversely, guilt is often considered aversive, yet valuable. These emotions illustrate how evaluations about specific emotions and beliefs about their hedonic properties may often diverge. Evaluations about specific emotions may shape important aspects of emotional life (e.g. in emotion regulation, emotion experience and acquisition of emotion concepts). Yet these evaluations are often understudied in affective neuroscience. Prior work in emotion regulation, affective experience, evaluation/attitudes and decision-making point to anterior prefrontal areas as candidates for supporting evaluative emotion knowledge. Thus, we examined the brain areas associated with evaluative and hedonic emotion knowledge, with a focus on the anterior prefrontal cortex. Participants (N = 25) made evaluative and hedonic ratings about emotion knowledge during functional magnetic resonance imaging (fMRI). We found that greater activity in the medial prefrontal cortex (mPFC), ventromedial PFC (vmPFC) and precuneus was associated with an evaluative (vs hedonic) focus on emotion knowledge. Our results suggest that the mPFC and vmPFC, in particular, may play a role in evaluating discrete emotions.
Affiliation(s)
- Kent M Lee
- Correspondence should be addressed to Kent M. Lee, Department of Psychology, Northeastern University, 125 Nightingale Hall, Boston, MA, USA.
- SuhJin Lee
- Department of Neurobiology, University of Pittsburgh School of Medicine, Pittsburgh, PA 15213, USA
- Ajay B Satpute
- Department of Psychology, Northeastern University, Boston, MA 02115, USA

21
Parelman JM, Doré BP, Cooper N, O’Donnell MB, Chan HY, Falk EB. Overlapping Functional Representations of Self- and Other-Related Thought are Separable Through Multivoxel Pattern Classification. Cereb Cortex 2022; 32:1131-1141. [PMID: 34398230 PMCID: PMC8924429 DOI: 10.1093/cercor/bhab272] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/17/2021] [Revised: 07/14/2021] [Accepted: 07/15/2021] [Indexed: 12/13/2022] Open
Abstract
Self-reflection and thinking about the thoughts and behaviors of others are important skills for humans to function in the social world. These two processes overlap in terms of the component processes involved, and share overlapping functional organizations within the human brain, in particular within the medial prefrontal cortex (MPFC). Several functional models have been proposed to explain these two processes, but none has directly explored the extent to which they are distinctly represented within different parts of the brain. This study used multivoxel pattern classification to quantify the separability of self- and other-related thought in the MPFC and expanded this question to the entire brain. Using a large-scale mega-analytic dataset, spanning three separate studies (n = 142), we find that self- and other-related thought can be reliably distinguished above chance within the MPFC, posterior cingulate cortex and temporal lobes. We highlight subcomponents of the ventral MPFC that are particularly important in representing self-related thought, and subcomponents of the orbitofrontal cortex robustly involved in representing other-related thought. Our findings indicate that representations of self- and other-related thought in the human brain are described best by a distributed pattern rather than stark localization or a purely ventral to dorsal linear gradient in the MPFC.
Affiliation(s)
- Jacob M Parelman
- Annenberg School for Communication, University of Pennsylvania, Philadelphia, PA 19104, USA
- Bruce P Doré
- Desautels Faculty of Management, McGill University, Montreal, H3A 1G5, Canada
- Nicole Cooper
- Annenberg School for Communication, University of Pennsylvania, Philadelphia, PA 19104, USA
- Hang-Yee Chan
- Amsterdam School of Communication Research, University of Amsterdam, Nieuwe Achtergracht 166, 1018 WV Amsterdam, The Netherlands
- Emily B Falk
- Annenberg School for Communication, University of Pennsylvania, Philadelphia, PA 19104, USA

22
Foster C. A Distributed Model of Face and Body Integration. Neurosci Insights 2022; 17:26331055221119221. [PMID: 35991808 PMCID: PMC9386443 DOI: 10.1177/26331055221119221] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/14/2022] [Accepted: 07/26/2022] [Indexed: 11/17/2022] Open
Abstract
Separate face- and body-responsive brain networks have been identified that show strong responses when observers view faces and bodies. It has been proposed that face and body processing may be initially separated in the lateral occipitotemporal cortex and then combined into a whole-person representation in the anterior temporal cortex, or elsewhere in the brain. However, in contrast to this proposal, our recent study identified a common coding of face and body orientation (ie, facing direction) in the lateral occipitotemporal cortex, demonstrating an integration of face and body information at an early stage of face and body processing. These results, in combination with findings that show integration of face and body identity in the lateral occipitotemporal, parahippocampal and superior parietal cortex, and of face and body emotional expression in the posterior superior temporal sulcus and medial prefrontal cortex, suggest that face and body integration may be more distributed than previously considered. I propose a new model of face and body integration, where areas at the intersection of face- and body-responsive regions play a role in integrating specific properties of faces and bodies, and distributed regions across the brain contribute to high-level, abstract integration of shared face and body properties.
Affiliation(s)
- Celia Foster
- Biopsychology and Cognitive Neuroscience, Faculty of Psychology and Sports Science, Bielefeld University, Bielefeld, Germany
- Center of Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany

23
Lee KM, Ferreira-Santos F, Satpute AB. Predictive processing models and affective neuroscience. Neurosci Biobehav Rev 2021; 131:211-228. [PMID: 34517035 PMCID: PMC9074371 DOI: 10.1016/j.neubiorev.2021.09.009] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/30/2019] [Revised: 02/10/2021] [Accepted: 09/07/2021] [Indexed: 01/17/2023]
Abstract
The neural bases of affective experience remain elusive. Early neuroscience models of affect searched for specific brain regions that uniquely carried out the computations that underlie dimensions of valence and arousal. However, a growing body of work has failed to identify these circuits. Research turned to multivariate analyses, but these strategies, too, have made limited progress. Predictive processing models offer exciting new directions to address this problem. Here, we use predictive processing models as a lens to critique prevailing functional neuroimaging research practices in affective neuroscience. Our review highlights how much work relies on rigid assumptions that are inconsistent with a predictive processing approach. We outline the central aspects of a predictive processing model and draw out their implications for research in affective and cognitive neuroscience. Predictive models motivate a reformulation of "reverse inference" in cognitive neuroscience and a greater emphasis on external validity in experimental design.
Affiliation(s)
- Kent M Lee
- Northeastern University, 360 Huntington Ave, 125 NI, Boston, MA 02118, USA
- Fernando Ferreira-Santos
- Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, University of Porto, Portugal
- Ajay B Satpute
- Northeastern University, 360 Huntington Ave, 125 NI, Boston, MA 02118, USA

24
Murray T, O'Brien J, Sagiv N, Garrido L. The role of stimulus-based cues and conceptual information in processing facial expressions of emotion. Cortex 2021; 144:109-132. [PMID: 34666297 DOI: 10.1016/j.cortex.2021.08.007] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/27/2020] [Revised: 07/16/2021] [Accepted: 08/09/2021] [Indexed: 01/07/2023]
Abstract
Face shape and surface textures are two important cues that aid in the perception of facial expressions of emotion. Additionally, this perception is also influenced by high-level emotion concepts. Across two studies, we use representational similarity analysis to investigate the relative roles of shape, surface, and conceptual information in the perception, categorisation, and neural representation of facial expressions. In Study 1, 50 participants completed a perceptual task designed to measure the perceptual similarity of expression pairs, and a categorical task designed to measure the confusability between expression pairs when assigning emotion labels to a face. We used representational similarity analysis and constructed three models of the similarities between emotions using distinct information. Two models were based on stimulus-based cues (face shapes and surface textures) and one model was based on emotion concepts. Using multiple linear regression, we found that behaviour during both tasks was related to the similarity of emotion concepts. The model based on face shapes was more strongly related to behaviour in the perceptual task than in the categorical task, and the model based on surface textures was more strongly related to behaviour in the categorical task than in the perceptual task. In Study 2, 30 participants viewed facial expressions while undergoing fMRI, allowing for the measurement of brain representational geometries of facial expressions of emotion in three core face-responsive regions (the Fusiform Face Area, Occipital Face Area, and Superior Temporal Sulcus), and a region involved in theory of mind (Medial Prefrontal Cortex). Across all four regions, the representational distances between facial expression pairs were related to the similarities of emotion concepts, but not to either of the stimulus-based cues.
Together, these results highlight the important top-down influence of high-level emotion concepts both in behavioural tasks and in the neural representation of facial expressions.
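The analysis described above reduces to regressing a behavioural (or neural) RDM onto candidate model RDMs and comparing the fitted weights. A compact sketch with fabricated dissimilarities; the three model names mirror the study's models (face shape, surface texture, emotion concepts), but all values are simulated, with the "behavioural" RDM built to be driven mostly by the conceptual model:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
n_pairs = 15  # e.g. all pairs of 6 expressions, in condensed RDM form

# Hypothetical model RDMs (pairwise dissimilarities under each model).
shape = rng.random(n_pairs)
surface = rng.random(n_pairs)
concepts = rng.random(n_pairs)

# Simulated behavioural RDM, dominated by conceptual similarity.
behaviour = 1.0 * concepts + 0.2 * shape + 0.1 * rng.random(n_pairs)

# Multiple linear regression of the behavioural RDM on the model RDMs.
models = np.column_stack([shape, surface, concepts])
fit = LinearRegression().fit(models, behaviour)
print(fit.coef_)  # the conceptual model carries the largest weight
```

In practice the same regression is run on each region's neural RDM, which is how the study compares stimulus-based and conceptual contributions across FFA, OFA, STS, and MPFC.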
Affiliation(s)
- Thomas Murray
- Psychology Department, School of Biological and Behavioural Sciences, Queen Mary University London, United Kingdom
- Justin O'Brien
- Centre for Cognitive Neuroscience, Department of Life Sciences, Brunel University London, United Kingdom
- Noam Sagiv
- Centre for Cognitive Neuroscience, Department of Life Sciences, Brunel University London, United Kingdom
- Lúcia Garrido
- Department of Psychology, City, University of London, United Kingdom

25
Weaverdyck ME, Thornton MA, Tamir DI. The representational structure of mental states generalizes across target people and stimulus modalities. Neuroimage 2021; 238:118258. [PMID: 34118394 PMCID: PMC8327621 DOI: 10.1016/j.neuroimage.2021.118258] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/12/2020] [Revised: 06/06/2021] [Accepted: 06/07/2021] [Indexed: 10/26/2022] Open
Abstract
Each individual experiences mental states in their own idiosyncratic way, yet perceivers can accurately understand a huge variety of states across unique individuals. How do they accomplish this feat? Do people think about their own anger in the same ways as another person's anger? Is reading about someone's anxiety the same as seeing it? Here, we test the hypothesis that a common conceptual core unites mental state representations across contexts. Across three studies, participants judged the mental states of multiple targets, including a generic other, the self, a socially close other, and a socially distant other. Participants viewed mental state stimuli in multiple modalities, including written scenarios and images. Using representational similarity analysis, we found that brain regions associated with social cognition expressed stable neural representations of mental states across both targets and modalities. Together, these results suggest that people use stable models of mental states across different people and contexts.
Affiliation(s)
- Miriam E Weaverdyck
- Department of Psychology, Princeton University, Princeton, NJ 08544, United States
- Mark A Thornton
- Department of Psychology, Princeton University, Princeton, NJ 08544, United States
- Diana I Tamir
- Department of Psychology, Princeton University, Princeton, NJ 08544, United States
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ 08544, United States

26
Qiu S, Mei G. Spontaneous recovery of adaptation aftereffects of natural facial categories. Vision Res 2021; 188:202-210. [PMID: 34365177 DOI: 10.1016/j.visres.2021.07.015] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2021] [Revised: 07/07/2021] [Accepted: 07/23/2021] [Indexed: 10/20/2022]
Abstract
Adaptation to a natural face attribute such as a happy face can bias the perception of a subsequent face in this dimension such as a neutral face. Such face adaptation aftereffects have been widely found in many natural facial categories. However, how temporally tuned mechanisms could control the temporal dynamics of natural face adaptation aftereffects remains unknown. To address the question, we used a deadaptation paradigm to examine whether the spontaneous recovery of natural facial aftereffects would emerge in four natural facial categories including variable categories (emotional expressions in Experiment 1 and eye gaze in Experiment 2) and invariable categories (facial gender in Experiment 3 and facial identity in Experiment 4). In the deadaptation paradigm, participants adapted to a face with an extreme attribute (such as a 100% angry face in Experiment 1) for a relatively long duration, and then deadapted to a face with an opposite extreme attribute (such as a 100% happy face in Experiment 1) for a relatively short duration. The time courses of face adaptation aftereffects were measured using a top-up manner. Deadaptation only masked the effects of initial longer-lasting adaptation, and the spontaneous recovery of adaptation aftereffects was observed at the post-test stage for all four natural facial categories. These results likely indicate that the temporal dynamics of adaptation aftereffects of natural facial categories may be controlled by multiple temporally tuned mechanisms.
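The "multiple temporally tuned mechanisms" interpretation can be made concrete with a standard two-timescale adaptation model: a fast and a slow leaky integrator that both drift toward the adaptor. This is a generic illustration with arbitrary time constants, not the authors' fitted model. Brief deadaptation nulls the measured aftereffect by driving the fast mechanism opposite to the slow one; once stimulation stops, the fast state decays first and the slow aftereffect spontaneously re-emerges.

```python
import numpy as np

TAU_FAST, TAU_SLOW = 5.0, 60.0  # illustrative time constants (s)

def simulate(adapt_s=120, deadapt_s=8, post_s=60, dt=1.0):
    # Each mechanism drifts toward the adaptor during stimulation
    # and decays back to baseline afterwards (leaky integration).
    fast = slow = 0.0
    aftereffect = []
    for step in range(int((adapt_s + deadapt_s + post_s) / dt)):
        t = step * dt
        if t < adapt_s:
            drive = 1.0       # adapt (e.g. a 100% angry face)
        elif t < adapt_s + deadapt_s:
            drive = -1.0      # deadapt (e.g. a 100% happy face)
        else:
            drive = 0.0       # post-test
        fast += dt * (drive - fast) / TAU_FAST
        slow += dt * (drive - slow) / TAU_SLOW
        aftereffect.append(fast + slow)
    return np.array(aftereffect)

a = simulate()
nulled = a[127]     # end of deadaptation: measured aftereffect near zero
recovered = a[157]  # 30 s later: the slow aftereffect has re-emerged
print(nulled, recovered)
```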
Affiliation(s)
- Shiming Qiu
- School of Psychology, Guizhou Normal University, Guiyang, PR China
- Gaoxing Mei
- School of Psychology, Guizhou Normal University, Guiyang, PR China

27
Richardson H, Taylor J, Kane-Grade F, Powell L, Bosquet Enlow M, Nelson C. Preferential responses to faces in superior temporal and medial prefrontal cortex in three-year-old children. Dev Cogn Neurosci 2021; 50:100984. [PMID: 34246062 PMCID: PMC8274289 DOI: 10.1016/j.dcn.2021.100984] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/06/2021] [Revised: 06/04/2021] [Accepted: 06/29/2021] [Indexed: 10/25/2022] Open
Abstract
Perceiving faces and understanding emotions are key components of human social cognition. Prior research with adults and infants suggests that these social cognitive functions are supported by superior temporal cortex (STC) and medial prefrontal cortex (MPFC). We used functional near-infrared spectroscopy (fNIRS) to characterize functional responses in these cortical regions to faces in early childhood. Three-year-old children (n = 88, M(SD) = 3.15(.16) years) passively viewed faces that varied in emotional content and valence (happy, angry, fearful, neutral) and, for fearful and angry faces, intensity (100%, 40%), while undergoing fNIRS. Bilateral STC and MPFC showed greater oxygenated hemoglobin concentration values to all faces relative to objects. MPFC additionally responded preferentially to happy faces relative to neutral faces. We did not detect preferential responses to angry or fearful faces, or overall differences in response magnitude by emotional valence (100% happy vs. fearful and angry) or intensity (100% vs. 40% fearful and angry). In exploratory analyses, preferential responses to faces in MPFC were not robustly correlated with performance on tasks of early social cognition. These results link and extend adult and infant research on functional responses to faces in STC and MPFC and contribute to the characterization of the neural correlates of early social cognition.
Affiliation(s)
- H. Richardson
- Department of Pediatrics, Boston Children’s Hospital, United States
- Department of Pediatrics, Harvard Medical School, United States
- School of Philosophy, Psychology and Language Sciences, University of Edinburgh, United Kingdom
- J. Taylor
- Department of Pediatrics, Boston Children’s Hospital, United States
- Department of Pediatrics, Harvard Medical School, United States
- F. Kane-Grade
- Department of Pediatrics, Boston Children’s Hospital, United States
- Department of Pediatrics, Harvard Medical School, United States
- Institute of Child Development, University of Minnesota, United States
- L. Powell
- Department of Psychology, University of California San Diego, United States
- M. Bosquet Enlow
- Department of Psychiatry, Boston Children’s Hospital, United States
- Department of Psychiatry, Harvard Medical School, United States
- C.A. Nelson
- Department of Pediatrics, Boston Children’s Hospital, United States
- Department of Pediatrics, Harvard Medical School, United States
- Graduate School of Education, Harvard University, United States

28
Saarimäki H. Naturalistic Stimuli in Affective Neuroimaging: A Review. Front Hum Neurosci 2021; 15:675068. [PMID: 34220474 PMCID: PMC8245682 DOI: 10.3389/fnhum.2021.675068] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/02/2021] [Accepted: 05/17/2021] [Indexed: 11/13/2022] Open
Abstract
Naturalistic stimuli such as movies, music, and spoken and written stories elicit strong emotions and allow brain imaging of emotions in close-to-real-life conditions. Emotions are multi-component phenomena: relevant stimuli lead to automatic changes in multiple functional components including perception, physiology, behavior, and conscious experiences. Brain activity during naturalistic stimuli reflects all these changes, suggesting that parsing emotion-related processing during such complex stimulation is not a straightforward task. Here, I review affective neuroimaging studies that have employed naturalistic stimuli to study emotional processing, focusing especially on experienced emotions. I argue that to investigate emotions with naturalistic stimuli, we need to define and extract emotion features from both the stimulus and the observer.
Affiliation(s)
- Heini Saarimäki
- Human Information Processing Laboratory, Faculty of Social Sciences, Tampere University, Tampere, Finland

29
McDonald KR, Pearson JM, Huettel SA. Dorsolateral and dorsomedial prefrontal cortex track distinct properties of dynamic social behavior. Soc Cogn Affect Neurosci 2021; 15:383-393. [PMID: 32382757 PMCID: PMC7308662 DOI: 10.1093/scan/nsaa053] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/21/2019] [Revised: 03/06/2020] [Accepted: 03/27/2020] [Indexed: 12/21/2022] Open
Abstract
Understanding how humans make competitive decisions in complex environments is a key goal of decision neuroscience. Typical experimental paradigms constrain behavioral complexity (e.g. choices in discrete-play games), and thus, the underlying neural mechanisms of dynamic social interactions remain incompletely understood. Here, we collected fMRI data while humans played a competitive real-time video game against both human and computer opponents, and then, we used Bayesian non-parametric methods to link behavior to neural mechanisms. Two key cognitive processes characterized behavior in our task: (i) the coupling of one’s actions to another’s actions (i.e. opponent sensitivity) and (ii) the advantageous timing of a given strategic action. We found that the dorsolateral prefrontal cortex displayed selective activation when the subject’s actions were highly sensitive to the opponent’s actions, whereas activation in the dorsomedial prefrontal cortex increased proportionally to the advantageous timing of actions to defeat one’s opponent. Moreover, the temporoparietal junction tracked both of these behavioral quantities as well as opponent social identity, indicating a more general role in monitoring other social agents. These results suggest that brain regions that are frequently implicated in social cognition and value-based decision-making also contribute to the strategic tracking of the value of social actions in dynamic, multi-agent contexts.
Affiliation(s)
- Kelsey R McDonald
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27710, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC 27710, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, USA
- John M Pearson
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27710, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC 27710, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, USA; Department of Biostatistics and Bioinformatics, Duke University Medical School, Durham, NC 27710, USA
- Scott A Huettel
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27710, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC 27710, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, USA

30
Arbula S, Pisanu E, Rumiati RI. Representation of social content in dorsomedial prefrontal cortex underlies individual differences in agreeableness trait. Neuroimage 2021; 235:118049. [PMID: 33848626 DOI: 10.1016/j.neuroimage.2021.118049] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/18/2020] [Revised: 04/01/2021] [Accepted: 04/02/2021] [Indexed: 01/31/2023] Open
Abstract
Personality traits reflect key aspects of individual variability in different psychological domains. Understanding the mechanisms that give rise to these differences requires an exhaustive investigation of the behaviors associated with such traits, and their underlying neural sources. Here we investigated the mechanisms underlying agreeableness, one of the five major dimensions of personality, which has been linked mainly to socio-cognitive functions. In particular, we examined whether individual differences in the neural representations of social information are related to differences in agreeableness of individuals. To this end, we adopted a multivariate representational similarity approach that captured within single individuals the activation pattern similarity of social and non-social content, and tested its relation to the agreeableness trait in a hypothesis-driven manner. The main result confirmed our prediction: processing social and non-social content led to similar patterns of activation in individuals with low agreeableness, while in more agreeable individuals these patterns were more dissimilar. Critically, this association between agreeableness and encoding similarity of social and random content was significant only in the dorsomedial prefrontal cortex, a brain region consistently involved during attributions of mental states. The present finding reveals the link between neural mechanisms underlying social information processing and agreeableness, a personality trait highly related to socio-cognitive abilities, thereby providing a step forward in characterizing its neural determinants. Furthermore, it emphasizes the advantage of multivariate pattern analysis approaches in capturing and understanding the neural sources of individual variations.
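The within-subject representational similarity logic this abstract describes can be sketched compactly. The following is a hypothetical illustration, not the authors' pipeline: the activation patterns are synthetic random data standing in for per-subject dmPFC voxel patterns, and correlation distance is one common dissimilarity choice among several.

```python
import numpy as np

def pattern_dissimilarity(pattern_a, pattern_b):
    """1 - Pearson correlation between two voxel activation patterns."""
    return 1.0 - np.corrcoef(pattern_a, pattern_b)[0, 1]

rng = np.random.default_rng(0)
n_subjects, n_voxels = 20, 100

# Synthetic stand-ins for each subject's activation patterns to the two content types.
social = rng.normal(size=(n_subjects, n_voxels))
nonsocial = rng.normal(size=(n_subjects, n_voxels))
agreeableness = rng.normal(size=n_subjects)  # hypothetical trait scores

# One social-vs-nonsocial dissimilarity value per subject...
dissim = np.array([pattern_dissimilarity(social[i], nonsocial[i])
                   for i in range(n_subjects)])

# ...related to the trait score across subjects (the paper's key test).
r = np.corrcoef(dissim, agreeableness)[0, 1]
print(round(r, 3))
```

On real data the prediction tested above would be a positive correlation: more agreeable individuals show more dissimilar social vs. non-social patterns.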
Affiliation(s)
- Sandra Arbula
- Neuroscience Area, International School for Advanced Studies (SISSA), via Bonomea 265, Trieste 34136, Italy
- Elisabetta Pisanu
- Neuroscience Area, International School for Advanced Studies (SISSA), via Bonomea 265, Trieste 34136, Italy
- Raffaella I Rumiati
- Neuroscience Area, International School for Advanced Studies (SISSA), via Bonomea 265, Trieste 34136, Italy; Scuola superiore di studi avanzati Sapienza (SSAS), Rome, Italy

31
Modality-general and modality-specific audiovisual valence processing. Cortex 2021; 138:127-137. [PMID: 33684626 DOI: 10.1016/j.cortex.2021.01.022] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/11/2020] [Revised: 11/08/2020] [Accepted: 01/20/2021] [Indexed: 11/23/2022]
Abstract
A fundamental question in affective neuroscience is whether there is a common hedonic system for valence processing independent of modality, or whether there are distinct neural systems for different modalities. To address this question, we used both region-of-interest and whole-brain representational similarity analyses on functional magnetic resonance imaging data to identify modality-general and modality-specific brain areas involved in valence processing across visual and auditory modalities. First, region-of-interest analyses showed that the superior temporal cortex was associated with both the modality-general and the auditory-specific models, while the primary visual cortex was associated with the visual-specific model. Second, whole-brain searchlight analyses identified both modality-general and modality-specific representations. The modality-general regions included the superior temporal, medial superior frontal, inferior frontal, precuneus, precentral, postcentral, supramarginal, paracentral lobule and middle cingulate cortices. The modality-specific regions included both perceptual cortices and higher-order brain areas. Valence representations derived from individualized behavioral valence ratings were consistent with these results. Together, these findings suggest both modality-general and modality-specific representations of valence.
32
Ho SS, Nakamura Y, Swain JE. Compassion As an Intervention to Attune to Universal Suffering of Self and Others in Conflicts: A Translational Framework. Front Psychol 2021; 11:603385. [PMID: 33505336 PMCID: PMC7829669 DOI: 10.3389/fpsyg.2020.603385] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/06/2020] [Accepted: 12/09/2020] [Indexed: 01/09/2023] Open
Abstract
As interpersonal, racial, social, and international conflicts intensify in the world, it is important to safeguard the mental health of individuals affected by them. According to a Buddhist notion "if you want others to be happy, practice compassion; if you want to be happy, practice compassion," compassion practice is an intervention to cultivate conflict-proof well-being. Here, compassion practice refers to a form of concentrated meditation wherein a practitioner attunes to friend, enemy, and someone in between, thinking, "I'm going to help them (equally)." The compassion meditation is based on Buddhist philosophy that mental suffering is rooted in conceptual thoughts that give rise to generic mental images of self and others and subsequent biases to preserve one's egoism, blocking the ultimate nature of mind. To contextualize compassion meditation scientifically, we adopted a Bayesian active inference framework to incorporate relevant Buddhist concepts, including mind (buddhi), compassion (karuna), aggregates (skandhas), suffering (duhkha), reification (samaropa), conceptual thoughts (vikalpa), and superimposition (prapañca). In this framework, a person is considered a Bayesian Engine that actively constructs phenomena based on the aggregates of forms, sensations, discriminations, actions, and consciousness. When the person embodies rigid beliefs about self and others' identities (identity-grasping beliefs) and the resulting ego-preserving bias, the person's Bayesian Engine malfunctions, failing to use prediction errors to update prior beliefs. To counter this problem, after recognizing the causes of sufferings, a practitioner of the compassion meditation aims to attune to all others equally, friends and enemies alike, suspend identity-based conceptual thoughts, and eventually let go of any identity-grasping belief and ego-preserving bias that obscure reality. 
We present a brain model for the Bayesian Engine of three components: (a) Relation-Modeling, (b) Reality-Checking, and (c) Conflict-Alarming, which are subserved by (a) the Default-Mode Network (DMN), (b) Frontoparietal Network (FPN) and Ventral Attention Network (VAN), and (c) Salience Network (SN), respectively. Upon perceiving conflicts, the strengthening or weakening of ego-preserving bias will critically depend on whether the SN up-regulates the DMN or FPN/VAN, respectively. We propose that compassion meditation can strengthen brain regions that are conducive for suspending prior beliefs and enhancing the attunements to the counterparts in conflicts.
Affiliation(s)
- S. Shaun Ho
- Department of Psychiatry and Behavioral Health, Stony Brook University, Stony Brook, NY, United States
- Yoshio Nakamura
- Department of Anesthesiology, Division of Pain Medicine, Pain Research Center, University of Utah School of Medicine, Salt Lake City, UT, United States
- James E. Swain
- Department of Psychiatry and Behavioral Health, Stony Brook University, Stony Brook, NY, United States

33
Chen Z, Whitney D. Inferential affective tracking reveals the remarkable speed of context-based emotion perception. Cognition 2020; 208:104549. [PMID: 33340812 DOI: 10.1016/j.cognition.2020.104549] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/22/2020] [Revised: 12/08/2020] [Accepted: 12/09/2020] [Indexed: 10/22/2022]
Abstract
Understanding the emotional states of others is important for social functioning. Recent studies show that context plays an essential role in emotion recognition. However, it remains unclear whether emotion inference from visual scene context is as efficient as emotion recognition from faces. Here, we measured the speed of context-based emotion perception, using Inferential Affective Tracking (IAT) with naturalistic and dynamic videos. Using cross-correlation analyses, we found that inferring affect based on visual context alone is just as fast as tracking affect with all available information including face and body. We further demonstrated that this approach has high precision and sensitivity to sub-second lags. Our results suggest that emotion recognition from dynamic contextual information might be automatic and immediate. Seemingly complex context-based emotion perception is far more efficient than previously assumed.
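The cross-correlation analysis described here has a simple core: shift one affect-rating time series against the other and find the shift that maximizes their correlation. Below is a minimal sketch under assumed conditions (synthetic ratings with a built-in 3-sample delay); the function name and sign convention (a negative lag means the probe trails the reference) are illustrative choices, not taken from the paper.

```python
import numpy as np

def best_lag(reference, probe, max_lag):
    """Return the lag (in samples) at which `probe` best matches `reference`,
    maximizing Pearson correlation over shifted alignments.
    Negative result: `probe` trails `reference`."""
    def corr_at(lag):
        if lag >= 0:
            a, b = reference[lag:], probe[:len(probe) - lag]
        else:
            a, b = reference[:lag], probe[-lag:]
        return np.corrcoef(a, b)[0, 1]
    return max(range(-max_lag, max_lag + 1), key=corr_at)

# Synthetic affect ratings: the "context-only" trace trails the full-cue trace by 3 samples.
rng = np.random.default_rng(1)
full_cue = rng.normal(size=500)
context_only = np.roll(full_cue, 3) + 0.1 * rng.normal(size=500)

print(best_lag(full_cue, context_only, max_lag=10))  # -3: context-only trails by 3 samples
```

In the study's terms, a near-zero recovered lag for context-based tracking relative to full-cue tracking would indicate that contextual emotion inference is as fast as face-based recognition.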
Affiliation(s)
- Zhimin Chen
- Department of Psychology, University of California, Berkeley, CA 94720, United States of America
- David Whitney
- Department of Psychology, University of California, Berkeley, CA 94720, United States of America; Vision Science Program, University of California, Berkeley, CA 94720, United States of America; Helen Wills Neuroscience Institute, University of California, Berkeley, CA 94720, United States of America

34
Shinkareva SV, Gao C, Wedell D. Audiovisual Representations of Valence: a Cross-study Perspective. Affect Sci 2020; 1:237-246. [DOI: 10.1007/s42761-020-00023-9] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2020] [Accepted: 10/22/2020] [Indexed: 01/25/2023]
35
Azari B, Westlin C, Satpute AB, Hutchinson JB, Kragel PA, Hoemann K, Khan Z, Wormwood JB, Quigley KS, Erdogmus D, Dy J, Brooks DH, Barrett LF. Comparing supervised and unsupervised approaches to emotion categorization in the human brain, body, and subjective experience. Sci Rep 2020; 10:20284. [PMID: 33219270 PMCID: PMC7679385 DOI: 10.1038/s41598-020-77117-8] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/09/2020] [Accepted: 09/16/2020] [Indexed: 12/05/2022] Open
Abstract
Machine learning methods provide powerful tools to map physical measurements to scientific categories. But are such methods suitable for discovering the ground truth about psychological categories? We use the science of emotion as a test case to explore this question. In studies of emotion, researchers use supervised classifiers, guided by emotion labels, to attempt to discover biomarkers in the brain or body for the corresponding emotion categories. This practice relies on the assumption that the labels refer to objective categories that can be discovered. Here, we critically examine this approach across three distinct datasets collected during emotional episodes—measuring the human brain, body, and subjective experience—and compare supervised classification solutions with those from unsupervised clustering in which no labels are assigned to the data. We conclude with a set of recommendations to guide researchers towards meaningful, data-driven discoveries in the science of emotion and beyond.
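The supervised-versus-unsupervised comparison this abstract describes can be illustrated in miniature: train a label-guided classifier, then ask whether label-free clusters recover the same category structure. The sketch below uses synthetic data and scikit-learn stand-ins (logistic regression, k-means, adjusted Rand index) chosen for brevity; they are not the study's actual models or datasets.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import adjusted_rand_score
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per, n_feat = 60, 10

# Synthetic "physiological" features for three labeled emotion categories.
X = np.vstack([rng.normal(loc=c, size=(n_per, n_feat)) for c in (0.0, 0.7, 1.4)])
y = np.repeat([0, 1, 2], n_per)

# Supervised: how well does a classifier guided by the labels generalize?
supervised_acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()

# Unsupervised: do clusters found without labels align with those same labels?
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
agreement = adjusted_rand_score(y, clusters)

print(round(supervised_acc, 2), round(agreement, 2))
```

The paper's cautionary point maps onto this comparison: high supervised accuracy alone does not establish that the labels carve the data at its joints; low label-cluster agreement would suggest the categories are imposed rather than discovered.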
Affiliation(s)
- Bahar Azari
- Department of Electrical & Computer Engineering, College of Engineering, Northeastern University, Boston, MA, USA
- Christiana Westlin
- Department of Psychology, College of Science, Northeastern University, Boston, MA, USA
- Ajay B Satpute
- Department of Psychology, College of Science, Northeastern University, Boston, MA, USA
- Philip A Kragel
- Institute of Cognitive Science, University of Colorado Boulder, Boulder, CO, USA
- Katie Hoemann
- Department of Psychology, College of Science, Northeastern University, Boston, MA, USA
- Zulqarnain Khan
- Department of Electrical & Computer Engineering, College of Engineering, Northeastern University, Boston, MA, USA
- Jolie B Wormwood
- Department of Psychology, University of New Hampshire, Durham, NH, USA
- Karen S Quigley
- Department of Psychology, College of Science, Northeastern University, Boston, MA, USA; Edith Nourse Rogers Veterans Hospital, Bedford, MA, USA
- Deniz Erdogmus
- Department of Electrical & Computer Engineering, College of Engineering, Northeastern University, Boston, MA, USA
- Jennifer Dy
- Department of Electrical & Computer Engineering, College of Engineering, Northeastern University, Boston, MA, USA
- Dana H Brooks
- Department of Electrical & Computer Engineering, College of Engineering, Northeastern University, Boston, MA, USA
- Lisa Feldman Barrett
- Department of Psychology, College of Science, Northeastern University, Boston, MA, USA; Department of Psychiatry, Massachusetts General Hospital, Boston, MA, USA; Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Boston, MA, USA

36
Liang Y, Liu B. Cross-Subject Commonality of Emotion Representations in Dorsal Motion-Sensitive Areas. Front Neurosci 2020; 14:567797. [PMID: 33177977 PMCID: PMC7591793 DOI: 10.3389/fnins.2020.567797] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/30/2020] [Accepted: 09/22/2020] [Indexed: 11/13/2022] Open
Abstract
Emotion perception is a crucial question in cognitive neuroscience and the underlying neural substrates have been the subject of intense study. One of our previous studies demonstrated that motion-sensitive areas are involved in the perception of facial expressions. However, it remains unclear whether emotions perceived from whole-person stimuli can be decoded from the motion-sensitive areas. In addition, if emotions are represented in the motion-sensitive areas, we may further ask whether the representations of emotions in the motion-sensitive areas can be shared across individual subjects. To address these questions, this study collected neural images while participants viewed emotions (joy, anger, and fear) from videos of whole-person expressions (contained both face and body parts) in a block-design functional magnetic resonance imaging (fMRI) experiment. Multivariate pattern analysis (MVPA) was conducted to explore the emotion decoding performance in individual-defined dorsal motion-sensitive regions of interest (ROIs). Results revealed that emotions could be successfully decoded from motion-sensitive ROIs with statistically significant classification accuracies for three emotions as well as positive versus negative emotions. Moreover, results from the cross-subject classification analysis showed that a person’s emotion representation could be robustly predicted by others’ emotion representations in motion-sensitive areas. Together, these results reveal that emotions are represented in dorsal motion-sensitive areas and that the representation of emotions is consistent across subjects. Our findings provide new evidence of the involvement of motion-sensitive areas in the emotion decoding, and further suggest that there exists a common emotion code in the motion-sensitive areas across individual subjects.
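The cross-subject classification analysis described above follows a leave-one-subject-out pattern: build condition templates from all subjects but one, then classify the held-out subject's patterns. The sketch below is a hypothetical, simplified version using a correlation-based nearest-centroid rule on synthetic data; the study's actual MVPA classifier and fMRI features differ.

```python
import numpy as np

def loso_decode(patterns, labels):
    """Leave-one-subject-out decoding: centroids from all other subjects'
    condition patterns classify the held-out subject's patterns by
    highest Pearson correlation. Returns mean accuracy across folds."""
    accs = []
    for held in range(patterns.shape[0]):
        train = np.delete(patterns, held, axis=0)   # (n_subj-1, n_cond, n_vox)
        centroids = train.mean(axis=0)              # (n_cond, n_vox)
        preds = [np.argmax([np.corrcoef(p, c)[0, 1] for c in centroids])
                 for p in patterns[held]]
        accs.append(np.mean(np.array(preds) == labels))
    return float(np.mean(accs))

rng = np.random.default_rng(2)
n_subj, n_cond, n_vox = 12, 3, 50
labels = np.arange(n_cond)  # e.g., joy, anger, fear

# Synthetic stand-in: a condition signal shared across subjects plus subject noise.
shared = rng.normal(size=(n_cond, n_vox))
patterns = shared + 0.5 * rng.normal(size=(n_subj, n_cond, n_vox))

print(loso_decode(patterns, labels))
```

Accuracy reliably above chance (1/3 here) is the signature of the common, cross-subject emotion code the authors report for motion-sensitive areas.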
Affiliation(s)
- Yin Liang
- Faculty of Information Technology, College of Computer Science and Technology, Beijing Artificial Intelligence Institute, Beijing University of Technology, Beijing, China
- Baolin Liu
- School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing, China

37
Leitão J, Meuleman B, Van De Ville D, Vuilleumier P. Computational imaging during video game playing shows dynamic synchronization of cortical and subcortical networks of emotions. PLoS Biol 2020; 18:e3000900. [PMID: 33180768 PMCID: PMC7685507 DOI: 10.1371/journal.pbio.3000900] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/21/2020] [Revised: 11/24/2020] [Accepted: 10/06/2020] [Indexed: 01/09/2023] Open
Abstract
Emotions are multifaceted phenomena affecting mind, body, and behavior. Previous studies sought to link particular emotion categories (e.g., fear) or dimensions (e.g., valence) to specific brain substrates but generally found distributed and overlapping activation patterns across various emotions. In contrast, distributed patterns accord with multi-componential theories whereby emotions emerge from appraisal processes triggered by current events, combined with motivational, expressive, and physiological mechanisms orchestrating behavioral responses. According to this framework, components are recruited in parallel and dynamically synchronized during emotion episodes. Here, we use functional MRI (fMRI) to investigate brain-wide systems engaged by theoretically defined components and measure their synchronization during an interactive emotion-eliciting video game. We show that each emotion component recruits large-scale cortico-subcortical networks, and that moments of dynamic synchronization between components selectively engage basal ganglia, sensory-motor structures, and midline brain areas. These neural results support theoretical accounts grounding emotions onto embodied and action-oriented functions triggered by synchronized component processes.
Affiliation(s)
- Joana Leitão
- Laboratory for Behavioral Neurology and Imaging of Cognition, Department of Fundamental Neuroscience, University of Geneva, Geneva, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
- Ben Meuleman
- Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
- Dimitri Van De Ville
- Institute of Bioengineering, Center for Neuroprosthetics, École Polytechnique Fédérale de Lausanne (EPFL), Geneva, Switzerland; Department of Radiology and Medical Informatics, University of Geneva, Geneva, Switzerland
- Patrik Vuilleumier
- Laboratory for Behavioral Neurology and Imaging of Cognition, Department of Fundamental Neuroscience, University of Geneva, Geneva, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland

38
Bayet L, Perdue KL, Behrendt HF, Richards JE, Westerlund A, Cataldo JK, Nelson CA. Neural responses to happy, fearful and angry faces of varying identities in 5- and 7-month-old infants. Dev Cogn Neurosci 2020; 47:100882. [PMID: 33246304 PMCID: PMC7695867 DOI: 10.1016/j.dcn.2020.100882] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/04/2020] [Revised: 10/19/2020] [Accepted: 11/03/2020] [Indexed: 11/30/2022] Open
Abstract
Highlights:
- fNIRS and looking responses to emotional faces were measured in 5- and 7-month-olds.
- Emotional faces had varying identities within happy, angry, and fearful blocks.
- Temporo-parietal and frontal activations were observed, particularly to happy faces.
- Infants looked longer at the mouth region of angry faces.
- No difference in behavior or neural activity was observed between 5- and 7-month-olds.
The processing of facial emotion is an important social skill that develops throughout infancy and early childhood. Here we investigate the neural underpinnings of the ability to process facial emotion across changes in facial identity in cross-sectional groups of 5- and 7-month-old infants. We simultaneously measured neural metabolic, behavioral, and autonomic responses to happy, fearful, and angry faces of different female models using functional near-infrared spectroscopy (fNIRS), eye-tracking, and heart rate measures. We observed significant neural activation to these facial emotions in a distributed set of frontal and temporal brain regions, and longer looking to the mouth region of angry faces compared to happy and fearful faces. No differences in looking behavior or neural activations were observed between 5- and 7-month-olds, although several exploratory, age-independent associations between neural activations and looking behavior were noted. Overall, these findings suggest more developmental stability than previously thought in responses to emotional facial expressions of varying identities between 5- and 7-months of age.
Affiliation(s)
- Laurie Bayet
- Boston Children's Hospital, Boston, MA, USA; Harvard Medical School, Boston, MA, USA
- Katherine L Perdue
- Boston Children's Hospital, Boston, MA, USA; Harvard Medical School, Boston, MA, USA
- Hannah F Behrendt
- Boston Children's Hospital, Boston, MA, USA; Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital RWTH Aachen, Aachen, Germany
- Charles A Nelson
- Boston Children's Hospital, Boston, MA, USA; Harvard Medical School, Boston, MA, USA; Harvard Graduate School of Education, Cambridge, MA, USA

39
Processing communicative facial and vocal cues in the superior temporal sulcus. Neuroimage 2020; 221:117191. [PMID: 32711066 DOI: 10.1016/j.neuroimage.2020.117191] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/21/2020] [Revised: 07/14/2020] [Accepted: 07/19/2020] [Indexed: 11/20/2022] Open
Abstract
Facial and vocal cues provide critical social information about other humans, including their emotional and attentional states and the content of their speech. Recent work has shown that the face-responsive region of posterior superior temporal sulcus ("fSTS") also responds strongly to vocal sounds. Here, we investigate the functional role of this region and the broader STS by measuring responses to a range of face movements, vocal sounds, and hand movements using fMRI. We find that the fSTS responds broadly to different types of audio and visual face action, including both richly social communicative actions, as well as minimally social noncommunicative actions, ruling out hypotheses of specialization for processing speech signals, or communicative signals more generally. Strikingly, however, responses to hand movements were very low, whether communicative or not, indicating a specific role in the analysis of face actions (facial and vocal), not a general role in the perception of any human action. Furthermore, spatial patterns of response in this region were able to decode communicative from noncommunicative face actions, both within and across modality (facial/vocal cues), indicating sensitivity to an abstract social dimension. These functional properties of the fSTS contrast with a region of middle STS that has a selective, largely unimodal auditory response to speech sounds over both communicative and noncommunicative vocal nonspeech sounds, and nonvocal sounds. Region of interest analyses were corroborated by a data-driven independent component analysis, identifying face-voice and auditory speech responses as dominant sources of voxelwise variance across the STS. These results suggest that the STS contains separate processing streams for the audiovisual analysis of face actions and auditory speech processing.
40
Kim MJ, Mattek AM, Shin J. Amygdalostriatal coupling underpins positive but not negative coloring of ambiguous affect. Cogn Affect Behav Neurosci 2020; 20:949-960. [PMID: 32681315 PMCID: PMC7501244 DOI: 10.3758/s13415-020-00812-6] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
Humans routinely integrate affective information from multiple sources. For example, we rarely interpret an emotional facial expression devoid of context. In this paper, we describe the neural correlates of an affective computation that involves integrating multiple sources, by leveraging the ambiguity and subtle feature-based valence signals found in surprised faces. Using functional magnetic resonance imaging, participants reported the valence of surprised faces modulated by positive or negative sentences. Amygdala activity corresponded to the valence value assigned to each contextually modulated face, with greater activity reflecting more negative ratings. Amygdala activity did not track the valence of the faces or sentences per se. Moreover, the amygdala was functionally coupled with the nucleus accumbens only during face trials preceded by positive contextual cues. These data suggest 1) valence-related amygdala activity reflects the integrated valence values rather than the valence values of each individual component, and 2) amygdalostriatal coupling underpins positive but not negative coloring of ambiguous affect.
Affiliation(s)
- M Justin Kim
- Department of Psychology, Sungkyunkwan University, Seoul, South Korea
- Alison M Mattek
- Department of Psychology, University of Oregon, Eugene, OR, USA
- Jin Shin
- Department of Psychological & Brain Sciences, Washington University in St. Louis, St. Louis, MO, USA
41
Li Y, Richardson RM, Ghuman AS. Posterior Fusiform and Midfusiform Contribute to Distinct Stages of Facial Expression Processing. Cereb Cortex 2020; 29:3209-3219. [PMID: 30124788] [DOI: 10.1093/cercor/bhy186]
Abstract
Though the fusiform is well-established as a key node in the face perception network, its role in facial expression processing remains unclear, due to competing models and discrepant findings. To help resolve this debate, we recorded from 17 subjects with intracranial electrodes implanted in face-sensitive patches of the fusiform. Multivariate classification analysis showed that facial expression information is represented in fusiform activity and in the same regions that represent identity, though with a smaller effect size. Examination of the spatiotemporal dynamics revealed a functional distinction between posterior fusiform and midfusiform expression coding, with posterior fusiform showing an early peak of facial expression sensitivity at around 180 ms after subjects viewed a face and midfusiform showing a later and extended peak between 230 and 460 ms. These results support the hypothesis that the fusiform plays a role in facial expression perception and highlight a qualitative functional distinction between processing in posterior fusiform and midfusiform, with each contributing to temporally segregated stages of expression perception.
Affiliation(s)
- Yuanning Li
- Center for the Neural Basis of Cognition, Carnegie Mellon University and University of Pittsburgh, Pittsburgh, PA, USA; Program in Neural Computation and Machine Learning, Carnegie Mellon University, Pittsburgh, PA, USA; Department of Neurological Surgery, University of Pittsburgh, Pittsburgh, PA, USA
- R Mark Richardson
- Center for the Neural Basis of Cognition, Carnegie Mellon University and University of Pittsburgh, Pittsburgh, PA, USA; Department of Neurological Surgery, University of Pittsburgh, Pittsburgh, PA, USA
- Avniel Singh Ghuman
- Center for the Neural Basis of Cognition, Carnegie Mellon University and University of Pittsburgh, Pittsburgh, PA, USA; Program in Neural Computation and Machine Learning, Carnegie Mellon University, Pittsburgh, PA, USA; Department of Neurological Surgery, University of Pittsburgh, Pittsburgh, PA, USA
42
Sel A, Calvo-Merino B, Tsakiris M, Forster B. The somatotopy of observed emotions. Cortex 2020; 129:11-22. [DOI: 10.1016/j.cortex.2020.04.002]
43
Abstract
Support vector machines (SVMs) are being used increasingly in affective science as a data-driven classification method and feature reduction technique. Whereas traditional statistical methods typically compare group averages on selected variables, SVMs use a predictive algorithm to learn multivariate patterns that optimally discriminate between groups. In this review, we provide a framework for understanding the methods of SVM-based analyses and summarize the findings of seminal studies that use SVMs for classification or data reduction in the behavioral and neural study of emotion and affective disorders. We conclude by discussing promising directions and potential applications of SVMs in future research in affective science.
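The SVM workflow this review describes (learn a multivariate pattern that optimally discriminates groups, then inspect the learned weights as a form of feature reduction) can be sketched as follows. This is a minimal illustration on synthetic data using scikit-learn; the group labels, feature counts, and effect sizes are invented for the example and are not drawn from any study reviewed here.

```python
# Minimal sketch of SVM-based group classification on multivariate features.
# All data is synthetic; "controls"/"patients" are illustrative labels only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_per_group, n_features = 100, 20

# Two synthetic groups whose means differ on the first five features
controls = rng.normal(0.0, 1.0, size=(n_per_group, n_features))
patients = rng.normal(0.0, 1.0, size=(n_per_group, n_features))
patients[:, :5] += 1.5  # the group difference is carried by 5 features

X = np.vstack([controls, patients])
y = np.array([0] * n_per_group + [1] * n_per_group)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

# Standardize features, then fit a linear SVM
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)

# For a linear SVM, large |weights| mark the features driving the decision
weights = np.abs(clf.named_steps["svc"].coef_[0])
print(f"held-out accuracy: {accuracy:.2f}")
```

Unlike a group-average comparison on each variable separately, the classifier evaluates all features jointly, which is the contrast the review draws with traditional statistical methods.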
Affiliation(s)
- Matthew D. Sacchet
- Center for Depression, Anxiety, and Stress Research, McLean Hospital, Harvard Medical School, USA
- Ian H. Gotlib
- Department of Psychology, Stanford Neurosciences Institute, Stanford University, USA
44
Wilson KA, James GA, Kilts CD, Bush KA. Combining Physiological and Neuroimaging Measures to Predict Affect Processing Induced by Affectively Valent Image Stimuli. Sci Rep 2020; 10:9298. [PMID: 32518277] [PMCID: PMC7283349] [DOI: 10.1038/s41598-020-66109-3]
Abstract
The importance of affect processing to human behavior has long driven researchers to pursue its measurement. In this study, we compared the relative fidelity of measurements of neural activation and physiology (i.e., heart rate change) in detecting affective valence induction across a broad continuum of conveyed affective valence. We combined intra-subject, neural-activation-based multivariate predictions of affective valence with measures of heart rate (HR) deceleration to predict predefined normative affect rating scores for stimuli drawn from the International Affective Picture System (IAPS) in a population (n = 50) of healthy adults. In sum, we found that patterns of neural activation and HR deceleration each significantly and uniquely explained variance in the normative valence scores associated with IAPS stimuli; however, patterns of neural activation explained a significantly greater proportion of that variance. These effects persisted across a range of stimulus sets that differed in the polar-extremity of their positively and negatively valent subsets, spanning the extremities of stimulus sets reported in the literature. Overall, these findings support the acquisition of heart rate deceleration concurrently with fMRI to provide convergent validation of induced affect processing in the dimension of affective valence.
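The study's core comparison, whether one measure explains variance in normative valence scores beyond another, amounts to comparing nested regression models. The sketch below illustrates that logic on synthetic data; the simulated noise levels (neural predictions as the stronger correlate, HR deceleration as the weaker one) are assumptions for illustration, not the study's values.

```python
# Nested-model comparison: does adding a neural predictor raise R^2
# over heart-rate deceleration alone? Synthetic data; variable names
# are illustrative, not the authors' actual pipeline.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_stimuli = 200

valence = rng.normal(0.0, 1.0, n_stimuli)             # normative valence scores
neural = valence + rng.normal(0.0, 0.5, n_stimuli)    # simulated stronger correlate
hr_decel = valence + rng.normal(0.0, 1.5, n_stimuli)  # simulated weaker correlate

X_hr = hr_decel.reshape(-1, 1)
X_both = np.column_stack([hr_decel, neural])

# R^2 of the reduced (HR-only) and full (HR + neural) models
r2_hr = LinearRegression().fit(X_hr, valence).score(X_hr, valence)
r2_both = LinearRegression().fit(X_both, valence).score(X_both, valence)

print(f"R^2 (HR only): {r2_hr:.2f}, R^2 (HR + neural): {r2_both:.2f}")
```

In the simulated case the full model explains more variance than HR deceleration alone, mirroring the abstract's claim that neural activation carries unique and larger explanatory power.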
Affiliation(s)
- Kayla A Wilson
- Brain Imaging Research Center, University of Arkansas for Medical Sciences, Little Rock, USA
- G Andrew James
- Brain Imaging Research Center, University of Arkansas for Medical Sciences, Little Rock, USA
- Clint D Kilts
- Brain Imaging Research Center, University of Arkansas for Medical Sciences, Little Rock, USA
- Keith A Bush
- Brain Imaging Research Center, University of Arkansas for Medical Sciences, Little Rock, USA
45
A study in affect: Predicting valence from fMRI data. Neuropsychologia 2020; 143:107473. [DOI: 10.1016/j.neuropsychologia.2020.107473]
46
Horikawa T, Cowen AS, Keltner D, Kamitani Y. The Neural Representation of Visually Evoked Emotion Is High-Dimensional, Categorical, and Distributed across Transmodal Brain Regions. iScience 2020; 23:101060. [PMID: 32353765] [PMCID: PMC7191651] [DOI: 10.1016/j.isci.2020.101060]
Abstract
Central to our subjective lives is the experience of different emotions. Recent behavioral work mapping emotional responses to 2,185 videos found that people experience upward of 27 distinct emotions occupying a high-dimensional space, and that emotion categories, more so than affective dimensions (e.g., valence), organize self-reports of subjective experience. Here, we sought to identify the neural substrates of this high-dimensional space of emotional experience using fMRI responses to all 2,185 videos. Our analyses demonstrated that (1) dozens of video-evoked emotions were accurately predicted from fMRI patterns in multiple brain regions with different regional configurations for individual emotions; (2) emotion categories better predicted cortical and subcortical responses than affective dimensions, outperforming visual and semantic covariates in transmodal regions; and (3) emotion-related fMRI responses had a cluster-like organization efficiently characterized by distinct categories. These results support an emerging theory of the high-dimensional emotion space, illuminating its neural foundations distributed across transmodal regions.
Affiliation(s)
- Tomoyasu Horikawa
- Department of Neuroinformatics, ATR Computational Neuroscience Laboratories, Hikaridai, Seika, Soraku, Kyoto, 619-0288, Japan
- Alan S Cowen
- Department of Psychology, University of California, Berkeley, CA 94720-1500, USA
- Dacher Keltner
- Department of Psychology, University of California, Berkeley, CA 94720-1500, USA
- Yukiyasu Kamitani
- Department of Neuroinformatics, ATR Computational Neuroscience Laboratories, Hikaridai, Seika, Soraku, Kyoto, 619-0288, Japan; Graduate School of Informatics, Kyoto University, Yoshida-honmachi, Sakyo-ku, Kyoto, 606-8501, Japan
47
Dricu M, Frühholz S. A neurocognitive model of perceptual decision-making on emotional signals. Hum Brain Mapp 2020; 41:1532-1556. [PMID: 31868310] [PMCID: PMC7267943] [DOI: 10.1002/hbm.24893]
Abstract
Humans make various kinds of decisions about which emotions they perceive from others. Although it might seem like a split-second phenomenon, deliberating over which emotions we perceive unfolds across several stages of decisional processing. Neurocognitive models of general perception postulate that our brain first extracts sensory information about the world, then integrates these data into a percept, and lastly interprets it. The aim of the present study was to build an evidence-based neurocognitive model of perceptual decision-making on others' emotions. We conducted a series of meta-analyses of neuroimaging data spanning 30 years on the explicit evaluation of others' emotional expressions. We find that emotion perception is rather an umbrella term for various perception paradigms, each with distinct neural structures that underlie task-related cognitive demands. Furthermore, the left amygdala was responsive across all classes of decisional paradigms, regardless of task-related demands. Based on these observations, we propose a neurocognitive model that outlines the information flow in the brain needed for a successful evaluation of, and decisions on, other individuals' emotions.
Highlights:
- Emotion classification involves heterogeneous perception and decision-making tasks
- Decision-making processes on emotions are rarely covered by existing emotion theories
- We propose an evidence-based neurocognitive model of decision-making on emotions
- Bilateral brain processes for nonverbal decisions, left-hemisphere processes for verbal decisions
- The left amygdala is involved in any kind of decision on emotions
Affiliation(s)
- Mihai Dricu
- Department of Psychology, University of Bern, Bern, Switzerland
- Sascha Frühholz
- Cognitive and Affective Neuroscience Unit, Department of Psychology, University of Zurich, Zurich, Switzerland; Neuroscience Center Zurich (ZNZ), University of Zurich and ETH Zurich, Zurich, Switzerland; Center for Integrative Human Physiology (ZIHP), University of Zurich, Zurich, Switzerland
48
Jones NP, Schlund M, Kerestes R, Ladouceur CD. Emotional Interference in Early Adolescence: Positive Reinforcement Modulates the Behavioral and Neural Effects of Negative Emotional Distracters. Cereb Cortex 2020; 30:2642-2657. [PMID: 31812998] [PMCID: PMC7175015] [DOI: 10.1093/cercor/bhz266]
Abstract
Limited research has examined functioning within the fronto-limbic systems subserving resistance to emotional interference in adolescence, despite evidence that alterations in these systems are implicated in the developmental trajectories of affective disorders. This study examined the functioning of fronto-limbic systems subserving emotional interference in early adolescence and whether positive reinforcement could modulate these systems to promote resistance to emotional distraction. Fifty healthy early adolescents (10-13 years old) completed an emotional delayed working memory (WM) paradigm in which no distracters (fixation crosshair) and emotional distracters (neutral and negative images) were presented with and without positive reinforcement for correct responses. WM accuracy decreased with negative distracters relative to neutral and no distracters, and activation increased in the amygdala and prefrontal cortical (PFC) regions (ventrolateral, dorsomedial, ventromedial, and subgenual anterior cingulate) with negative distracters compared with no distracters. Reinforcement improved performance and reduced activation in the amygdala, dorsomedial PFC, and ventrolateral PFC. Decreases in amygdala activation to negative distracters under reinforcement mediated the observed decreases in reaction times. These findings demonstrate that healthy adolescents recruit fronto-limbic systems subserving emotional interference similar to those of adults, and that positive reinforcement can modulate fronto-limbic systems to promote resistance to emotional distraction.
Affiliation(s)
- Neil P Jones
- Department of Psychiatry, University of Pittsburgh School of Medicine, Pittsburgh, PA 15213, USA
- Michael Schlund
- Department of Psychiatry, University of Pittsburgh School of Medicine, Pittsburgh, PA 15213, USA; Department of Psychology, Georgia State University, Atlanta, GA 30302, USA
- Rebecca Kerestes
- Department of Psychiatry, University of Pittsburgh School of Medicine, Pittsburgh, PA 15213, USA
- Cecile D Ladouceur
- Department of Psychiatry, University of Pittsburgh School of Medicine, Pittsburgh, PA 15213, USA
49
Response patterns in the developing social brain are organized by social and emotion features and disrupted in children diagnosed with autism spectrum disorder. Cortex 2020; 125:12-29. [DOI: 10.1016/j.cortex.2019.11.021]
50
Chan HY, Smidts A, Schoots VC, Sanfey AG, Boksem MAS. Decoding dynamic affective responses to naturalistic videos with shared neural patterns. Neuroimage 2020; 216:116618. [PMID: 32036021] [DOI: 10.1016/j.neuroimage.2020.116618]
Abstract
This study explored the feasibility of using shared neural patterns from brief affective episodes (viewing affective pictures) to decode extended, dynamic affective sequences in a naturalistic experience (watching movie-trailers). Twenty-eight participants viewed pictures from the International Affective Picture System (IAPS) and, in a separate session, watched various movie-trailers. We first located voxels in bilateral lateral occipital cortex (LOC) responsive to affective picture categories by GLM analysis, then performed between-subject hyperalignment on the LOC voxels based on their responses during movie-trailer watching. After hyperalignment, we trained between-subject machine learning classifiers on the affective pictures, and used the classifiers to decode affective states of an out-of-sample participant both during picture viewing and during movie-trailer watching. Within participants, neural classifiers identified valence and arousal categories of pictures, and tracked self-reported valence and arousal during video watching. In aggregate, neural classifiers produced valence and arousal time series that tracked the dynamic ratings of the movie-trailers obtained from a separate sample. Our findings provide further support for the possibility of using pre-trained neural representations to decode dynamic affective responses during a naturalistic experience.
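The between-subject alignment step can be illustrated with a two-subject orthogonal Procrustes alignment on shared time series. Full hyperalignment iterates this over many subjects (and is available in dedicated packages such as PyMVPA); the sketch below only shows the core idea of rotating one subject's voxel space onto another's, and all data are simulated.

```python
# Two-subject "hyperalignment" sketch: estimate an orthogonal transform
# mapping subject B's voxel space onto subject A's from shared responses.
# Synthetic data; dimensions and noise levels are illustrative only.
import numpy as np
from scipy.linalg import orthogonal_procrustes

rng = np.random.default_rng(2)
n_timepoints, n_voxels = 300, 30

# Latent shared responses; subject B sees them through a different
# orthogonal "voxel basis", plus per-subject noise
shared = rng.normal(size=(n_timepoints, n_voxels))
basis_b, _ = np.linalg.qr(rng.normal(size=(n_voxels, n_voxels)))

subj_a = shared + 0.1 * rng.normal(size=shared.shape)
subj_b = shared @ basis_b + 0.1 * rng.normal(size=shared.shape)

# Orthogonal Procrustes: find rotation R minimizing ||subj_b @ R - subj_a||
R, _ = orthogonal_procrustes(subj_b, subj_a)
aligned_b = subj_b @ R

# After alignment, subject B's data should match subject A far better,
# so a classifier trained on A can be applied to aligned B
err_before = np.linalg.norm(subj_b - subj_a)
err_after = np.linalg.norm(aligned_b - subj_a)
print(f"misalignment before: {err_before:.1f}, after: {err_after:.1f}")
```

This is why the study could train classifiers on one participant's picture responses and decode an out-of-sample participant: after alignment, voxel patterns live in a common response space.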
Affiliation(s)
- Hang-Yee Chan
- Department of Marketing Management, Rotterdam School of Management, Erasmus University Rotterdam, the Netherlands
- Ale Smidts
- Department of Marketing Management, Rotterdam School of Management, Erasmus University Rotterdam, the Netherlands
- Vincent C Schoots
- Department of Marketing Management, Rotterdam School of Management, Erasmus University Rotterdam, the Netherlands
- Alan G Sanfey
- Centre for Cognitive Neuroimaging, Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, the Netherlands
- Maarten A S Boksem
- Department of Marketing Management, Rotterdam School of Management, Erasmus University Rotterdam, the Netherlands