1
Chang CY, Chan YC, Chen HC. The differential processing of verbal jokes by neural substrates in indigenous and Han Chinese populations: An fMRI study. Behav Brain Res 2024; 457:114702. PMID: 37813282; DOI: 10.1016/j.bbr.2023.114702.
Abstract
Limited research has been conducted on humor among the Taiwanese indigenous (IND) population. This study aimed to identify the differential neural correlates of humor comprehension and appreciation between IND and Han Chinese (HAN) populations. Each participant was presented with jokes and non-jokes. When encountering jokes, IND participants displayed greater activation of the mesolimbic dopaminergic reward system, including the amygdala, midbrain, and nucleus accumbens, than HAN participants, suggesting a more pleasurable response to and greater appreciation of humor. The IND group also displayed greater activation in the right temporoparietal junction (rTPJ) than the HAN group, suggesting that IND participants may experience a greater sense of novelty and engage more in social understanding, thus exhibiting greater humor appreciation. In terms of humor comprehension, both IND and HAN groups showed greater activation in the superior temporal gyrus (STG) and dorsal anterior cingulate cortex (dACC). The IND group exhibited greater activation in the anterior STG (aSTG), while the HAN group showed greater activation in the posterior STG (pSTG), suggesting that the IND group tends to integrate emotional messages, whereas the HAN group focuses on comprehending semantic cognitive information. Interestingly, the HAN group did not show any greater activation than the IND group in terms of humor appreciation. These group disparities have substantial implications for advancing our knowledge of the neural mechanisms underlying humor comprehension and appreciation.
Affiliation(s)
- Chia-Yueh Chang
- Department of Educational Psychology and Counseling, National Taiwan Normal University, Taipei 10610, Taiwan
- Yu-Chen Chan
- Department of Educational Psychology and Counseling, National Tsing Hua University, Hsinchu 300043, Taiwan
- Hsueh-Chih Chen
- Department of Educational Psychology and Counseling, National Taiwan Normal University, Taipei 10610, Taiwan; Institute for Research Excellence in Learning Sciences, National Taiwan Normal University, Taipei 10610, Taiwan; Chinese Language and Technology Center, National Taiwan Normal University, Taipei 10610, Taiwan; Social Emotional Education and Development Center, National Taiwan Normal University, Taipei 10610, Taiwan.
2
Zhang M, Wang J, Li Q, Li S, Bao X, Chen X. Temporal characteristics of emotional conflict processing: the modulation role of attachment styles. Int J Psychophysiol 2023; 193:112243. PMID: 37689370; DOI: 10.1016/j.ijpsycho.2023.112243.
Abstract
Attachment theory proposes that individual differences in adult attachment styles play a key role in adjusting the balance between affective evaluation and cognitive control. Yet little is known about the temporal characteristics of emotional conflict processing as modulated by attachment styles. Accordingly, the present study used event-related potentials (ERPs) and multivariate pattern analysis (MVPA), combined with an emotional face-word Stroop task, to investigate the temporal dynamics of attachment-related cognitive-affective patterns in emotional conflict processing. The ERP results demonstrated multiple stages of emotional conflict processing modulated by attachment styles. In early sensory processing, positive faces captured the attention of avoidantly attached individuals, as reflected in a greater P1, whereas the same stimuli elicited a greater N170 in secure and anxious individuals. Crucially, impaired conflict monitoring was found in anxious individuals, reflected in the absence of an interference effect on the N450 and leading to impaired inhibitory control, as indicated by a decreased slow potential. In contrast, avoidant individuals showed a greater slow potential for inhibiting emotional interference. Furthermore, MVPA revealed that in the anxious attachment group, the time window corresponding to conflict monitoring supported decoding of emotional distractors rather than of congruency. Convergent results from the ERPs and MVPA indicated that the deficits in emotional conflict monitoring and resolution among anxious individuals might be due to an excessive approach toward emotional distractors, as they habitually rely on emotional evaluation rather than cognitive control. In summary, the present study provides electrophysiological evidence that attachment styles modulate emotional conflict processing, highlighting the contribution of attachment to social information processing.
Affiliation(s)
- Mengke Zhang
- Faculty of Psychology, Southwest University, Chongqing 400715, China
- Jing Wang
- School of Psychology, Liaoning Normal University, Dalian 116029, China
- Qing Li
- Faculty of Psychology, Southwest University, Chongqing 400715, China
- Song Li
- Faculty of Psychology, Southwest University, Chongqing 400715, China
- Xiuqin Bao
- Faculty of Psychology, Southwest University, Chongqing 400715, China
- Xu Chen
- Faculty of Psychology, Southwest University, Chongqing 400715, China.
3
Guo T, Wang F, Cao N, Liu H. Conflicts influence affects: an FMRI study of emotional effects in a conflict task. Cogn Neurodyn 2022; 16:1261-1271. PMID: 36408071; PMCID: PMC9666575; DOI: 10.1007/s11571-022-09790-6.
Abstract
Although prior research has confirmed that conflict itself is likely to be aversive, it is unclear whether and how emotional conflicts influence an individual's affective processing. The current fMRI study adopted a lexical valence conflict task, instructing participants to shift lexical valence or not. We found that the involvement of positive emotions enhanced activation of the right middle temporal gyrus (R-MTG) in the non-conflict condition, whereas this activation was attenuated in the conflict condition. The R-MTG showed the opposite pattern when negative emotions were involved. Functional connectivity and correlation analyses further revealed that the faster participants processed positive emotional words, the weaker the connectivity between the R-MTG and positive emotion-related areas of the left MTG in the non-conflict condition. In contrast, the faster participants processed negative emotional words, the stronger the connectivity between the R-MTG and negative emotion-related areas of the right cerebellum in the conflict condition. These findings suggest that conflicts have different influences on emotional processing.
Affiliation(s)
- Tingting Guo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029 China
- Key Laboratory of Brain and Cognitive Neuroscience, Dalian, 116029 Liaoning Province China
- Fenqi Wang
- Department of Linguistics, University of Florida, Gainesville, FL 32611-5454, USA
- Ningning Cao
- School of Foreign Languages, Northeast Normal University, Changchun, China
- Huanhuan Liu
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Dalian 116029, Liaoning Province, China
4
Tan C, Liu X, Zhang G. Inferring Brain State Dynamics Underlying Naturalistic Stimuli Evoked Emotion Changes With dHA-HMM. Neuroinformatics 2022; 20:737-753. PMID: 35244856; DOI: 10.1007/s12021-022-09568-5.
Abstract
The brain functional mechanisms underlying emotional changes have primarily been studied with traditional task designs using discrete, simple stimuli. However, the brain state transitions that occur during continuous, naturalistic stimuli with rich affective variation remain poorly understood. This study proposes a dynamic hyperalignment algorithm (dHA) to functionally align inter-subject neural activity. A hidden Markov model (HMM) was used to study how brain dynamics respond to emotion during extended movie viewing. The results showed that dHA significantly improved inter-subject consistency and yielded more consistent temporal HMM states across participants. Grouping the emotions in a clustering dendrogram revealed a hierarchical grouping of the HMM states. Further emotional sensitivity and specificity analyses of the ordered states revealed the most significant differences for happiness and sadness. Comparing the activation maps of the HMM states during happiness and sadness revealed significant differences across the whole brain, but both emotions strongly activated the superior temporal gyrus, which is related to early processing of emotional prosody. A comparison of inter-network functional connections indicated unique connections of the memory retrieval and cognitive networks with the cerebellar network during happiness. Moreover, the persistent bilateral connections among the salience, cognitive, and sensorimotor networks during sadness may reflect interaction between high-level cognitive networks and low-level sensory networks. The main results were verified in the second session of the dataset. These findings enrich our understanding of the brain states related to emotional variation during naturalistic stimuli.
Affiliation(s)
- Chenhao Tan
- College of Intelligence and Computing, Tianjin Key Laboratory of Cognitive Computing and Application, Tianjin University, No. 135 Yaguan Road, Haihe Education Park, Tianjin, 300350, People's Republic of China
- Xin Liu
- College of Intelligence and Computing, Tianjin Key Laboratory of Cognitive Computing and Application, Tianjin University, No. 135 Yaguan Road, Haihe Education Park, Tianjin, 300350, People's Republic of China
- Gaoyan Zhang
- College of Intelligence and Computing, Tianjin Key Laboratory of Cognitive Computing and Application, Tianjin University, No. 135 Yaguan Road, Haihe Education Park, Tianjin, 300350, People's Republic of China.
5
Chen F, Lian J, Zhang G, Guo C. Semantics-Prosody Stroop Effect on English Emotion Word Processing in Chinese College Students With Trait Depression. Front Psychiatry 2022; 13:889476. PMID: 35733799; PMCID: PMC9207235; DOI: 10.3389/fpsyt.2022.889476.
Abstract
This study explored how Chinese college students with different severities of trait depression process English emotional speech under a complete semantics-prosody Stroop paradigm in quiet and noisy conditions. Twenty-four college students with high trait depression and 24 with low trait depression participated. They were required to selectively attend to either the prosodic emotion (happy, sad) or the semantic valence (positive, negative) of the English words they heard and then respond quickly. Both the prosody task and the semantic task were performed in quiet and noisy listening conditions. Results showed that the high-trait group reacted more slowly than the low-trait group in the prosody task, consistent with blunted sensitivity in emotional processing. In addition, both groups reacted faster in the congruent condition, showing a clear congruency-induced facilitation effect and the broad presence of the Stroop effect in both tasks. The Stroop effect was more pronounced during emotional prosody identification in the quiet condition, and noise eliminated this effect. Regardless of congruency or listening condition, both groups spent less time on the prosody task than on the semantic task, indicating that basic emotion identification is easier than semantic judgment for second-language learners. These findings suggest that students' mood state and external noise have non-negligible effects on emotion word processing.
Affiliation(s)
- Fei Chen
- School of Foreign Languages, Hunan University, Changsha, China
- Jing Lian
- School of Foreign Languages, Hunan University, Changsha, China
- Gaode Zhang
- School of Foreign Languages, Hunan University, Changsha, China
- Chengyu Guo
- School of Foreign Languages, Hunan University, Changsha, China
6
Emotional salience but not valence impacts anterior cingulate cortex conflict processing. Cogn Affect Behav Neurosci 2022; 22:1250-1263. PMID: 35879595; PMCID: PMC9622519; DOI: 10.3758/s13415-022-01025-9.
Abstract
Stimuli that evoke emotions are salient, draw attentional resources, and facilitate situationally appropriate behavior in complex or conflicting environments. However, negative and positive emotions may motivate different response strategies: a threatening stimulus might evoke avoidant behavior, whereas a positive stimulus may prompt approach behavior. Therefore, emotional stimuli might either elicit differential behavioral responses when a conflict arises or simply mark salience. The present study used functional magnetic resonance imaging to investigate valence-specific emotion effects on attentional control in conflict processing, employing an adapted flanker task with neutral, negative, and positive stimuli. Responses were slower for incongruent than for congruent trials. Neural activity in the dorsal anterior cingulate cortex was associated with conflict processing regardless of the emotional quality of the stimulus. These findings confirm that both negative and positive emotional stimuli mark salience in both low-conflict (congruent) and high-conflict (incongruent) scenarios. Regardless of the conflict level, emotional stimuli recruited greater attentional resources in goal-directed behavior.
7
Liu P, Sutherland M, Pollick FE. Incongruence effects in cross-modal emotional processing in autistic traits: An fMRI study. Neuropsychologia 2021; 161:107997. PMID: 34425144; DOI: 10.1016/j.neuropsychologia.2021.107997.
Abstract
In everyday life, emotional information is often conveyed by both the face and the voice. Consequently, information presented by one source can alter the way information from the other source is perceived, leading to emotional incongruence. Here, we used functional magnetic resonance imaging (fMRI) to examine the neural correlates of two different types of emotional incongruence in audiovisual processing: incongruence of emotion valence and incongruence of emotion presence. Participants formed two groups, one with low Autism Quotient scores (LAQ) and one with high scores (HAQ). Each participant experienced emotional (happy, fearful) or neutral faces or voices while concurrently being exposed to emotional (happy, fearful) or neutral voices or faces, and was instructed to attend to either the visual or the auditory track. The incongruence effect of emotion valence was characterized by activation in a wide range of brain regions in both hemispheres, involving the inferior frontal gyrus, cuneus, superior temporal gyrus, and middle frontal gyrus. The incongruence effect of emotion presence was characterized by activation in a set of temporal and occipital regions in both hemispheres, including the middle occipital gyrus, middle temporal gyrus, and inferior temporal gyrus. In addition, the present study identified greater recruitment of the right inferior parietal lobule when perceiving audiovisual emotional expressions in HAQ individuals compared with LAQ individuals. Depending on whether the face or the voice was to be attended, different patterns of emotional incongruence emerged between the two groups: the HAQ group tended to show more incidental processing of visual information, whilst the LAQ group tended to show more incidental processing of auditory information during crossmodal emotional incongruence decoding. These differences might be attributed to different attentional demands and different processing strategies in the two groups.
Affiliation(s)
- Peipei Liu
- Department of Psychology, Sun Yat-Sen University, Guangzhou, 510006, China; School of Psychology, University of Glasgow, Glasgow, G12 8QB, UK; School of Education, University of Glasgow, Glasgow, G3 6NH, UK
- Frank E Pollick
- School of Psychology, University of Glasgow, Glasgow, G12 8QB, UK.
8
Boys with autism spectrum disorder have distinct cortical folding patterns underpinning impaired self-regulation: a surface-based morphometry study. Brain Imaging Behav 2021; 14:2464-2476. PMID: 31512098; DOI: 10.1007/s11682-019-00199-0.
Abstract
Although impaired self-regulation (dysregulation) in autism spectrum disorder (ASD) has garnered increasing awareness, the neural mechanisms of dysregulation in ASD are far from conclusive. To complement our previous voxel-based morphometry findings, we estimated cortical thickness, surface area, and local gyrification index using surface-based morphometry of structural MRI images in 85 boys with ASD and 65 typically developing control (TDC) boys, aged 7-17 years. Levels of dysregulation were measured as the sum of T-scores on the Attention, Aggression, and Anxiety/Depression subscales of the Child Behavior Checklist. We found that both the ASD and TDC groups showed similar relationships between dysregulation and cortical folding patterns in the left superior and inferior temporal gyri and the left premotor cortex. Significant diagnosis-by-dysregulation interactions in cortical folding patterns were identified over the right middle frontal and right lateral orbitofrontal regions. The statistically significant increase in local gyrification index in ASD relative to TDC in several brain regions disappeared when the level of dysregulation was considered. These findings of shared and distinct neural correlates underpinning dysregulation in ASD and TDC may facilitate the development of targeted interventions in the future. The present work also demonstrates that inter-subject variation in self-regulation may explain some of the ASD-associated brain morphometric differences, suggesting that dysregulation is one yardstick for dissecting the heterogeneity of ASD.
9
Zacharia AA, Ahuja N, Kaur S, Sharma R. State-dependent perception and perceptual reversals during intermittent binocular rivalry: An electrical neuroimaging study. Neurosci Lett 2020; 736:135252. PMID: 32687954; DOI: 10.1016/j.neulet.2020.135252.
Abstract
The object-context relationship and valence are two important stimulus attributes that affect visual perception. Although previous studies have shown how these two factors affect visual perception individually, the interplay between valence and congruent or incongruent object-context associations during visual perception has scarcely been explored. Further, what is perceived is affected by the intrinsic state of the brain at the moment the stimulus appears, which can be assessed with EEG microstates. Hence, the current study was designed to explore how pre-stimulus EEG microstates influence the perception of emotionally congruent and incongruent stimuli, as well as perceptual reversals and stability, during intermittent binocular rivalry. Results revealed associations of specific pre-stimulus microstates with the perception of neutral and negative congruent stimuli, as well as with perceptual reversals and stability. Electrical neuroimaging of these microstates showed higher activation in the precuneus and middle occipital gyrus preceding the perception of neutral congruent stimuli, and in the lingual gyrus preceding the perception of negative congruent stimuli. Increased source activity in the superior temporal gyrus and superior frontal gyrus preceded stability, and lower activation in the parahippocampal gyrus preceded reversals. Together, these results suggest that pre-stimulus activation of areas involved in visual priming, retrieval, and semantics leads to congruent perception. Pre-stimulus DMN suppression was required for perceptual reversals, whereas stability was accompanied by pre-stimulus activation of areas related to the specific nature of the stimulus. We therefore propose that, in addition to stimulus attributes, pre-stimulus intrinsic brain activity could be an important determinant of perceptual performance.
Affiliation(s)
- Angel Anna Zacharia
- Stress and Cognitive Electroimaging Lab, Department of Physiology, All India Institute of Medical Sciences, New Delhi, 110029, India
- Navdeep Ahuja
- Stress and Cognitive Electroimaging Lab, Department of Physiology, All India Institute of Medical Sciences, New Delhi, 110029, India
- Simran Kaur
- Stress and Cognitive Electroimaging Lab, Department of Physiology, All India Institute of Medical Sciences, New Delhi, 110029, India
- Ratna Sharma
- Stress and Cognitive Electroimaging Lab, Department of Physiology, All India Institute of Medical Sciences, New Delhi, 110029, India.
10
Gao C, Weber CE, Wedell DH, Shinkareva SV. An fMRI Study of Affective Congruence across Visual and Auditory Modalities. J Cogn Neurosci 2020; 32:1251-1262. DOI: 10.1162/jocn_a_01553.
Abstract
Evaluating multisensory emotional content is a part of normal day-to-day interactions. We used fMRI to examine brain areas sensitive to congruence of audiovisual valence and their overlap with areas sensitive to valence. Twenty-one participants watched audiovisual clips with either congruent or incongruent valence across visual and auditory modalities. We showed that affective congruence versus incongruence across visual and auditory modalities is identifiable on a trial-by-trial basis across participants. Representations of affective congruence were widely distributed with some overlap with the areas sensitive to valence. Regions of overlap included bilateral superior temporal cortex and right pregenual anterior cingulate. The overlap between the regions identified here and in the emotion congruence literature lends support to the idea that valence may be a key determinant of affective congruence processing across a variety of discrete emotions.
11
What you say versus how you say it: Comparing sentence comprehension and emotional prosody processing using fMRI. Neuroimage 2019; 209:116509. PMID: 31899288; DOI: 10.1016/j.neuroimage.2019.116509.
Abstract
While language processing is often described as lateralized to the left hemisphere (LH), the processing of emotion carried by vocal intonation is typically attributed to the right hemisphere (RH) and more specifically, to areas mirroring the LH language areas. However, the evidence base for this hypothesis is inconsistent, with some studies supporting right-lateralization but others favoring bilateral involvement in emotional prosody processing. Here we compared fMRI activations for an emotional prosody task with those for a sentence comprehension task in 20 neurologically healthy adults, quantifying lateralization using a lateralization index. We observed right-lateralized frontotemporal activations for emotional prosody that roughly mirrored the left-lateralized activations for sentence comprehension. In addition, emotional prosody also evoked bilateral activation in pars orbitalis (BA47), amygdala, and anterior insula. These findings are consistent with the idea that analysis of the auditory speech signal is split between the hemispheres, possibly according to their preferred temporal resolution, with the left preferentially encoding phonetic and the right encoding prosodic information. Once processed, emotional prosody information is fed to domain-general emotion processing areas and integrated with semantic information, resulting in additional bilateral activations.
12
Liu X, Chen Y, Ge J, Mao L. Funny or Angry? Neural Correlates of Individual Differences in Aggressive Humor Processing. Front Psychol 2019; 10:1849. PMID: 31496969; PMCID: PMC6712686; DOI: 10.3389/fpsyg.2019.01849.
Abstract
Humor has been a hot topic in social cognition in recent years. The present study focused on the social attributes of humor and presented participants with stories of four types, divided according to the humor-style model, to explore the neural mechanisms underlying point-to-self aggressive humor and how individual differences modulate them. Ratings of anger and funniness suggested that aggressive humor aids social communication by reducing the degree of anger. The neural activity showed that the bilateral temporal and frontal lobes played a synergistic role in processing point-to-self aggressive humor, while point-to-self non-aggressive humor predominantly engaged the left hemisphere. Region of interest (ROI) analyses showed that individual differences in self-control and self-construal may influence the neural processing of point-to-self aggressive humor by modulating the activation levels and patterns of the right inferior orbital frontal gyrus, the right superior temporal lobe, and the right superior frontal lobe.
Affiliation(s)
- Xiaoping Liu
- School of Psychological and Cognitive Sciences, Peking University, Beijing, China; Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China
- Yueti Chen
- School of Psychological and Cognitive Sciences, Peking University, Beijing, China; Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China
- Jianqiao Ge
- Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing, China
- Lihua Mao
- School of Psychological and Cognitive Sciences, Peking University, Beijing, China; Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China.
13
Aryani A, Hsu CT, Jacobs AM. Affective iconic words benefit from additional sound-meaning integration in the left amygdala. Hum Brain Mapp 2019; 40:5289-5300. PMID: 31444898; PMCID: PMC6864889; DOI: 10.1002/hbm.24772.
Abstract
Recent studies have shown that a similarity between sound and meaning of a word (i.e., iconicity) can help more readily access the meaning of that word, but the neural mechanisms underlying this beneficial role of iconicity in semantic processing remain largely unknown. In an fMRI study, we focused on the affective domain and examined whether affective iconic words (e.g., high arousal in both sound and meaning) activate additional brain regions that integrate emotional information from different domains (i.e., sound and meaning). In line with our hypothesis, affective iconic words, compared to their non‐iconic counterparts, elicited additional BOLD responses in the left amygdala known for its role in multimodal representation of emotions. Functional connectivity analyses revealed that the observed amygdalar activity was modulated by an interaction of iconic condition and activations in two hubs representative for processing sound (left superior temporal gyrus) and meaning (left inferior frontal gyrus) of words. These results provide a neural explanation for the facilitative role of iconicity in language processing and indicate that language users are sensitive to the interaction between sound and meaning aspect of words, suggesting the existence of iconicity as a general property of human language.
Affiliation(s)
- Arash Aryani
- Department of Experimental and Neurocognitive Psychology, Freie Universität Berlin, Germany
- Chun-Ting Hsu
- Kokoro Research Center, Kyoto University, Kyoto, Japan
- Arthur M Jacobs
- Department of Experimental and Neurocognitive Psychology, Freie Universität Berlin, Germany; Centre for Cognitive Neuroscience Berlin (CCNB), Berlin, Germany
14
Emotional prosody Stroop effect in Hindi: An event related potential study. Prog Brain Res 2019. PMID: 31196434; DOI: 10.1016/bs.pbr.2019.04.003.
Abstract
Prosody processing is an important aspect of language comprehension. Previous research on emotional word-prosody conflict has shown that participants perform worse when emotional prosody and word meaning are incongruent, and event-related potential studies have shown a congruency effect in the N400 component. There has been no study of emotional processing in Hindi in the context of conflict between emotional word meaning and prosody. We used happy and angry words spoken with happy and angry prosody; participants had to identify whether the word had a happy or angry meaning. The results showed a congruency effect, with worse performance on incongruent trials, indicating an emotional Stroop effect in Hindi. The ERP results showed that prosody information is detected very early, as seen in the N1 component; in addition, there was a congruency effect in the N400. The results show that prosody is processed very early and that an emotional meaning-prosody congruency effect is obtained in Hindi. Further studies are needed to investigate similarities and differences in the cognitive control associated with language processing.
15
Brain Activity Related to Sound Symbolism: Cross-modal Effect of an Aurally Presented Phoneme on Judgment of Size. Sci Rep 2019; 9:7017. [PMID: 31065027] [PMCID: PMC6505024] [DOI: 10.1038/s41598-019-43457-3]
Abstract
Sound symbolism is the idea that a sound makes a certain impression (e.g., phoneme “p” is associated with an impression of smallness) and could be the psychological basis of the word–meaning association. In this study, we investigated the neural basis of sound symbolism. Subjects were required to compare the visual sizes of standard and target stimuli while listening to syllables assumed to create either a larger or smaller impression. Stimulus–response congruence is defined as the agreement between the target size and the syllable’s impression. Behavioral data showed that the subjects displayed a longer reaction time under the incongruent condition than under the congruent condition, indicating that they tended to associate the object size with certain syllables. We used functional magnetic resonance imaging to evaluate the cerebral activity during the task, and found that both semantic- and phonetic-process-related areas of the brain (left middle temporal gyrus and right superior temporal gyrus, respectively) were activated under the incongruent condition. These results suggest that these regions are associated with the incongruence of sound symbolism.
16
Krestar ML, McLennan CT. Responses to Semantically Neutral Words in Varying Emotional Intonations. J Speech Lang Hear Res 2019; 62:733-744. [PMID: 30950728] [DOI: 10.1044/2018_jslhr-h-17-0428]
Abstract
Purpose: Recent research on perception of emotionally charged material has found both an "emotionality effect," in which participants respond differently to emotionally charged stimuli relative to neutral stimuli in some cognitive-linguistic tasks, and a "negativity bias," in which participants respond differently to negatively charged stimuli relative to neutral and positively charged stimuli. The current study investigated young adult listeners' bias when responding to neutral-meaning words in two tasks that varied attention to emotional intonation.
Method: Half the participants completed a word identification task in which they were instructed to type a word they had heard presented binaurally through Sony stereo MDR-ZX100 headphones. The other half completed an intonation identification task in which they were instructed to use a SuperLab RB-740 button box to identify the emotional prosody of the same words over headphones. For both tasks, all auditory stimuli were semantically neutral words spoken in happy, sad, and neutral emotional intonations. The researchers measured percent correct and reaction time (RT) for each word in both tasks.
Results: In the word identification task, when identifying semantically neutral words spoken in happy, sad, and neutral intonations, listeners' RTs to words in a sad intonation were longer than RTs to words in a happy intonation. In the intonation identification task, when identifying the emotional intonation of the same words spoken in the same emotional tones of voice, listeners' RTs to words in a sad intonation were significantly faster than those in a neutral intonation.
Conclusions: Results demonstrate a potential attentional negativity bias for neutral words varying in emotional intonation, supporting an attention-based theoretical account. In the intonation identification task, an advantage emerged for words in a negative (sad) intonation relative to words in a neutral intonation. Current models of emotional speech should therefore acknowledge the amount of attention to emotional content (i.e., prosody) necessary to complete a cognitive task, as it has the potential to bias processing.
Affiliation(s)
- Maura L Krestar
- Department of Clinical Health Sciences, Texas A&M University-Kingsville
- Conor T McLennan
- Language Research Laboratory, Department of Psychology, Cleveland State University, OH
17
Shekhar S, Maria A, Kotilahti K, Huotilainen M, Heiskala J, Tuulari JJ, Hirvi P, Karlsson L, Karlsson H, Nissilä I. Hemodynamic responses to emotional speech in two-month-old infants imaged using diffuse optical tomography. Sci Rep 2019; 9:4745. [PMID: 30894569] [PMCID: PMC6426868] [DOI: 10.1038/s41598-019-39993-7]
Abstract
Emotional speech is one of the principal forms of social communication in humans. In this study, we investigated neural processing of emotional speech (happy, angry, sad, and neutral) in the left hemisphere of 21 two-month-old infants using diffuse optical tomography. Reconstructed total hemoglobin (HbT) images were analysed using adaptive voxel-based clustering and region-of-interest (ROI) analysis. We found a distributed happy > neutral response within the temporo-parietal cortex, peaking in the anterior temporal cortex; a negative HbT response to emotional speech (the average of the emotional speech conditions < baseline) in the temporo-parietal cortex; neutral > angry in the anterior superior temporal sulcus (STS); happy > angry in the superior temporal gyrus and posterior STS; angry < baseline in the insula, STS, and superior temporal gyrus; and happy < baseline in the anterior insula. These results suggest that the left STS is more sensitive to happy speech than to angry speech, indicating that it might play an important role in processing positive emotions in two-month-old infants. Furthermore, happy speech (relative to neutral) seems to elicit more activation in the temporo-parietal cortex, suggesting enhanced sensitivity of the temporo-parietal cortex to positive emotional stimuli at this stage of infant development.
Affiliation(s)
- Shashank Shekhar
- University of Turku, Institute of Clinical Medicine, Turku Brain and Mind Center, FinnBrain Birth Cohort Study, Turku, Finland; University of Mississippi Medical Center, Department of Neurology, Jackson, MS, USA
- Ambika Maria
- University of Turku, Institute of Clinical Medicine, Turku Brain and Mind Center, FinnBrain Birth Cohort Study, Turku, Finland
- Kalle Kotilahti
- Department of Neuroscience and Biomedical Engineering, Aalto University, Helsinki, Finland
- Minna Huotilainen
- University of Turku, Institute of Clinical Medicine, Turku Brain and Mind Center, FinnBrain Birth Cohort Study, Turku, Finland; CICERO Learning, Faculty of Educational Sciences, University of Helsinki, Helsinki, Finland
- Juha Heiskala
- Department of Clinical Neurophysiology, Helsinki University Central Hospital, Turku, Finland
- Jetro J Tuulari
- University of Turku, Institute of Clinical Medicine, Turku Brain and Mind Center, FinnBrain Birth Cohort Study, Turku, Finland
- Pauliina Hirvi
- Department of Neuroscience and Biomedical Engineering, Aalto University, Helsinki, Finland
- Linnea Karlsson
- University of Turku, Institute of Clinical Medicine, Turku Brain and Mind Center, FinnBrain Birth Cohort Study, Turku, Finland; University of Turku and Turku University Hospital, Department of Child Psychiatry, Turku, Finland
- Hasse Karlsson
- University of Turku, Institute of Clinical Medicine, Turku Brain and Mind Center, FinnBrain Birth Cohort Study, Turku, Finland; University of Turku and Turku University Hospital, Department of Psychiatry, Turku, Finland
- Ilkka Nissilä
- Department of Neuroscience and Biomedical Engineering, Aalto University, Helsinki, Finland
18
Shamay-Tsoory SG, Saporta N, Marton-Alper IZ, Gvirts HZ. Herding Brains: A Core Neural Mechanism for Social Alignment. Trends Cogn Sci 2019; 23:174-186. [DOI: 10.1016/j.tics.2019.01.002]
19
Davies-Thompson J, Elli GV, Rezk M, Benetti S, van Ackeren M, Collignon O. Hierarchical Brain Network for Face and Voice Integration of Emotion Expression. Cereb Cortex 2018; 29:3590-3605. [DOI: 10.1093/cercor/bhy240]
Abstract
The brain has separate specialized computational units for processing faces and voices, located in occipital and temporal cortices. However, humans seamlessly integrate signals from the faces and voices of others for optimal social interaction. How are emotional expressions integrated in the brain when delivered by different sensory modalities (faces and voices)? In this study, we characterized the brain's response to faces, voices, and combined face–voice information (congruent, incongruent), which varied in expression (neutral, fearful). Using a whole-brain approach, we found that only the right posterior superior temporal sulcus (rpSTS) responded more to bimodal stimuli than to face or voice alone, but only when the stimuli contained emotional expression. Face- and voice-selective regions of interest, extracted from independent functional localizers, similarly revealed multisensory integration in the face-selective rpSTS only; further, this was the only face-selective region that also responded significantly to voices. Dynamic causal modeling revealed that the rpSTS receives unidirectional information from the face-selective fusiform face area and the voice-selective temporal voice area, with emotional expression affecting the connection strength. Our study supports a hierarchical model of face and voice integration, with convergence in the rpSTS, in which such integration depends on the (emotional) salience of the stimuli.
Affiliation(s)
- Jodie Davies-Thompson
- Crossmodal Perception and Plasticity Laboratory, Center of Mind/Brain Sciences, University of Trento, Mattarello 38123 - TN, via delle Regole, Italy
- Face Research, Swansea (FaReS), Department of Psychology, College of Human and Health Sciences, Swansea University, Singleton Park, Swansea, UK
- Giulia V Elli
- Department of Psychological & Brain Sciences, Johns Hopkins University, Baltimore, MD, USA
- Mohamed Rezk
- Crossmodal Perception and Plasticity Laboratory, Center of Mind/Brain Sciences, University of Trento, Mattarello 38123 - TN, via delle Regole, Italy
- Institute of Research in Psychology (IPSY), Institute of Neuroscience (IoNS), 10 Place du Cardinal Mercier, 1348 Louvain-La-Neuve, University of Louvain (UcL), Belgium
- Stefania Benetti
- Crossmodal Perception and Plasticity Laboratory, Center of Mind/Brain Sciences, University of Trento, Mattarello 38123 - TN, via delle Regole, Italy
- Markus van Ackeren
- Crossmodal Perception and Plasticity Laboratory, Center of Mind/Brain Sciences, University of Trento, Mattarello 38123 - TN, via delle Regole, Italy
- Olivier Collignon
- Crossmodal Perception and Plasticity Laboratory, Center of Mind/Brain Sciences, University of Trento, Mattarello 38123 - TN, via delle Regole, Italy
- Institute of Research in Psychology (IPSY), Institute of Neuroscience (IoNS), 10 Place du Cardinal Mercier, 1348 Louvain-La-Neuve, University of Louvain (UcL), Belgium
20
Chen T, Becker B, Camilleri J, Wang L, Yu S, Eickhoff SB, Feng C. A domain-general brain network underlying emotional and cognitive interference processing: evidence from coordinate-based and functional connectivity meta-analyses. Brain Struct Funct 2018; 223:3813-3840. [PMID: 30083997] [DOI: 10.1007/s00429-018-1727-9]
Abstract
The inability to control or inhibit emotional distractors characterizes a range of psychiatric disorders. Despite the use of a variety of task paradigms to determine the mechanisms underlying the control of emotional interference, a precise characterization of the brain regions and networks that support emotional interference processing remains elusive. Here, we performed coordinate-based and functional connectivity meta-analyses to determine the brain networks underlying emotional interference. Paradigms addressing interference processing in the cognitive or emotional domain were included in the meta-analyses, particularly the Stroop, Flanker, and Simon tasks. Our results revealed a consistent involvement of the bilateral dorsal anterior cingulate cortex, anterior insula, left inferior frontal gyrus, and superior parietal lobule during emotional interference. Follow-up conjunction analyses identified correspondence in these regions between emotional and cognitive interference processing. Finally, the patterns of functional connectivity of these regions were examined using resting-state functional connectivity and meta-analytic connectivity modeling. These regions were strongly connected as a distributed system, primarily mapping onto fronto-parietal control, ventral attention, and dorsal attention networks. Together, the present findings indicate that a domain-general neural system is engaged across multiple types of interference processing and that regulating emotional and cognitive interference depends on interactions between large-scale distributed brain networks.
Affiliation(s)
- Taolin Chen
- Huaxi MR Research Center (HMRRC), Department of Radiology, West China Hospital of Sichuan University, Chengdu, China
- Benjamin Becker
- Clinical Hospital of the Chengdu Brain Science Institute, MOE Key Laboratory for Neuroinformation, University of Electronic Science and Technology of China, Chengdu, China
- Julia Camilleri
- Institute of Systems Neuroscience, Medical Faculty, Heinrich Heine University Düsseldorf, Düsseldorf, Germany; Institute of Neuroscience and Medicine, Brain & Behaviour (INM-7), Research Centre Jülich, Jülich, Germany
- Li Wang
- Collaborative Innovation Center of Assessment Toward Basic Education Quality, Beijing Normal University, Beijing, China
- Shuqi Yu
- State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China
- Simon B Eickhoff
- Institute of Systems Neuroscience, Medical Faculty, Heinrich Heine University Düsseldorf, Düsseldorf, Germany; Institute of Neuroscience and Medicine, Brain & Behaviour (INM-7), Research Centre Jülich, Jülich, Germany
- Chunliang Feng
- College of Information Science and Technology, Beijing Normal University, Beijing, China; State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China
21
Meconi F, Doro M, Schiano Lomoriello A, Mastrella G, Sessa P. Neural measures of the role of affective prosody in empathy for pain. Sci Rep 2018; 8:291. [PMID: 29321532] [PMCID: PMC5762917] [DOI: 10.1038/s41598-017-18552-y]
Abstract
Emotional communication often requires the integration of affective prosodic and semantic components of speech with the speaker's facial expression. Affective prosody may have a special role by virtue of its dual nature: pre-verbal on one side and accompanying semantic content on the other. This consideration led us to hypothesize that it could act transversely, encompassing a wide temporal window involving the processing of facial expressions and of the semantic content expressed by the speaker. This would allow powerful communication in contexts of potential urgency, such as witnessing the speaker's physical pain. Seventeen participants were shown faces preceded by verbal reports of pain. Facial expressions, intelligibility of the semantic content of the report (i.e., participants' mother tongue vs. a fictional language), and the affective prosody of the report (neutral vs. painful) were manipulated. We monitored event-related potentials (ERPs) time-locked to the onset of the faces as a function of semantic-content intelligibility and affective prosody of the verbal reports. We found that affective prosody may interact with facial expressions and semantic content in two successive temporal windows, supporting its role as a transverse communication cue.
Affiliation(s)
- Federica Meconi
- Department of Developmental and Social Psychology, University of Padova, Padova, Italy
- Mattia Doro
- Department of Developmental and Social Psychology, University of Padova, Padova, Italy
- Giulia Mastrella
- Department of Developmental and Social Psychology, University of Padova, Padova, Italy
- Paola Sessa
- Department of Developmental and Social Psychology, University of Padova, Padova, Italy
22
Minho Affective Sentences (MAS): Probing the roles of sex, mood, and empathy in affective ratings of verbal stimuli. Behav Res Methods 2017; 49:698-716. [PMID: 27004484] [DOI: 10.3758/s13428-016-0726-0]
Abstract
During social communication, words and sentences play a critical role in the expression of emotional meaning. The Minho Affective Sentences (MAS) were developed to respond to the lack of a standardized sentence battery with normative affective ratings: 192 neutral, positive, and negative declarative sentences were strictly controlled for psycholinguistic variables such as numbers of words and letters and per-million word frequency. The sentences were designed to represent examples of each of the five basic emotions (anger, sadness, disgust, fear, and happiness) and of neutral situations. These sentences were presented to 536 participants who rated the stimuli using both dimensional and categorical measures of emotions. Sex differences were also explored. Additionally, we probed how personality, empathy, and mood from a subset of 40 participants modulated the affective ratings. Our results confirmed that the MAS affective norms are valid measures to guide the selection of stimuli for experimental studies of emotion. The combination of dimensional and categorical ratings provided a more fine-grained characterization of the affective properties of the sentences. Moreover, the affective ratings of positive and negative sentences were not only modulated by participants' sex, but also by individual differences in empathy and mood state. Together, our results indicate that, in their quest to reveal the neurofunctional underpinnings of verbal emotional processing, researchers should consider not only the role of sex, but also of interindividual differences in empathy and mood states, in responses to the emotional meaning of sentences.
23
Kar BR, Srinivasan N, Nehabala Y, Nigam R. Proactive and reactive control depends on emotional valence: a Stroop study with emotional expressions and words. Cogn Emot 2017; 32:325-340. [PMID: 28393610] [DOI: 10.1080/02699931.2017.1304897]
Abstract
We examined proactive and reactive control effects in the context of task-relevant happy, sad, and angry facial expressions on a face-word Stroop task. Participants identified the emotion expressed by a face that contained a congruent or incongruent emotional word (happy/sad/angry). Proactive control effects were measured as the reduction in Stroop interference (the difference between incongruent and congruent trials) as a function of previous-trial emotion and previous-trial congruence; reactive control effects were measured as the reduction in Stroop interference as a function of current-trial emotion and previous-trial congruence. Negative emotions on the previous trial exerted a greater influence on proactive control than the positive emotion: sad faces on the previous trial resulted in a greater reduction in Stroop interference for happy faces on the current trial. However, current-trial angry faces showed stronger adaptation effects than happy faces. Thus, both proactive and reactive control mechanisms depend on the emotional valence of task-relevant stimuli.
Affiliation(s)
- Bhoomika Rastogi Kar
- Centre of Behavioural and Cognitive Sciences, University of Allahabad, Allahabad, India
- Narayanan Srinivasan
- Centre of Behavioural and Cognitive Sciences, University of Allahabad, Allahabad, India
- Yagyima Nehabala
- Centre of Behavioural and Cognitive Sciences, University of Allahabad, Allahabad, India
- Richa Nigam
- Centre of Behavioural and Cognitive Sciences, University of Allahabad, Allahabad, India
24
Sowman PF, Ryan M, Johnson BW, Savage G, Crain S, Harrison E, Martin E, Burianová H. Grey matter volume differences in the left caudate nucleus of people who stutter. Brain Lang 2017; 164:9-15. [PMID: 27693846] [DOI: 10.1016/j.bandl.2016.08.009]
Abstract
The cause of stuttering has many theoretical explanations. A number of research groups have suggested changes in the volume and/or function of the striatum as a causal agent. Two recent studies in children and one in adults who stutter (AWS) report differences in striatal volume compared with that seen in controls; however, the laterality and nature of this anatomical volume difference are not consistent across studies. The current study investigated whether a reduction in striatal grey matter volume, comparable to that seen in children who stutter (CWS), would be found in AWS. Such a finding would support claims that an anatomical striatal anomaly plays a causal role in stuttering. We used voxel-based morphometry to examine the structure of the striatum in a group of AWS and compared it to that in a group of matched adult control subjects. Results showed a statistically significant group difference for the left caudate nucleus, with a smaller mean volume in the group of AWS. The caudate nucleus, one of three main structures within the striatum, is thought to be critical for the planning and modulation of movement sequencing. The difference in striatal volume found here aligns with theoretical accounts of stuttering which suggest it is a motor control disorder arising from deficient articulatory movement selection and sequencing. Whilst the current study provides further evidence of a striatal volume difference in stuttering at the group level compared to controls, the significant overlap between AWS and controls suggests this difference is unlikely to be diagnostic of stuttering.
Affiliation(s)
- Paul F Sowman
- Department of Cognitive Science, Macquarie University, New South Wales 2109, Australia; Australian Research Council Centre of Excellence in Cognition and Its Disorders, Australia; Perception and Action Research Centre, Faculty of Human Sciences, Macquarie University, New South Wales 2109, Australia
- Margaret Ryan
- Department of Cognitive Science, Macquarie University, New South Wales 2109, Australia; Australian Research Council Centre of Excellence in Cognition and Its Disorders, Australia
- Blake W Johnson
- Department of Cognitive Science, Macquarie University, New South Wales 2109, Australia; Australian Research Council Centre of Excellence in Cognition and Its Disorders, Australia
- Greg Savage
- Australian Research Council Centre of Excellence in Cognition and Its Disorders, Australia; Department of Psychology, Macquarie University, New South Wales 2109, Australia
- Stephen Crain
- Australian Research Council Centre of Excellence in Cognition and Its Disorders, Australia; Department of Linguistics, Macquarie University, New South Wales 2109, Australia
- Elisabeth Harrison
- Department of Linguistics, Macquarie University, New South Wales 2109, Australia
- Erin Martin
- Department of Cognitive Science, Macquarie University, New South Wales 2109, Australia
- Hana Burianová
- Centre for Advanced Imaging, The University of Queensland, Queensland 4072, Australia
25
Sabharwal A, Szekely A, Kotov R, Mukherjee P, Leung HC, Barch DM, Mohanty A. Transdiagnostic neural markers of emotion-cognition interaction in psychotic disorders. J Abnorm Psychol 2016; 125:907-922. [PMID: 27618279] [PMCID: PMC5576592] [DOI: 10.1037/abn0000196]
Abstract
Deficits in working memory (WM) and emotion processing are prominent impairments in psychotic disorders, and have been linked to reduced quality of life and real-world functioning. Translation of knowledge regarding the neural circuitry implementing these deficits into improved diagnosis and targeted treatments has been slow, possibly because of categorical definitions of disorders. Using the dimensional Research Domain Criteria (RDoC) framework, we investigated the clinical and practical utility of transdiagnostic behavioral and neural measures of emotion-related WM disruption across psychotic disorders. Behavioral and functional MRI data were recorded while 53 participants with psychotic disorders and 29 participants with no history of psychosis performed a modified n-back task with fear and neutral distractors. Hierarchical regression analyses showed that psychotic symptoms entered after diagnosis accounted for unique variance in fear versus neutral accuracy and activation in the ventrolateral, dorsolateral, and dorsomedial prefrontal cortex, but diagnostic group entered after psychotic symptoms did not. These results remained even after controlling for negative symptoms, disorganized symptoms, and dysphoria. Finally, worse accuracy and greater prefrontal activity were associated with poorer social functioning and unemployment across diagnostic groups. Present results support the transdiagnostic nature of behavioral and neuroimaging measures of emotion-related WM disruption as they relate to psychotic symptoms, irrespective of diagnosis. They also provide support for the practical utility of these markers in explaining real-world functioning. Overall, these results elucidate key aspects of the RDoC construct of WM maintenance by clarifying its transdiagnostic importance and clinical utility in psychotic disorders.
Affiliation(s)
- Akos Szekely
- Department of Psychology, Stony Brook University
- Roman Kotov
- Department of Psychiatry, Stony Brook University
- Deanna M. Barch
- Departments of Psychology, Psychiatry, and Radiology, Washington University in St. Louis
26
Matsui T, Nakamura T, Utsumi A, Sasaki AT, Koike T, Yoshida Y, Harada T, Tanabe HC, Sadato N. The role of prosody and context in sarcasm comprehension: Behavioral and fMRI evidence. Neuropsychologia 2016; 87:74-84. [DOI: 10.1016/j.neuropsychologia.2016.04.031]
27
Mukherjee P, Sabharwal A, Kotov R, Szekely A, Parsey R, Barch DM, Mohanty A. Disconnection Between Amygdala and Medial Prefrontal Cortex in Psychotic Disorders. Schizophr Bull 2016; 42:1056-67. [PMID: 26908926] [PMCID: PMC4903065] [DOI: 10.1093/schbul/sbw012]
Abstract
Distracting emotional information impairs attention more in schizophrenia (SCZ) than in never-psychotic individuals. However, it is unclear whether this impairment and its neural circuitry are indicative of psychosis generally or of SCZ specifically, and whether they are even more specific to certain SCZ symptoms (eg, deficit syndrome). It is also unclear if this abnormality contributes to impaired behavioral performance and real-world functioning. Functional imaging data were recorded while individuals with SCZ, bipolar disorder with psychosis (BDP), and no history of psychotic disorders (CON) attended to the identity of faces while ignoring their emotional expressions. We examined group differences in functional connectivity between the amygdala, involved in emotional evaluation, and sub-regions of the medial prefrontal cortex (MPFC), involved in emotion regulation and cognitive control. Additionally, we examined the correlation of this connectivity with deficit syndrome and real-world functioning. Behaviorally, SCZ showed the worst accuracy when matching the identity of emotional vs neutral faces. Neurally, SCZ showed lower amygdala-MPFC connectivity than BDP and CON. BDP did not differ from CON, neurally or behaviorally. In patients, reduced amygdala-MPFC connectivity during emotional distractors was related to worse emotional vs neutral accuracy, greater deficit syndrome severity, and unemployment. Thus, reduced amygdala-MPFC functional connectivity during emotional distractors reflects a deficit that is specific to SCZ. This reduction in connectivity is associated with worse clinical and real-world functioning. Overall, these findings provide support for the specificity and clinical utility of amygdala-MPFC functional connectivity as a potential neural marker of SCZ.
Affiliation(s)
- Prerona Mukherjee
- University of California Davis MIND Institute, UC Davis Medical Center, Sacramento, CA
- Amri Sabharwal
- Department of Psychology, Stony Brook University, Stony Brook, NY
- Roman Kotov
- Department of Psychology, Stony Brook University, Stony Brook, NY
- Akos Szekely
- Department of Psychology, Stony Brook University, Stony Brook, NY
- Ramin Parsey
- Department of Psychology, Stony Brook University, Stony Brook, NY
- Deanna M. Barch
- Departments of Psychology, Psychiatry, and Radiology, Washington University in St. Louis, St. Louis, MO
- Aprajita Mohanty
- Department of Psychology, Stony Brook University, Stony Brook, NY
28
Jeong E, Ryu H. Melodic Contour Identification Reflects the Cognitive Threshold of Aging. Front Aging Neurosci 2016; 8:134. [PMID: 27378907] [PMCID: PMC4904015] [DOI: 10.3389/fnagi.2016.00134]
Abstract
Cognitive decline is a natural phenomenon of aging. Although there exists a consensus that sensitivity to acoustic features of music is associated with such decline, no solid evidence has yet shown that structural elements and contexts of music explain this loss of cognitive performance. This study examined the extent and the type of cognitive decline that is related to the contour identification task (CIT) using tones with different pitches (i.e., melodic contours). Both younger and older adult groups participated in the CIT given in three listening conditions (i.e., focused, selective, and alternating). Behavioral data (accuracy and response times) and hemodynamic reactions were measured using functional near-infrared spectroscopy (fNIRS). Our findings showed cognitive declines in the older adult group but with a subtle difference from the younger adult group. The accuracy of the melodic CITs given in the target-like distraction task (CIT2) was significantly lower than that in the environmental noise (CIT1) condition in the older adult group, indicating that CIT2 may be a benchmark test for age-specific cognitive decline. The fNIRS findings also agreed with this interpretation, revealing significant increases in oxygenated hemoglobin (oxyHb) concentration in the younger (p < 0.05 for Δpre–on task; p < 0.01 for Δon–post task) rather than the older adult group (n.s. for Δpre–on task; n.s. for Δon–post task). We further concluded that the oxyHb difference was present in the brain regions near the right dorsolateral prefrontal cortex. Taken together, these findings suggest that CIT2 (i.e., the melodic contour task in the target-like distraction) is an optimized task that could indicate the degree and type of age-related cognitive decline.
Affiliation(s)
- Eunju Jeong
- Department of Arts and Technology, Hanyang University, Seoul, South Korea
- Hokyoung Ryu
- Department of Arts and Technology, Hanyang University, Seoul, South Korea
29
Rohr CS, Villringer A, Solms‐Baruth C, van der Meer E, Margulies DS, Okon‐Singer H. The neural networks of subjectively evaluated emotional conflicts. Hum Brain Mapp 2016; 37:2234-46. [PMID: 26991156 PMCID: PMC6867502 DOI: 10.1002/hbm.23169] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/20/2015] [Revised: 02/19/2016] [Accepted: 02/22/2016] [Indexed: 01/10/2023] Open
Abstract
Previous work on the neural underpinnings of emotional conflict processing has largely focused on designs that instruct participants to ignore a distracter which conflicts with a target. In contrast, this study investigated the noninstructed experience and evaluation of an emotional conflict, where positive or negative cues can be subjectively prioritized. To this end, healthy participants freely watched short film scenes that evoked emotional conflicts while their BOLD responses were measured. Participants' individual ratings of conflict and valence perception during the film scenes were collected immediately afterwards, and the individual ratings were regressed against the BOLD data. Our analyses revealed that (a) amygdala and medial prefrontal cortex were significantly involved in prioritizing positive or negative cues, but not in subjective evaluations of conflict per se, and (b) superior temporal sulcus (STS) and inferior parietal lobule (IPL), which have been implicated in social cognition and emotion control, were involved in both prioritizing positive or negative cues and subjectively evaluating conflict, and may thus constitute "hubs" or "switches" in emotional conflict processing. Psychophysiological interaction (PPI) analyses further revealed stronger functional connectivity between IPL and ventral prefrontal-medial parietal areas in prioritizing negative cues, and stronger connectivity between STS and dorsal-rostral prefrontal-medial parietal areas in prioritizing positive cues. In sum, our results suggest that IPL and STS are important in the subjective evaluation of complex conflicts and influence valence prioritization via prefrontal and parietal control centers. Hum Brain Mapp 37:2234-2246, 2016. © 2016 Wiley Periodicals, Inc.
Affiliation(s)
- Christiane S. Rohr
- Department of Neurology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Mind-Brain Institute, Berlin School of Mind and Brain, Charité and Humboldt University of Berlin, Berlin, Germany
- Department of Psychology, Humboldt University of Berlin, Berlin, Germany
- Arno Villringer
- Department of Neurology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Mind-Brain Institute, Berlin School of Mind and Brain, Charité and Humboldt University of Berlin, Berlin, Germany
- Carolina Solms-Baruth
- Department of Neurology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Mind-Brain Institute, Berlin School of Mind and Brain, Charité and Humboldt University of Berlin, Berlin, Germany
- Daniel S. Margulies
- Department of Neurology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Mind-Brain Institute, Berlin School of Mind and Brain, Charité and Humboldt University of Berlin, Berlin, Germany
- Max Planck Research Group for Neuroanatomy and Connectivity, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Hadas Okon-Singer
- Department of Neurology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Mind-Brain Institute, Berlin School of Mind and Brain, Charité and Humboldt University of Berlin, Berlin, Germany
- Department of Psychology, University of Haifa, Haifa, Israel
30
The sound of emotions-Towards a unifying neural network perspective of affective sound processing. Neurosci Biobehav Rev 2016; 68:96-110. [PMID: 27189782 DOI: 10.1016/j.neubiorev.2016.05.002] [Citation(s) in RCA: 117] [Impact Index Per Article: 14.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/07/2016] [Revised: 05/01/2016] [Accepted: 05/04/2016] [Indexed: 12/15/2022]
Abstract
Affective sounds are an integral part of the natural and social environment that shape and influence behavior across a multitude of species. In human primates, these affective sounds span a repertoire of environmental and human sounds when we vocalize or produce music. In terms of neural processing, cortical and subcortical brain areas constitute a distributed network that supports our listening experience to these affective sounds. Taking an exhaustive cross-domain view, we accordingly suggest a common neural network that facilitates the decoding of the emotional meaning from a wide source of sounds rather than a traditional view that postulates distinct neural systems for specific affective sound types. This new integrative neural network view unifies the decoding of affective valence in sounds, and ascribes differential as well as complementary functional roles to specific nodes within a common neural network. It also highlights the importance of an extended brain network beyond the central limbic and auditory brain systems engaged in the processing of affective sounds.
31
Ben-David BM, Multani N, Shakuf V, Rudzicz F, van Lieshout PHHM. Prosody and Semantics Are Separate but Not Separable Channels in the Perception of Emotional Speech: Test for Rating of Emotions in Speech. J Speech Lang Hear Res 2016; 59:72-89. [PMID: 26903033 DOI: 10.1044/2015_jslhr-h-14-0323] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/18/2014] [Accepted: 07/22/2015] [Indexed: 06/05/2023]
Abstract
PURPOSE Our aim is to explore the complex interplay of prosody (tone of speech) and semantics (verbal content) in the perception of discrete emotions in speech. METHOD We implement a novel tool, the Test for Rating of Emotions in Speech. Eighty native English speakers were presented with spoken sentences made of different combinations of 5 discrete emotions (anger, fear, happiness, sadness, and neutral) presented in prosody and semantics. Listeners were asked to rate the sentence as a whole, integrating both speech channels, or to focus on one channel only (prosody or semantics). RESULTS We observed supremacy of congruency, failure of selective attention, and prosodic dominance. Supremacy of congruency means that a sentence that presents the same emotion in both speech channels was rated highest; failure of selective attention means that listeners were unable to selectively attend to one channel when instructed; and prosodic dominance means that prosodic information plays a larger role than semantics in processing emotional speech. CONCLUSIONS Emotional prosody and semantics are separate but not separable channels, and it is difficult to perceive one without the influence of the other. Our findings indicate that the Test for Rating of Emotions in Speech can reveal specific aspects in the processing of emotional speech and may in the future prove useful for understanding emotion-processing deficits in individuals with pathologies.
32
Diamond E, Zhang Y. Cortical processing of phonetic and emotional information in speech: A cross-modal priming study. Neuropsychologia 2016; 82:110-122. [PMID: 26796714 DOI: 10.1016/j.neuropsychologia.2016.01.019] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/25/2015] [Revised: 01/06/2016] [Accepted: 01/16/2016] [Indexed: 10/22/2022]
Abstract
The current study employed behavioral and electrophysiological measures to investigate the timing, localization, and neural oscillation characteristics of cortical activities associated with phonetic and emotional information processing of speech. The experimental design used a cross-modal priming paradigm in which the normal adult participants were presented a visual prime followed by an auditory target. Primes were facial expressions that systematically varied in emotional content (happy or angry) and mouth shape (corresponding to /a/ or /i/ vowels). Targets were spoken words that varied by emotional prosody (happy or angry) and vowel (/a/ or /i/). In both the phonetic and prosodic conditions, participants were asked to judge congruency status of the visual prime and the auditory target. Behavioral results showed a congruency effect for both percent correct and reaction time. Two ERP responses, the N400 and late positive response (LPR), were identified in both conditions. Source localization and inter-trial phase coherence of the N400 and LPR components further revealed different cortical contributions and neural oscillation patterns for selective processing of phonetic and emotional information in speech. The results provide corroborating evidence for the necessity of differentiating brain mechanisms underlying the representation and processing of co-existing linguistic and paralinguistic information in spoken language, which has important implications for theoretical models of speech recognition as well as clinical studies on the neural bases of language and social communication deficits.
Affiliation(s)
- Erin Diamond
- Department of Speech-Language-Hearing Sciences, University of Minnesota, Minneapolis, MN 55455, USA
- Yang Zhang
- Department of Speech-Language-Hearing Sciences, University of Minnesota, Minneapolis, MN 55455, USA; Center for Neurobehavioral Development, University of Minnesota, Minneapolis, MN 55455, USA; School of Foreign Languages, Shanghai Jiao Tong University, Shanghai 200240, China
33
Đorđević M, Glumbić N, Brojčin B. Paralinguistic abilities of adults with intellectual disability. Res Dev Disabil 2016; 48:211-219. [PMID: 26625206 DOI: 10.1016/j.ridd.2015.11.001] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/23/2015] [Revised: 11/01/2015] [Accepted: 11/02/2015] [Indexed: 06/05/2023]
Abstract
The aim of this research was to determine the ability level of paralinguistic production and comprehension in adults with intellectual disability (ID) with regard to the level of their intellectual functioning and presence of co-morbid psychiatric conditions or dual diagnosis (DD). The sample consisted of 120 participants of both genders, ranging in age between 20 and 56 years (M=31.82, SD=8.702). Approximately 50% of the sample comprised participants with a co-existing psychiatric condition. Each of these two sub-samples (those with ID only and those with DD) consisted of 25 participants with mild ID and 35 participants with moderate ID. The paralinguistic scale from The Assessment Battery for Communication (ABaCo; Sacco et al., 2008) was used to assess the abilities of comprehension and production of paralinguistic elements. The results showed that the participants with mild ID were more successful than the participants with moderate ID both in paralinguistic comprehension tasks (p<.001) and in paralinguistic production tasks (p=.001). Additionally, the results indicated separate effects on the paralinguistic abilities of both ID level (F(116)=42.549, p<.001) and the presence of DD (F(116)=18.215, p<.001).
Affiliation(s)
- Mirjana Đorđević
- University of Belgrade Faculty of Special Education and Rehabilitation, Serbia
- Nenad Glumbić
- University of Belgrade Faculty of Special Education and Rehabilitation, Serbia
- Branislav Brojčin
- University of Belgrade Faculty of Special Education and Rehabilitation, Serbia
34
Gauvin HS, De Baene W, Brass M, Hartsuiker RJ. Conflict monitoring in speech processing: An fMRI study of error detection in speech production and perception. Neuroimage 2015; 126:96-105. [PMID: 26608243 DOI: 10.1016/j.neuroimage.2015.11.037] [Citation(s) in RCA: 57] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2015] [Revised: 11/06/2015] [Accepted: 11/14/2015] [Indexed: 11/16/2022] Open
Abstract
To minimize the number of errors in speech, and thereby facilitate communication, speech is monitored before articulation. It is, however, unclear at which level during speech production monitoring takes place, and what mechanisms are used to detect and correct errors. The present study investigated whether internal verbal monitoring takes place through the speech perception system, as proposed by perception-based theories of speech monitoring, or whether mechanisms independent of perception are applied, as proposed by production-based theories of speech monitoring. With the use of fMRI during a tongue twister task, we observed that error detection in internal speech during noise-masked overt speech production and error detection in speech perception both recruit the same neural network, which includes pre-supplementary motor area (pre-SMA), dorsal anterior cingulate cortex (dACC), anterior insula (AI), and inferior frontal gyrus (IFG). Although production and perception recruit similar areas, as proposed by perception-based accounts, we did not find activation in superior temporal areas (which are typically associated with speech perception) during internal speech monitoring in speech production as hypothesized by these accounts. On the contrary, results are highly compatible with a domain general approach to speech monitoring, by which internal speech monitoring takes place through detection of conflict between response options, which is subsequently resolved by a domain general executive center (e.g., the ACC).
Affiliation(s)
- Hanna S Gauvin
- Department of Experimental Psychology, Ghent University, Henri Dunantlaan 2, 9000 Ghent, Belgium
- Wouter De Baene
- Department of Experimental Psychology, Ghent University, Henri Dunantlaan 2, 9000 Ghent, Belgium; Department of Cognitive Neuropsychology, Tilburg University, 5000 LE Tilburg, The Netherlands
- Marcel Brass
- Department of Experimental Psychology, Ghent University, Henri Dunantlaan 2, 9000 Ghent, Belgium
- Robert J Hartsuiker
- Department of Experimental Psychology, Ghent University, Henri Dunantlaan 2, 9000 Ghent, Belgium
35
Zaki J. Cue Integration: A Common Framework for Social Cognition and Physical Perception. Perspect Psychol Sci 2015; 8:296-312. [PMID: 26172972 DOI: 10.1177/1745691613475454] [Citation(s) in RCA: 54] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/10/2023]
Abstract
Scientists examining how people understand other minds have long thought that this task must be something like how people perceive the physical world. This comparison has proven to be deeply generative, as models of physical perception and social cognition have evolved in parallel. In this article, I propose extending this classic analogy in a new direction, with cue integration as a common feature of social cognition and physical perception. When encountering complex social cues (which happens often), perceivers use multiple processes for understanding others' minds. Like physical senses (e.g., vision or audition), social cognitive processes have often been studied as though they operate in relative isolation. In the domain of physical perception, this assumption has broken down, following evidence that perception is instead characterized by pervasive integration of multisensory information. Such integration is, in turn, elegantly described by Bayesian inferential models. By adopting a similar cue integration framework, researchers can similarly understand and formally model the ways that we perceive others' minds based on complex social information.
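The Bayesian cue-integration idea this abstract invokes has a standard minimal form: independent Gaussian cues are fused by precision weighting, so the combined estimate leans toward the more reliable cue and is more certain than either cue alone. A small illustrative sketch (not from the article; the face/voice framing and all numbers are hypothetical):

```python
def fuse_cues(means, variances):
    """Precision-weighted fusion of independent Gaussian cues.

    Each cue i contributes estimate means[i] with reliability
    (precision) 1/variances[i]; the Bayesian combined estimate is the
    precision-weighted mean, and the combined variance is the inverse
    of the summed precisions.
    """
    precisions = [1.0 / v for v in variances]
    total = sum(precisions)
    fused_mean = sum(p * m for p, m in zip(precisions, means)) / total
    fused_variance = 1.0 / total
    return fused_mean, fused_variance

# Hypothetical example: a reliable facial cue (estimate 0.8, variance 0.1)
# and a noisier vocal cue (estimate 0.2, variance 0.4) about the same
# emotional state.
mean, var = fuse_cues([0.8, 0.2], [0.1, 0.4])
# The fused estimate (0.68) leans toward the reliable face cue, and its
# variance (0.08) is lower than that of either cue alone.
```

The same precision-weighting rule is what makes multisensory percepts both biased toward reliable channels and less variable than any single channel, which is the empirical signature the abstract attributes to physical perception.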
Affiliation(s)
- Jamil Zaki
- Department of Psychology, Stanford University
36
Wang JE, Tsao FM. Emotional prosody perception and its association with pragmatic language in school-aged children with high-function autism. Res Dev Disabil 2015; 37:162-170. [PMID: 25463248 DOI: 10.1016/j.ridd.2014.11.013] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/28/2014] [Revised: 11/12/2014] [Accepted: 11/15/2014] [Indexed: 06/04/2023]
Abstract
Emotional prosody perception is essential for social communication, but it is still an open issue whether children with high-function autism (HFA) exhibit any prosodic perception deficits or experience selective impairments in recognizing the prosody of positive emotions. Moreover, the associations between prosody perception, pragmatic language, and social adaptation in children with HFA have not been fully explored. This study investigated whether emotional prosody perception for words and sentences in children with HFA (n=25, 6-11 years of age) differed from age-matched, typically developing children (TD, n=25) when presented with an emotional prosody identification task. The Children's Communication Checklist and Vineland Adaptive Behavior Scale were used to assess pragmatic and social adaption abilities. Results show that children with HFA performed poorer than TD children in identifying happy prosody in both emotionally neutral and relevant utterances. In contrast, children with HFA did not exhibit any deficits in identifying sad and angry prosody. Results of correlation analyses revealed a positive association between happy prosody identification and pragmatic function. The findings indicate that school-aged children with HFA experience difficulties in recognizing happy prosody, and that this limitation in prosody perception is associated with their pragmatic and social adaption performances.
Affiliation(s)
- Jia-En Wang
- Department of Psychology, National Taiwan University, Taiwan; Department of Psychiatry, Mackay Memorial Hospital, Taiwan
- Feng-Ming Tsao
- Department of Psychology, National Taiwan University, Taiwan
37
Jessen S, Kotz SA. Affect differentially modulates brain activation in uni- and multisensory body-voice perception. Neuropsychologia 2015; 66:134-43. [DOI: 10.1016/j.neuropsychologia.2014.10.038] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/21/2014] [Revised: 09/22/2014] [Accepted: 10/30/2014] [Indexed: 10/24/2022]
38
Evolution of affective and linguistic disambiguation under social eavesdropping pressures. Behav Brain Sci 2014; 37:551-2; discussion 577-604. [PMID: 25514941 DOI: 10.1017/s0140525x13003993] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
Contradicting new dual-pathway models of language evolution, cortico-striatal-thalamic circuitry disambiguates uncertainties in affective prosody and propositional linguistic content of language production and comprehension, predictably setting limits on the useful complexity of articulate phonic and/or signed speech. Such limits likely evolved to ensure that public information is discriminated by intended communicants and safeguarded against the ecological pressures of social eavesdropping within and across phylogenetic boundaries.
39
Phylogenetic reorganization of the basal ganglia: A necessary, but not the only, bridge over a primate Rubicon of acoustic communication. Behav Brain Sci 2014. [DOI: 10.1017/s0140525x1400003x] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
Abstract
In this response to commentaries, we revisit the two main arguments of our target article. Based on data drawn from a variety of research areas – vocal behavior in nonhuman primates, speech physiology and pathology, neurobiology of basal ganglia functions, motor skill learning, paleoanthropological concepts – the target article, first, suggests a two-stage model of the evolution of the crucial motor prerequisites of spoken language within the hominin lineage: (1) monosynaptic refinement of the projections of motor cortex to brainstem nuclei steering laryngeal muscles, and (2) subsequent “vocal-laryngeal elaboration” of cortico-basal ganglia circuits, driven by human-specific FOXP2 mutations. Second, as concerns the ontogenetic development of verbal communication, age-dependent interactions between the basal ganglia and their cortical targets are assumed to contribute to the time course of the acquisition of articulate speech. Whereas such a phylogenetic reorganization of cortico-striatal circuits must be considered a necessary prerequisite for ontogenetic speech acquisition, the 30 commentaries – addressing the whole range of data sources referred to – point at several further aspects of acoustic communication which have to be added to or integrated with the presented model. For example, the relationships between vocal tract movement sequencing – the focus of the target article – and rhythmical structures of movement organization, the connections between speech motor control and the central-auditory and central-visual systems, the impact of social factors upon the development of vocal behavior (in nonhuman primates and in our species), and the interactions of ontogenetic speech acquisition – based upon FOXP2-driven structural changes at the level of the basal ganglia – with preceding subvocal stages of acoustic communication as well as higher-order (cognitive) dimensions of phonological development.
Most importantly, thus, several promising future research directions unfold from these contributions – accessible to clinical studies and functional imaging in our species as well as experimental investigations in nonhuman primates.
40
Mitchell RLC, Rossell SL. Perception of emotion-related conflict in human communications: what are the effects of schizophrenia? Psychiatry Res 2014; 220:135-44. [PMID: 25149130 DOI: 10.1016/j.psychres.2014.07.077] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/06/2014] [Revised: 07/29/2014] [Accepted: 07/31/2014] [Indexed: 11/18/2022]
Abstract
Our ability to make sense of emotional cues is of paramount importance for understanding state of mind and communicative intent. However, emotional cues often conflict with each other; this presents a significant challenge for people with schizophrenia. We conducted a theoretical review to determine the extent and types of impaired processing of emotion-related conflict in schizophrenia; we evaluated the relationship with medication and symptoms, and considered possible mediatory mechanisms. The literature established that people with schizophrenia demonstrated impaired function: (i) when passively exposed to emotion cues whilst performing an unrelated task, (ii) when selectively attending to one source of emotion cues whilst trying to ignore interference from another source, and (iii) when trying to resolve conflicting emotion cues and judge meta-communicative intent. These deficits showed associations with both negative and positive symptoms. There was limited evidence for antipsychotic medications attenuating impaired emotion perception when there are conflicting cues, with further direct research needed. Impaired attentional control and context processing may underlie some of the observed impairments. Neuroanatomical correlates are likely to involve interhemispheric transfer via the corpus callosum, limbic regions such as the amygdala, and possibly dorsolateral prefrontal and anterior cingulate cortex through their role in conflict processing.
Affiliation(s)
- Rachel L C Mitchell
- Centre for Affective (PO Box 72), Department of Psychological Medicine, Institute of Psychiatry, 16 De Crespigny Park, London SE5 8AF, UK
- Susan L Rossell
- Brain and Psychological Sciences Research Centre, Swinburne University of Technology, Melbourne, Victoria, Australia
41
Frühholz S, Trost W, Grandjean D. The role of the medial temporal limbic system in processing emotions in voice and music. Prog Neurobiol 2014; 123:1-17. [PMID: 25291405 DOI: 10.1016/j.pneurobio.2014.09.003] [Citation(s) in RCA: 89] [Impact Index Per Article: 8.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/05/2014] [Revised: 09/16/2014] [Accepted: 09/29/2014] [Indexed: 01/15/2023]
Abstract
Subcortical brain structures of the limbic system, such as the amygdala, are thought to decode the emotional value of sensory information. Recent neuroimaging studies, as well as lesion studies in patients, have shown that the amygdala is sensitive to emotions in voice and music. Similarly, the hippocampus, another part of the temporal limbic system (TLS), is responsive to vocal and musical emotions, but its specific roles in emotional processing from music and especially from voices have been largely neglected. Here we review recent research on vocal and musical emotions, and outline commonalities and differences in the neural processing of emotions in the TLS in terms of emotional valence, emotional intensity and arousal, as well as in terms of acoustic and structural features of voices and music. We summarize the findings in a neural framework including several subcortical and cortical functional pathways between the auditory system and the TLS. This framework proposes that some vocal expressions might already receive a fast emotional evaluation via a subcortical pathway to the amygdala, whereas cortical pathways to the TLS are thought to be equally used for vocal and musical emotions. While the amygdala might be specifically involved in a coarse decoding of the emotional value of voices and music, the hippocampus might process more complex vocal and musical emotions, and might have an important role especially for the decoding of musical emotions by providing memory-based and contextual associations.
Affiliation(s)
- Sascha Frühholz
- Neuroscience of Emotion and Affective Dynamics Lab, Department of Psychology, University of Geneva, Geneva, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
- Wiebke Trost
- Neuroscience of Emotion and Affective Dynamics Lab, Department of Psychology, University of Geneva, Geneva, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
- Didier Grandjean
- Neuroscience of Emotion and Affective Dynamics Lab, Department of Psychology, University of Geneva, Geneva, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
42
Comte M, Schön D, Coull JT, Reynaud E, Khalfa S, Belzeaux R, Ibrahim EC, Guedj E, Blin O, Weinberger DR, Fakra E. Dissociating Bottom-Up and Top-Down Mechanisms in the Cortico-Limbic System during Emotion Processing. Cereb Cortex 2014; 26:144-55. [DOI: 10.1093/cercor/bhu185] [Citation(s) in RCA: 81] [Impact Index Per Article: 8.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022] Open
43
Zimmer U, Koschutnig K, Ebner F, Ischebeck A. Successful contextual integration of loose mental associations as evidenced by emotional conflict-processing. PLoS One 2014; 9:e91470. [PMID: 24618674 PMCID: PMC3950074 DOI: 10.1371/journal.pone.0091470] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/07/2013] [Accepted: 02/03/2014] [Indexed: 12/01/2022] Open
Abstract
Often we cannot resist emotional distraction, because emotions capture our attention. For example, in TV-commercials, tempting emotional voices add an emotional expression to a formerly neutral product. Here, we used a Stroop-like conflict paradigm as a tool to investigate whether emotional capture results in contextual integration of loose mental associations. Specifically, we tested whether the associatively connected meaning of an ignored auditory emotion with a non-emotional neutral visual target would yield a modulation of activation sensitive to emotional conflict in the brain. In an fMRI-study, nineteen participants detected the presence or absence of a little worm hidden in the picture of an apple, while ignoring a voice with an emotional sound of taste (delicious/disgusting). Our results indicate a modulation due to emotional conflict, pronounced most strongly when processing conflict in the context of disgust (conflict: disgust/no-worm vs. no conflict: disgust/worm). For conflict in the context of disgust, insula activity was increased, with activity correlating positively with reaction time in the conflict case. Conflict in the context of deliciousness resulted in increased amygdala activation, possibly due to the resulting “negative” emotion in incongruent versus congruent combinations. These results indicate that our associative stimulus-combinations showed a conflict-dependent modulation of activity in emotional brain areas. This shows that the emotional sounds were successfully contextually integrated with the loosely associated neutral pictures.
Affiliation(s)
- Ulrike Zimmer
- Department of Psychology, University of Graz, Graz, Austria
- Karl Koschutnig
- Department of Radiology, Medical University of Graz, Graz, Austria
- Franz Ebner
- Department of Radiology, Medical University of Graz, Graz, Austria
- Anja Ischebeck
- Department of Psychology, University of Graz, Graz, Austria
|
44
|
Kotz SA, Dengler R, Wittfoth M. Valence-specific conflict moderation in the dorso-medial PFC and the caudate head in emotional speech. Soc Cogn Affect Neurosci 2014; 10:165-71. [PMID: 24526187 DOI: 10.1093/scan/nsu021] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
Emotional speech comprises complex multimodal verbal and non-verbal information that allows us to deduce others' emotional states or thoughts in social interactions. While the neural correlates of verbal and non-verbal aspects and their interaction in emotional speech have been identified, there is very little evidence on how we perceive and resolve incongruity in emotional speech, and whether such incongruity extends to current concepts of task-specific prediction errors as a consequence of unexpected action outcomes ('negative surprise'). Here, we explored this possibility while participants listened to congruent and incongruent angry, happy or neutral utterances and categorized the expressed emotions by their verbal (semantic) content. Results reveal valence-specific incongruity effects: negative verbal content expressed in a happy tone of voice increased activation in the dorso-medial prefrontal cortex (dmPFC), extending its role from conflict moderation to appraisal of valence-specific conflict in emotional speech. Conversely, the caudate head bilaterally responded selectively to positive verbal content expressed in an angry tone of voice, broadening previous accounts of the caudate head in linguistic control to moderating valence-specific control in emotional speech. Together, these results suggest that control structures of the human brain (dmPFC and subcompartments of the basal ganglia) impact emotional speech differentially when conflict arises.
Affiliation(s)
- Sonja A Kotz
- Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstrasse 1a, 04103 Leipzig, Germany, School of Psychological Sciences, Zochonis Building, The University of Manchester, Brunswick Street, Manchester M13 9PL, UK, Department of Neurology and Clinical Neurophysiology, Hannover Medical School, Carl-Neuberg-Str. 1, 30625 Hannover, Germany, Department of Psychiatry, Social psychiatry and Psychotherapy, Hannover Medical School Carl-Neuberg-Str. 1, 30625 Hannover, Germany, and NICA - NeuroImaging and Clinical Applications, Hannover Medical School Carl-Neuberg-Str. 1, 30625 Hannover, Germany
- Reinhard Dengler
- Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstrasse 1a, 04103 Leipzig, Germany, School of Psychological Sciences, Zochonis Building, The University of Manchester, Brunswick Street, Manchester M13 9PL, UK, Department of Neurology and Clinical Neurophysiology, Hannover Medical School, Carl-Neuberg-Str. 1, 30625 Hannover, Germany, Department of Psychiatry, Social psychiatry and Psychotherapy, Hannover Medical School Carl-Neuberg-Str. 1, 30625 Hannover, Germany, and NICA - NeuroImaging and Clinical Applications, Hannover Medical School Carl-Neuberg-Str. 1, 30625 Hannover, Germany
- Matthias Wittfoth
- Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstrasse 1a, 04103 Leipzig, Germany, School of Psychological Sciences, Zochonis Building, The University of Manchester, Brunswick Street, Manchester M13 9PL, UK, Department of Neurology and Clinical Neurophysiology, Hannover Medical School, Carl-Neuberg-Str. 1, 30625 Hannover, Germany, Department of Psychiatry, Social psychiatry and Psychotherapy, Hannover Medical School Carl-Neuberg-Str. 1, 30625 Hannover, Germany, and NICA - NeuroImaging and Clinical Applications, Hannover Medical School Carl-Neuberg-Str. 1, 30625 Hannover, Germany
|
45
|
Garrido-Vásquez P, Pell MD, Paulmann S, Strecker K, Schwarz J, Kotz SA. An ERP study of vocal emotion processing in asymmetric Parkinson's disease. Soc Cogn Affect Neurosci 2013; 8:918-27. [PMID: 22956665 PMCID: PMC3831560 DOI: 10.1093/scan/nss094] [Citation(s) in RCA: 32] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/11/2011] [Accepted: 08/13/2012] [Indexed: 11/14/2022] Open
Abstract
Parkinson's disease (PD) has been related to impaired processing of emotional speech intonation (emotional prosody). One distinctive feature of idiopathic PD is motor symptom asymmetry, with striatal dysfunction being strongest in the hemisphere contralateral to the most affected body side. It is still unclear whether this asymmetry may affect vocal emotion perception. Here, we tested 22 PD patients (10 with predominantly left-sided [LPD] and 12 with predominantly right-sided motor symptoms) and 22 healthy controls in an event-related potential study. Sentences conveying different emotional intonations were presented in lexical and pseudo-speech versions. The task varied between an explicit and an implicit instruction. Of specific interest was emotional salience detection from prosody, reflected in the P200 component. We predicted that patients with predominantly right-striatal dysfunction (LPD) would exhibit P200 alterations. Our results support this assumption. LPD patients showed enhanced P200 amplitudes, and specific deficits were observed for disgust prosody, explicit anger processing and implicit processing of happy prosody. Lexical speech was predominantly affected, while the processing of pseudo-speech was largely intact. P200 amplitude in patients correlated significantly with left motor scores and asymmetry indices. The data suggest that emotional salience detection from prosody is affected by asymmetric neuronal degeneration in PD.
Affiliation(s)
- Patricia Garrido-Vásquez
- Department of General and Biological Psychology, University of Marburg, Gutenbergstrasse 18, 35032 Marburg, Germany.
|
46
|
Watson R, Latinus M, Noguchi T, Garrod O, Crabbe F, Belin P. Dissociating task difficulty from incongruence in face-voice emotion integration. Front Hum Neurosci 2013; 7:744. [PMID: 24294196 PMCID: PMC3826561 DOI: 10.3389/fnhum.2013.00744] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/01/2013] [Accepted: 10/18/2013] [Indexed: 11/13/2022] Open
Abstract
In the everyday environment, affective information is conveyed by both the face and the voice. Studies have demonstrated that a concurrently presented voice can alter the way that an emotional face expression is perceived, and vice versa, leading to emotional conflict if the information in the two modalities is mismatched. Additionally, evidence suggests that incongruence of emotional valence activates cerebral networks involved in conflict monitoring and resolution. However, it is currently unclear whether this is due to task difficulty—that incongruent stimuli are harder to categorize—or simply to the detection of mismatching information in the two modalities. The aim of the present fMRI study was to examine the neurophysiological correlates of processing incongruent emotional information, independent of task difficulty. Subjects were scanned while judging the emotion of face-voice affective stimuli. Both the face and voice were parametrically morphed between anger and happiness and then paired in all audiovisual combinations, resulting in stimuli each defined by two separate values: the degree of incongruence between the face and voice, and the degree of clarity of the combined face-voice information. Due to the specific morphing procedure utilized, we hypothesized that the clarity value, rather than the incongruence value, would better reflect task difficulty. Behavioral data revealed that participants integrated face and voice affective information, and that the clarity value, as opposed to the incongruence value, correlated with categorization difficulty. Cerebrally, incongruence was more associated with activity in the superior temporal region, which emerged after task difficulty had been accounted for. Overall, our results suggest that activation in the superior temporal region in response to incongruent information cannot be explained simply by task difficulty, and may rather be due to detection of mismatching information between the two modalities.
Affiliation(s)
- Rebecca Watson
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University Maastricht, Netherlands ; Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow Glasgow, UK
|
47
|
Specht K. Neuronal basis of speech comprehension. Hear Res 2013; 307:121-35. [PMID: 24113115 DOI: 10.1016/j.heares.2013.09.011] [Citation(s) in RCA: 49] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/15/2013] [Revised: 09/15/2013] [Accepted: 09/19/2013] [Indexed: 01/18/2023]
Abstract
Verbal communication does not rely only on the simple perception of auditory signals. Rather, it is a parallel and integrative processing of linguistic and non-linguistic information, involving temporal and frontal areas in particular. This review describes the inherent complexity of auditory speech comprehension from a functional-neuroanatomical perspective. The review is divided into two parts. In the first part, structural and functional asymmetry of language-relevant structures will be discussed. The second part of the review will discuss recent neuroimaging studies, which coherently demonstrate that speech comprehension processes rely on a hierarchical network involving the temporal, parietal, and frontal lobes. Further, the results support the dual-stream model for speech comprehension, with a dorsal stream for auditory-motor integration and a ventral stream for extracting meaning as well as processing sentences and narratives. Specific patterns of functional asymmetry between the left and right hemisphere can also be demonstrated. The review article concludes with a discussion on interactions between the dorsal and ventral streams, particularly the involvement of motor-related areas in speech perception processes, and outlines some remaining unresolved issues. This article is part of a Special Issue entitled Human Auditory Neuroimaging.
Affiliation(s)
- Karsten Specht
- Department of Biological and Medical Psychology, University of Bergen, Jonas Lies vei 91, 5009 Bergen, Norway; Department for Medical Engineering, Haukeland University Hospital, Bergen, Norway.
|
48
|
Happy facial expression processing with different social interaction cues: an fMRI study of individuals with schizotypal personality traits. Prog Neuropsychopharmacol Biol Psychiatry 2013; 44:108-17. [PMID: 23416087 DOI: 10.1016/j.pnpbp.2013.02.004] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 11/26/2012] [Revised: 02/06/2013] [Accepted: 02/06/2013] [Indexed: 11/23/2022]
Abstract
In daily life, facial expressions change rapidly, and the direction of change provides important clues about social interaction. The aim of this study was to elucidate dynamic happy facial expression processing under different social interaction cues in individuals with (n=14) and without (n=14) schizotypal personality disorder (SPD) traits. Using functional magnetic resonance imaging (fMRI), dynamic happy facial expression processing was examined by presenting video clips depicting happiness appearing and disappearing under happiness-inducing ('praise') or happiness-reducing ('blame') interaction cues. The happiness appearing condition consistently elicited more brain activations than the happiness disappearing condition in the posterior cingulate bilaterally in all participants. Further analyses showed that the SPD group was less deactivated than the non-SPD group in the right anterior cingulate cortex in the happiness appearing-disappearing contrast. The SPD group deactivated more than the non-SPD group in the left posterior cingulate and right superior temporal gyrus in the praise-blame contrast. Moreover, the incongruence of cues and facial expression activated the frontal-thalamus-caudate-parietal network, which is involved in emotion recognition and conflict resolution. These results shed light on the neural basis of social interaction deficits in individuals with schizotypal personality traits.
|
49
|
Llano DA. Functional imaging of the thalamus in language. BRAIN AND LANGUAGE 2013; 126:62-72. [PMID: 22981716 PMCID: PMC4836874 DOI: 10.1016/j.bandl.2012.06.004] [Citation(s) in RCA: 55] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/07/2011] [Revised: 06/09/2012] [Accepted: 06/22/2012] [Indexed: 05/07/2023]
Abstract
Herein, the literature regarding functional imaging of the thalamus during language tasks is reviewed. Fifty studies met criteria for analysis. Two of the most common task paradigms associated with thalamic activation were generative tasks (e.g. word or sentence generation) and naming, though activation was also seen in tasks that involve lexical decision, reading and working memory. Typically, thalamic activation was seen bilaterally, left greater than right, along with activation in frontal and temporal cortical regions. Thalamic activation was seen with perceptually challenging tasks, though few studies rigorously correlated thalamic activation with measures of attention or task difficulty. The peaks of activation loci were seen in virtually all thalamic regions, with a bias towards left-sided and midline activation. These analyses suggest that the thalamus may be involved in processes that involve manipulations of lexical information, but point to the need for more systematic study of the thalamus using language tasks.
Affiliation(s)
- Daniel A Llano
- University of Illinois at Urbana-Champaign, Department of Molecular and Integrative Physiology, USA.
|
50
|
Mitchell RL. Further characterisation of the functional neuroanatomy associated with prosodic emotion decoding. Cortex 2013; 49:1722-32. [DOI: 10.1016/j.cortex.2012.07.010] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/22/2012] [Revised: 07/13/2012] [Accepted: 07/25/2012] [Indexed: 11/17/2022]
|