1. Arya R, Ervin B, Greiner HM, Buroker J, Byars AW, Tenney JR, Arthur TM, Fong SL, Lin N, Frink C, Rozhkov L, Scholle C, Skoch J, Leach JL, Mangano FT, Glauser TA, Hickok G, Holland KD. Emotional facial expression and perioral motor functions of the human auditory cortex. Clin Neurophysiol 2024; 163:102-111. PMID: 38729074; PMCID: PMC11176009; DOI: 10.1016/j.clinph.2024.04.017
Abstract
OBJECTIVE: We investigated the role of the transverse temporal gyrus and adjacent cortex (TTG+) in facial expressions and perioral movements. METHODS: In 31 patients undergoing stereo-electroencephalography monitoring, we describe behavioral responses elicited by electrical stimulation within the TTG+. Task-induced high-gamma modulation (HGM), auditory evoked responses, and resting-state connectivity were used to investigate the cortical sites showing different types of responses to electrical stimulation. RESULTS: Changes in facial expressions and perioral movements were elicited by electrical stimulation within the TTG+ in 9 (29%) and 10 (32%) patients, respectively, in addition to the more common language responses (naming interruptions, auditory hallucinations, paraphasic errors). All functional sites showed auditory task-induced HGM and evoked responses, validating their location within the auditory cortex; however, motor sites showed lower peak amplitudes and longer peak latencies than language sites. Significant first-degree connections for motor sites included the precentral, anterior cingulate, parahippocampal, and anterior insular gyri, whereas those for language sites included the posterior superior temporal, posterior middle temporal, inferior frontal, supramarginal, and angular gyri. CONCLUSIONS: Multimodal data suggest that the TTG+ may participate in auditory-motor integration. SIGNIFICANCE: The TTG+ likely participates in facial expressions in response to emotional cues during auditory discourse.
Affiliation(s)
- Ravindra Arya
- Comprehensive Epilepsy Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA; Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA; Department of Electrical Engineering and Computer Science, University of Cincinnati, Cincinnati, OH, USA
- Brian Ervin
- Comprehensive Epilepsy Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA; Department of Electrical Engineering and Computer Science, University of Cincinnati, Cincinnati, OH, USA
- Hansel M Greiner
- Comprehensive Epilepsy Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA; Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Jason Buroker
- Comprehensive Epilepsy Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Anna W Byars
- Comprehensive Epilepsy Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA; Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Jeffrey R Tenney
- Comprehensive Epilepsy Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA; Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Todd M Arthur
- Comprehensive Epilepsy Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA; Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Susan L Fong
- Comprehensive Epilepsy Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA; Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Nan Lin
- Comprehensive Epilepsy Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA; Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Clayton Frink
- Comprehensive Epilepsy Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Leonid Rozhkov
- Comprehensive Epilepsy Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Craig Scholle
- Comprehensive Epilepsy Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Jesse Skoch
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA; Division of Pediatric Neurosurgery, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- James L Leach
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA; Division of Pediatric Neuroradiology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Francesco T Mangano
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA; Division of Pediatric Neurosurgery, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Tracy A Glauser
- Comprehensive Epilepsy Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA; Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Gregory Hickok
- Department of Cognitive Sciences, Department of Language Science, University of California, Irvine, CA, USA
- Katherine D Holland
- Comprehensive Epilepsy Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA; Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
2. Grigorescu C, Chalah MA, Ayache SS, Palm U. [Alexithymia in Multiple Sclerosis - Narrative Review]. Fortschr Neurol Psychiatr 2023; 91:404-413. PMID: 35948023; DOI: 10.1055/a-1882-6544
Abstract
Alexithymia is a multidimensional personality construct involving difficulties in identifying and describing one's own feelings, together with an externally oriented thinking style. It is broadly reported in psychiatric patients but has gained little attention regarding its occurrence and pathophysiology in multiple sclerosis (MS). This narrative review addresses the prevalence, etiology, and neurobiological and clinical findings of alexithymia in MS. The prevalence of alexithymia in MS ranges from 10 to 53%. There seems to be an association with anxiety, depression, fatigue, and some aspects of social cognition, while the relationship with clinical and classical cognitive variables has rarely been evaluated. Only a few studies have addressed its pathophysiology, proposing an aberrant interhemispheric transfer or regional cerebral abnormalities. Given the prevalence of alexithymia in MS and its potential negative impact on quality of life and interpersonal communication, screening for these factors should be a mandatory part of clinical MS management. Further evaluation is needed concerning its relationship with clinical, emotional, and cognitive confounders, and large-scale studies employing neuroimaging techniques are needed for a better understanding of the neural underpinnings of this MS feature.
Affiliation(s)
- Christina Grigorescu
- Klinik für Psychiatrie und Psychotherapie, Klinikum der Universität München, München, Germany
- Moussa A Chalah
- EA 4391, Excitabilité Nerveuse et Thérapeutique, Université Paris-Est-Créteil, Créteil, France; Service de Physiologie - Explorations Fonctionnelles, Hôpital Henri Mondor, Assistance Publique - Hôpitaux de Paris, Créteil, France
- Samar S Ayache
- EA 4391, Excitabilité Nerveuse et Thérapeutique, Université Paris-Est-Créteil, Créteil, France; Service de Physiologie - Explorations Fonctionnelles, Hôpital Henri Mondor, Assistance Publique - Hôpitaux de Paris, Créteil, France
- Ulrich Palm
- Klinik für Psychiatrie und Psychotherapie, Klinikum der Universität München, München, Germany; Medical Park Chiemseeblick, Bernau am Chiemsee, Germany
3. Pelzl MA, Travers-Podmaniczky G, Brück C, Jacob H, Hoffmann J, Martinelli A, Hölz L, Wabersich-Flad D, Wildgruber D. Reduced impact of nonverbal cues during integration of verbal and nonverbal emotional information in adults with high-functioning autism. Front Psychiatry 2022; 13:1069028. PMID: 36699473; PMCID: PMC9868406; DOI: 10.3389/fpsyt.2022.1069028
Abstract
BACKGROUND: When receiving mismatching nonverbal and verbal signals, most people base their judgment of another person's current emotional state primarily on the nonverbal information. However, individuals with high-functioning autism (HFA) have been described as having difficulties interpreting nonverbal signals. Recognizing emotional states correctly is highly important for successful social interaction, and alterations in the perception of nonverbal emotional cues presumably contribute to misunderstandings and impairments in social interactions. METHODS: To evaluate autism-specific differences in the relative impact of nonverbal and verbal cues, 18 adults with HFA [14 male, 4 female; mean age 36.7 years (SD 11.4)] and 18 age-, gender-, and IQ-matched typically developed controls [14 male, 4 female; mean age 36.4 years (SD 12.2)] rated the emotional state of speakers in video sequences with partly mismatching emotional signals. Standardized linear regression coefficients were calculated for each participant as a measure of reliance on the nonverbal and verbal components of the videos. Regression coefficients were then compared between groups to test the hypothesis that autistic adults base their social evaluations less strongly on nonverbal information. Further exploratory analyses were performed for differences in valence ratings and response times. RESULTS: Compared to the typically developed control group, nonverbal cue reliance was reduced in adults with high-functioning autism [t(23.14) = -2.44, p = 0.01, one-sided]. Furthermore, the exploratory analyses showed a tendency to avoid extreme answers in the HFA group, observable as less positive as well as less negative valence ratings in response to emotional expressions of increasingly strong valence. In addition, response times were generally longer in the HFA group than in the control group [F(1, 33) = 10.65, p = 0.004].
CONCLUSION: These findings suggest a reduced impact of nonverbal cues and longer processing times in the analysis of multimodal emotional information, which may be associated with a subjectively lower relevance of this information and/or greater processing difficulties for people with HFA. The less extreme answering tendency may indicate a lower sensitivity to nonverbal valence expression in HFA or result from a tendency to avoid incorrect answers when confronted with greater uncertainty in interpreting emotional states.
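The cue-reliance measure used in this study (standardized regression coefficients for the verbal and nonverbal components of each participant's ratings) can be sketched as follows. This is a minimal illustration, not the authors' code; the function name and the toy data are invented for the example.

```python
import numpy as np

def standardized_betas(verbal, nonverbal, ratings):
    """Fit ratings ~ verbal + nonverbal after z-scoring all variables.

    Because every variable is standardized, the two coefficients are
    directly comparable: a larger nonverbal beta means the rater leaned
    more on the nonverbal channel.
    """
    z = lambda x: (x - x.mean()) / x.std()
    X = np.column_stack([z(verbal), z(nonverbal)])
    y = z(ratings)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [beta_verbal, beta_nonverbal]

# Toy data: ratings driven mostly by the nonverbal cue.
rng = np.random.default_rng(0)
verbal = rng.normal(size=100)
nonverbal = rng.normal(size=100)
ratings = 0.2 * verbal + 0.8 * nonverbal + 0.1 * rng.normal(size=100)
b_verbal, b_nonverbal = standardized_betas(verbal, nonverbal, ratings)
```

Comparing such per-participant betas between groups is then an ordinary two-sample test on the nonverbal coefficients.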
Affiliation(s)
- Michael Alexander Pelzl
- Department of Psychiatry and Psychotherapy, Tübingen Center for Mental Health, University of Tübingen, Tübingen, Germany
- Gabrielle Travers-Podmaniczky
- Department of Psychiatry and Psychotherapy, Tübingen Center for Mental Health, University of Tübingen, Tübingen, Germany
- Carolin Brück
- Department of Psychiatry and Psychotherapy, Tübingen Center for Mental Health, University of Tübingen, Tübingen, Germany
- Heike Jacob
- Department of Psychiatry and Psychotherapy, Tübingen Center for Mental Health, University of Tübingen, Tübingen, Germany
- Jonatan Hoffmann
- Department of Psychiatry and Psychotherapy, Tübingen Center for Mental Health, University of Tübingen, Tübingen, Germany
- Anne Martinelli
- School of Psychology, Fresenius University of Applied Sciences, Frankfurt, Germany
- Lea Hölz
- Department of Psychiatry and Psychotherapy, Tübingen Center for Mental Health, University of Tübingen, Tübingen, Germany
- Dominik Wabersich-Flad
- Department of Psychiatry and Psychotherapy, Tübingen Center for Mental Health, University of Tübingen, Tübingen, Germany
- Dirk Wildgruber
- Department of Psychiatry and Psychotherapy, Tübingen Center for Mental Health, University of Tübingen, Tübingen, Germany
4. Wosiak A, Dura A. Hybrid Method of Automated EEG Signals' Selection Using Reversed Correlation Algorithm for Improved Classification of Emotions. Sensors 2020; 20(24):7083. PMID: 33321895; PMCID: PMC7764031; DOI: 10.3390/s20247083
Abstract
Given the growing interest in electroencephalography for enhancing human-computer interaction (HCI) and developing brain-computer interfaces (BCIs) for control and monitoring applications, efficient information retrieval from EEG sensors is of great importance. It is difficult due to noise from internal and external artifacts and physiological interference. EEG-based emotion recognition can be enhanced by selecting the features to be taken into account in further analysis; automatic feature selection for EEG signals is therefore an important research area. We propose a multistep hybrid approach incorporating the Reversed Correlation Algorithm for automated selection of frequency band-electrode combinations. Our method is simple to use and significantly reduces the number of sensors, to only three channels. The proposed method was verified in experiments on the DEAP dataset, and the results were evaluated in terms of classification accuracy for two emotion dimensions: valence and arousal. Compared with other studies, our method achieved classification results that were 4.20-8.44% higher. Moreover, as an unsupervised method, it can be perceived as a universal EEG signal classification technique.
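The core idea, reducing many band-electrode combinations to a handful of informative channels, can be sketched with a simple correlation-based ranking. Note this is a simplified stand-in for the paper's Reversed Correlation Algorithm, shown only to illustrate the selection step; the function name, array shapes, and toy data are assumptions for the example.

```python
import numpy as np

def select_channels(features, labels, n_channels=3):
    """Rank electrodes by the strongest |correlation| any of their
    frequency-band features shows with the labels, and keep the top few.

    features: array (n_trials, n_electrodes, n_bands) of band powers
    labels:   array (n_trials,), e.g. binarized valence
    Returns the indices of the selected electrodes.
    """
    n_trials, n_ch, n_bands = features.shape
    scores = np.zeros(n_ch)
    for ch in range(n_ch):
        for b in range(n_bands):
            r = np.corrcoef(features[:, ch, b], labels)[0, 1]
            scores[ch] = max(scores[ch], abs(r))
    return np.argsort(scores)[::-1][:n_channels]

# Toy data: electrode 5's band-2 power tracks the label.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 32, 4))
y = rng.integers(0, 2, size=200)
X[:, 5, 2] += 2.0 * y  # inject a label-dependent power increase
selected = select_channels(X, y)
```

A downstream classifier would then be trained only on the selected electrodes' features.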
5. Ritter C, Vongpaisal T. Multimodal and spectral degradation effects on speech and emotion recognition in adult listeners. Trends Hear 2018; 22:2331216518804966. PMID: 30378469; PMCID: PMC6236866; DOI: 10.1177/2331216518804966
Abstract
For cochlear implant (CI) users, degraded spectral input hampers the understanding of prosodic vocal emotion, especially in difficult listening conditions. Using a vocoder simulation of CI hearing, we examined the extent to which informative multimodal cues in a talker's spoken expressions improve normal-hearing (NH) adults' speech and emotion perception under different levels of spectral degradation (two, three, four, and eight spectral bands). Participants repeated words verbatim and identified emotions (among four alternative options: happy, sad, angry, and neutral) in meaningful sentences that were semantically congruent with the expression of the intended emotion. Sentences were presented in their natural speech form and in speech sampled through a noise-band vocoder, in sound-only (auditory) and video (auditory-visual) recordings of a female talker. Visual information had a more pronounced benefit in enhancing speech recognition in the lower spectral band conditions. Spectral degradation, however, did not interfere with emotion recognition performance when dynamic visual cues in the talker's expression were provided, as participants scored at ceiling levels across all spectral band conditions. Our use of familiar sentences containing congruent semantic and prosodic information has high ecological validity, which likely optimized listener performance under simulated CI hearing and may better predict CI users' outcomes in everyday listening contexts.
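A noise-band vocoder of the kind used here to simulate CI hearing splits the signal into a few frequency bands, extracts each band's amplitude envelope, and uses it to modulate band-limited noise. The sketch below shows the general technique, not the authors' exact processing chain; band edges, filter orders, and the 160 Hz envelope cutoff are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def noise_vocode(signal, fs, n_bands=4, f_lo=100.0, f_hi=7000.0):
    """Minimal noise-band vocoder: log-spaced analysis bands, rectified
    and low-pass-filtered envelopes, envelope-modulated noise carriers."""
    edges = np.geomspace(f_lo, f_hi, n_bands + 1)
    rng = np.random.default_rng(0)
    out = np.zeros_like(signal)
    env_sos = butter(2, 160.0, btype="lowpass", fs=fs, output="sos")
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, signal)
        # envelope: full-wave rectification, then low-pass smoothing
        env = np.clip(sosfiltfilt(env_sos, np.abs(band)), 0.0, None)
        noise = rng.standard_normal(len(signal))
        out += sosfiltfilt(sos, noise) * env
    return out

fs = 16000
t = np.arange(fs) / fs
# amplitude-modulated tone as a stand-in for a speech recording
speechlike = np.sin(2 * np.pi * 440 * t) * (1 + 0.5 * np.sin(2 * np.pi * 3 * t))
vocoded = noise_vocode(speechlike, fs)
```

Fewer bands discard more spectral detail, which is why the two-band condition degrades speech recognition the most.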
Affiliation(s)
- Chantel Ritter
- Department of Psychology, MacEwan University, Alberta, Canada
- Tara Vongpaisal
- Department of Psychology, MacEwan University, Alberta, Canada
6. Nuber S, Jacob H, Kreifelts B, Martinelli A, Wildgruber D. Attenuated impression of irony created by the mismatch of verbal and nonverbal cues in patients with autism spectrum disorder. PLoS One 2018; 13:e0205750. PMID: 30321214; PMCID: PMC6188779; DOI: 10.1371/journal.pone.0205750
Abstract
Perception of irony has been observed to be impaired in adults with autism spectrum disorder. In typically developed adults, a mismatch between verbal and nonverbal emotional cues can be perceived as an expression of irony even in the absence of any further contextual information. In this study, we evaluated to what extent adults with high-functioning autism perceive this incongruence as expressing irony. Our results show that incongruent verbal and nonverbal signals create an impression of irony significantly less often in participants with high-functioning autism than in typically developed control subjects. The extent of overall autistic symptomatology, as measured with the autism-spectrum quotient (AQ), however, does not correlate with the reduced tendency to interpret incongruent stimuli as expressing irony. The attenuation in irony attribution might therefore be related to specific subdomains of autistic traits, such as a reduced tendency to interpret communicative signals in terms of complex intentional mental states. The observed differences in irony attribution support the assumption that a less pronounced tendency to engage in higher-order mentalization processes might underlie the impairment of pragmatic language understanding in high-functioning autism.
Affiliation(s)
- Simon Nuber
- Department of Psychiatry and Psychotherapy, University Hospital Tübingen, Tübingen, Germany
- Heike Jacob
- Department of Psychiatry and Psychotherapy, University Hospital Tübingen, Tübingen, Germany
- Benjamin Kreifelts
- Department of Psychiatry and Psychotherapy, University Hospital Tübingen, Tübingen, Germany
- Anne Martinelli
- Department of Child and Adolescent Psychiatry, University Hospital Frankfurt, Frankfurt am Main, Germany
- Dirk Wildgruber
- Department of Psychiatry and Psychotherapy, University Hospital Tübingen, Tübingen, Germany
7. Association between Neuroticism and Emotional Face Processing. Sci Rep 2017; 7:17669. PMID: 29247161; PMCID: PMC5732281; DOI: 10.1038/s41598-017-17706-2
Abstract
Neuroticism is one of the “Big Five” personality factors and is characterized by a tendency to experience negative affect. We aimed to investigate how neuroticism influences the neural correlates of processing emotional facial expressions. Sixty-eight healthy participants were presented with dynamic emotional facial stimuli (happy, neutral, or angry) during functional MRI. Brain activations for the contrasts emotional vs. neutral, happy vs. neutral, and angry vs. neutral were correlated with individuals’ neuroticism scores as obtained with the NEO Five Factor Inventory questionnaire, and additionally examined for gender differences. The bilateral middle temporal gyrus (MTG) was identified as a key region in the processing of emotional faces, and activations within this region correlated with individual neuroticism scores. Although female participants showed significantly stronger activation differences between emotional and neutral facial expressions in the left MTG, the correlation between activation and neuroticism scores did not show any significant gender differences. Our results offer, for the first time, a biological correlate within the face-processing network for the enhanced reactivity of neurotic individuals to emotional facial expressions, which occurs similarly in male and female participants.
8. Chalah MA, Ayache SS. Alexithymia in multiple sclerosis: A systematic review of literature. Neuropsychologia 2017; 104:31-47. DOI: 10.1016/j.neuropsychologia.2017.07.034
9. Niedtfeld I. Experimental investigation of cognitive and affective empathy in borderline personality disorder: Effects of ambiguity in multimodal social information processing. Psychiatry Res 2017; 253:58-63. PMID: 28351003; DOI: 10.1016/j.psychres.2017.03.037
Abstract
Borderline personality disorder (BPD) is characterized by affective instability and interpersonal problems. In the context of social interaction, impairments in empathy are proposed to result in inadequate social behavior. In contrast to findings of reduced cognitive empathy, some authors have suggested enhanced emotional empathy in BPD. We investigated whether ambiguity leads to decreased cognitive or emotional empathy in BPD. Thirty-four patients with BPD and thirty-two healthy controls viewed video clips in which emotional information was conveyed through prosody, facial expression, and speech content. Experimental conditions were designed to induce ambiguity by presenting neutral valence in one of these communication channels. Subjects were asked to indicate the actors' emotional valence, their decision confidence, and their own emotional state. BPD patients showed increased emotional empathy when neutral stories comprised nonverbally expressed emotions. In contrast, when all channels were emotional, patients showed lower emotional empathy than healthy controls. Regarding cognitive empathy, there were no significant differences between BPD patients and healthy control subjects in recognition accuracy, but decision confidence was reduced in BPD. These results suggest that patients with BPD show altered emotional empathy, experiencing higher rates of emotional contagion when emotions are expressed nonverbally. The latter may contribute to misunderstandings and inadequate social behavior.
Affiliation(s)
- Inga Niedtfeld
- Department of Psychosomatic Medicine, Central Institute of Mental Health Mannheim, Medical Faculty Mannheim/Heidelberg University, Germany
10. Moradi Z, Mantini D, Yankouskaya A, Hewstone M, Humphreys GW. Changes in intrinsic functional connectivity and group relevant salience: The case of sport rivalry. Behav Brain Res 2017; 332:126-135. PMID: 28487224; DOI: 10.1016/j.bbr.2017.04.045
Abstract
Studies have shown that attending to salient group-relevant information can increase BOLD activity across distributed neural networks. However, it is unclear how attending to group-relevant information changes the functional connectivity across these networks. We investigated this issue by combining resting-state and task-based fMRI. In the task, football fans learned associations between arbitrary geometric shapes and the badges of the in-group, rival, and neutral football teams. Upon learning, participants viewed different badge/shape pairs and judged whether the viewed pair was a match or a mismatch. Whole-brain analyses found increased activity in the IFG, DLPFC, AI, fusiform gyrus, precuneus, and pSTS (all in the left hemisphere) for the rival over the in-group mismatch. Further, ROI analyses revealed larger beta values for the rival badge in the left pSTS, left AI, and left IFG, whereas larger beta values were found in the left pSTS and left IFG (but not the AI) for the in-group shape. The intrinsic functional connectivity analyses revealed that, compared to pre-task, post-task functional connectivity was decreased between the left DLPFC and the left AI. In contrast, it was increased between the left IFG and the left AI, and this increase correlated with the difference in RT for the rival vs. in-group team. Our findings suggest that attending to group-relevant information differentially affects the strength of functional coupling in attention networks, and that this can be explained by the saliency of the group-relevant information.
Affiliation(s)
- Zargol Moradi
- Department of Experimental Psychology, University of Oxford, Ewert House, Oxford, OX2 7SG, UK
- Dante Mantini
- Department of Experimental Psychology, University of Oxford, Ewert House, Oxford, OX2 7SG, UK; Department of Health Sciences and Technology, ETH Zürich, Zürich, Switzerland
- Alla Yankouskaya
- Department of Experimental Psychology, University of Oxford, Ewert House, Oxford, OX2 7SG, UK
- Miles Hewstone
- Department of Experimental Psychology, University of Oxford, Ewert House, Oxford, OX2 7SG, UK
- Glyn W Humphreys
- Department of Experimental Psychology, University of Oxford, Ewert House, Oxford, OX2 7SG, UK
11. Tomlin RJ, Stevenage SV, Hammond S. Putting the pieces together: Revealing face–voice integration through the facial overshadowing effect. Visual Cognition 2016. DOI: 10.1080/13506285.2016.1245230
Affiliation(s)
- Rebecca J. Tomlin
- Department of Psychology, University of Southampton, Southampton, UK
- Sarah Hammond
- Department of Psychology, University of Southampton, Southampton, UK
12. Mitchell RLC, Jazdzyk A, Stets M, Kotz SA. Recruitment of Language-, Emotion- and Speech-Timing Associated Brain Regions for Expressing Emotional Prosody: Investigation of Functional Neuroanatomy with fMRI. Front Hum Neurosci 2016; 10:518. PMID: 27803656; PMCID: PMC5067951; DOI: 10.3389/fnhum.2016.00518
Abstract
We aimed to progress understanding of prosodic emotion expression by establishing which brain regions are active when expressing specific emotions, which are activated irrespective of the target emotion, and which vary in activation intensity depending on individual performance. BOLD contrast data were acquired whilst participants spoke nonsense words in happy, angry, or neutral tones, or performed jaw movements. Emotion-specific analyses demonstrated that when expressing angry prosody, activated brain regions included the inferior frontal and superior temporal gyri, the insula, and the basal ganglia. When expressing happy prosody, the activated brain regions again included the superior temporal gyrus, insula, and basal ganglia, with additional activation in the anterior cingulate. Conjunction analysis confirmed that the superior temporal gyrus and basal ganglia were activated regardless of the specific emotion concerned. Nevertheless, disjunctive comparisons between the expression of angry and happy prosody established that anterior cingulate activity was significantly higher for angry than for happy prosody production. The degree of inferior frontal gyrus activity correlated with the ability to express the target emotion through prosody. We conclude that expressing prosodic emotions (vs. neutral intonation) requires generic brain regions involved in comprehending numerous aspects of language, emotion-related processes such as experiencing emotions, and the time-critical integration of speech information.
Affiliation(s)
- Rachel L C Mitchell
- Centre for Affective Disorders, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, UK
- Manuela Stets
- Department of Psychology, University of Essex, Colchester, UK
- Sonja A Kotz
- Section of Neuropsychology and Psychopharmacology, Maastricht University, Maastricht, Netherlands
13. Jacob H, Kreifelts B, Nizielski S, Schütz A, Wildgruber D. Effects of Emotional Intelligence on the Impression of Irony Created by the Mismatch between Verbal and Nonverbal Cues. PLoS One 2016; 11:e0163211. PMID: 27716831; PMCID: PMC5055337; DOI: 10.1371/journal.pone.0163211
Abstract
Emotional information is conveyed through verbal and nonverbal signals, with nonverbal cues often being considered the decisive factor in the judgment of others' emotional states. The aim of the present study was to examine how verbal and nonverbal cues are integrated by perceivers. More specifically, we tested whether the mismatch between verbal and nonverbal information was perceived as an expression of irony. Moreover, we investigated the effects of emotional intelligence on the impression of irony. The findings revealed that the mismatch between verbal and nonverbal information created the impression of irony. Furthermore, participants higher in emotional intelligence were faster at rating such stimuli as ironic expressions.
Affiliation(s)
- Heike Jacob
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Benjamin Kreifelts
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Sophia Nizielski
- Department of Psychology, Chemnitz University of Technology, Chemnitz, Germany
- Astrid Schütz
- Department of Psychology, University of Bamberg, Bamberg, Germany
- Dirk Wildgruber
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
14. Vogel B, Brück C, Jacob H, Eberle M, Wildgruber D. Integration of verbal and nonverbal emotional signals in patients with schizophrenia: Decreased nonverbal dominance. Psychiatry Res 2016; 241:98-103. PMID: 27156031; DOI: 10.1016/j.psychres.2016.03.050
Abstract
In day-to-day social interaction, emotions are usually expressed by verbal (e.g. spoken words) and nonverbal signals (e.g. facial expressions, prosody). In case of conflicting signals, nonverbal cues are perceived as the more reliable source of information. Deficits in interpreting nonverbal signals - as described for patients with schizophrenic disorders - might interfere with the ability to integrate verbal and nonverbal social cues into a meaningful whole. The aim of this study was to examine how schizophrenic disorders influence the integration of verbal and nonverbal signals. For this purpose, short video sequences were presented to 21 patients with schizophrenia and 21 healthy controls. Each sequence showed an actor speaking a short sentence with independently varying emotional connotations at the verbal and the nonverbal level. The participants rated the valence of the speaker's emotional state on a four-point scale (from very negative to very positive). The relative impact of nonverbal cues as compared to verbal cues on these ratings was evaluated. Both groups based their decisions primarily on nonverbal information; however, this effect was significantly less prominent in the patient group. Patients tended to base their decisions less on nonverbal signals and more on verbal information than healthy controls did.
Affiliation(s)
- Bastian Vogel
- Department of General Psychiatry and Psychotherapy, University of Tübingen, Calwerstraße 14, 72076 Tübingen, Germany.
- Carolin Brück
- Department of General Psychiatry and Psychotherapy, University of Tübingen, Calwerstraße 14, 72076 Tübingen, Germany
- Heike Jacob
- Department of General Psychiatry and Psychotherapy, University of Tübingen, Calwerstraße 14, 72076 Tübingen, Germany
- Mark Eberle
- Department of General Psychiatry and Psychotherapy, University of Tübingen, Calwerstraße 14, 72076 Tübingen, Germany
- Dirk Wildgruber
- Department of General Psychiatry and Psychotherapy, University of Tübingen, Calwerstraße 14, 72076 Tübingen, Germany
15
Farrugia N, Jakubowski K, Cusack R, Stewart L. Tunes stuck in your brain: The frequency and affective evaluation of involuntary musical imagery correlate with cortical structure. Conscious Cogn 2015; 35:66-77. [PMID: 25978461 DOI: 10.1016/j.concog.2015.04.020] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/21/2014] [Revised: 03/26/2015] [Accepted: 04/27/2015] [Indexed: 01/15/2023]
Abstract
Recent years have seen a growing interest in the neuroscience of spontaneous cognition. One form of such cognition is involuntary musical imagery (INMI), the non-pathological and everyday experience of having music in one's head in the absence of an external stimulus. In this study, aspects of INMI, including frequency and affective evaluation, were measured by self-report in 44 subjects and related to variation in brain structure in these individuals. Frequency of INMI was related to cortical thickness in regions of right frontal and temporal cortices as well as the anterior cingulate and left angular gyrus. Affective aspects of INMI, namely the extent to which subjects wished to suppress INMI or considered it helpful, were related to gray matter volume in right temporopolar and parahippocampal cortices, respectively. These results provide the first evidence that INMI is a common internal experience recruiting brain networks involved in perception, emotions, memory and spontaneous thoughts.
Affiliation(s)
- Nicolas Farrugia
- Goldsmiths, University of London, New Cross, London SE14 6NW, UK.
- Kelly Jakubowski
- Goldsmiths, University of London, New Cross, London SE14 6NW, UK; Medical Research Council, Cognition and Brain Sciences Unit, Cambridge CB2 7EF, UK
- Rhodri Cusack
- Brain and Mind Institute, Western University, London, Ontario N6A 5B7, Canada
- Lauren Stewart
- Goldsmiths, University of London, New Cross, London SE14 6NW, UK
16
Keogh E. Gender differences in the nonverbal communication of pain: A new direction for sex, gender, and pain research? Pain 2014; 155:1927-1931. [DOI: 10.1016/j.pain.2014.06.024] [Citation(s) in RCA: 34] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/16/2013] [Revised: 06/20/2014] [Accepted: 06/30/2014] [Indexed: 12/30/2022]
17
Gilboa-Schechtman E, Shachar-Lavie I. More than a face: a unified theoretical perspective on nonverbal social cue processing in social anxiety. Front Hum Neurosci 2013; 7:904. [PMID: 24427129 PMCID: PMC3876460 DOI: 10.3389/fnhum.2013.00904] [Citation(s) in RCA: 34] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/25/2013] [Accepted: 12/10/2013] [Indexed: 01/31/2023] Open
Abstract
Processing of nonverbal social cues (NVSCs) is essential to interpersonal functioning and is particularly relevant to models of social anxiety. This article provides a review of the literature on NVSC processing from the perspective of social rank and affiliation biobehavioral systems (ABSs), based on functional analysis of human sociality. We examine the potential of this framework for integrating cognitive, interpersonal, and evolutionary accounts of social anxiety. We argue that NVSCs are uniquely suited to rapid and effective conveyance of emotional, motivational, and trait information and that various channels are differentially effective in transmitting such information. First, we review studies on perception of NVSCs through face, voice, and body. We begin with studies that utilized information processing or imaging paradigms to assess NVSC perception. This research demonstrated that social anxiety is associated with biased attention to, and interpretation of, emotional facial expressions (EFEs) and emotional prosody. Findings regarding body and posture remain scarce. Next, we review studies on NVSC expression, which pinpointed links between social anxiety and disturbances in eye gaze, facial expressivity, and vocal properties of spontaneous and planned speech. Again, links between social anxiety and posture were understudied. Although cognitive, interpersonal, and evolutionary theories have described different pathways to social anxiety, all three models focus on interrelations among cognition, subjective experience, and social behavior. NVSC processing and production comprise the juncture where these theories intersect. In light of the conceptualizations emerging from the review, we highlight several directions for future research including focus on NVSCs as indexing reactions to changes in belongingness and social rank, the moderating role of gender, and the therapeutic opportunities offered by embodied cognition to treat social anxiety.
Affiliation(s)
- Eva Gilboa-Schechtman
- Department of Psychology, The Gonda Brain Science Center, Bar-Ilan University, Ramat Gan, Israel
- Iris Shachar-Lavie
- Department of Psychology, The Gonda Brain Science Center, Bar-Ilan University, Ramat Gan, Israel
18
Show me how you walk and I tell you how you feel - a functional near-infrared spectroscopy study on emotion perception based on human gait. Neuroimage 2013; 85 Pt 1:380-90. [PMID: 23921096 DOI: 10.1016/j.neuroimage.2013.07.078] [Citation(s) in RCA: 40] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/14/2013] [Revised: 07/22/2013] [Accepted: 07/29/2013] [Indexed: 11/20/2022] Open
Abstract
The ability to recognize and adequately interpret emotional states in others plays a fundamental role in regulating social interaction. Body language is an essential element of nonverbal communication that is often perceived prior to facial expression. However, the neural networks that underlie the processing of emotionally expressive body movement and body posture are poorly understood. Thirty-three healthy subjects were investigated using functional near-infrared spectroscopy (fNIRS), an optically based imaging method, during the performance of a newly developed emotion discrimination paradigm consisting of faceless avatars expressing fearful, angry, sad, happy or neutral gait patterns. Participants were instructed to judge (a) the presented emotional state (emotion task) and (b) the observed walking speed of the respective avatar (speed task). We measured increases in cortical oxygenated haemoglobin (O2HB) in response to visual stimulation during emotion discrimination. These O2HB concentration changes were enhanced for negative emotions in contrast to neutral gait sequences in right occipito-temporal and left temporal and temporo-parietal brain regions. Moreover, fearful and angry bodies elicited greater activation increases during the emotion task than during the speed task. Haemodynamic responses were correlated with a number of behavioural measures; a positive relationship between emotion regulation strategy preference and O2HB concentration increases after sad walks was mediated by the ability to accurately categorize sad walks. Our results support the idea of a distributed brain network involved in the recognition of bodily emotion expression that comprises visual association areas as well as cortical regions specific to body and movement perception that are also sensitive to emotion. This network is activated less when the emotion is not intentionally processed (i.e., during the speed task). Furthermore, activity of this perceptive network is indirectly connected to active emotion regulation processes, mediated by the ability to correctly recognize emotions. We conclude that a full understanding of emotion perception and its neural substrate requires the investigation of dynamic representations and means of expression other than the face.
19
Goerlich-Dobre KS, Witteman J, Schiller NO, van Heuven VJP, Aleman A, Martens S. Blunted feelings: alexithymia is associated with a diminished neural response to speech prosody. Soc Cogn Affect Neurosci 2013; 9:1108-17. [PMID: 23681887 DOI: 10.1093/scan/nst075] [Citation(s) in RCA: 25] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/12/2023] Open
Abstract
How we perceive emotional signals from our environment depends on our personality. Alexithymia, a personality trait characterized by difficulties in emotion regulation, has been linked to aberrant brain activity for visual emotional processing. Whether alexithymia also affects the brain's perception of emotional speech prosody is currently unknown. We used functional magnetic resonance imaging to investigate the impact of alexithymia on hemodynamic activity of three a priori regions of the prosody network: the superior temporal gyrus (STG), the inferior frontal gyrus and the amygdala. Twenty-two subjects performed an explicit task (emotional prosody categorization) and an implicit task (metrical stress evaluation) on the same prosodic stimuli. Irrespective of task, alexithymia was associated with a blunted response of the right STG and the bilateral amygdalae to angry, surprised and neutral prosody. Individuals with difficulty describing feelings deactivated the left STG and the bilateral amygdalae to a lesser extent in response to angry compared with neutral prosody, suggesting that they perceived angry prosody as relatively more salient than neutral prosody. In conclusion, alexithymia may be associated with a generally blunted neural response to speech prosody. Such restricted prosodic processing may contribute to problems in social communication associated with this personality trait.
Affiliation(s)
- Katharina Sophia Goerlich-Dobre
- Neuroimaging Center, Department of Neuroscience, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands, Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen University, Aachen, Germany, LIBC Leiden Institute for Brain and Cognition, Leiden University, Leiden, The Netherlands, LUCL Leiden University Centre for Linguistics, Leiden University, Leiden, The Netherlands, and Department of Psychology, University of Groningen, Groningen, The Netherlands
- Jurriaan Witteman
- Neuroimaging Center, Department of Neuroscience, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands, Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen University, Aachen, Germany, LIBC Leiden Institute for Brain and Cognition, Leiden University, Leiden, The Netherlands, LUCL Leiden University Centre for Linguistics, Leiden University, Leiden, The Netherlands, and Department of Psychology, University of Groningen, Groningen, The Netherlands
- Niels O Schiller
- Neuroimaging Center, Department of Neuroscience, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands, Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen University, Aachen, Germany, LIBC Leiden Institute for Brain and Cognition, Leiden University, Leiden, The Netherlands, LUCL Leiden University Centre for Linguistics, Leiden University, Leiden, The Netherlands, and Department of Psychology, University of Groningen, Groningen, The Netherlands
- Vincent J P van Heuven
- Neuroimaging Center, Department of Neuroscience, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands, Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen University, Aachen, Germany, LIBC Leiden Institute for Brain and Cognition, Leiden University, Leiden, The Netherlands, LUCL Leiden University Centre for Linguistics, Leiden University, Leiden, The Netherlands, and Department of Psychology, University of Groningen, Groningen, The Netherlands
- André Aleman
- Neuroimaging Center, Department of Neuroscience, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands, Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen University, Aachen, Germany, LIBC Leiden Institute for Brain and Cognition, Leiden University, Leiden, The Netherlands, LUCL Leiden University Centre for Linguistics, Leiden University, Leiden, The Netherlands, and Department of Psychology, University of Groningen, Groningen, The Netherlands
- Sander Martens
- Neuroimaging Center, Department of Neuroscience, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands, Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen University, Aachen, Germany, LIBC Leiden Institute for Brain and Cognition, Leiden University, Leiden, The Netherlands, LUCL Leiden University Centre for Linguistics, Leiden University, Leiden, The Netherlands, and Department of Psychology, University of Groningen, Groningen, The Netherlands
20
Jacob H, Brück C, Domin M, Lotze M, Wildgruber D. I can't keep your face and voice out of my head: neural correlates of an attentional bias toward nonverbal emotional cues. Cereb Cortex 2014; 24:1460-73. [PMID: 23382516 DOI: 10.1093/cercor/bhs417] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
Abstract
Emotional information can be conveyed by verbal and nonverbal cues, with the latter often suggested to exert a greater influence in shaping our perceptions of others. The present functional magnetic resonance imaging study sought to explore attentional biases toward nonverbal signals by investigating the interaction of verbal and nonverbal cues. Results obtained in this study underline previous suggestions of a "nonverbal dominance" in emotion communication by evidencing implicit effects of nonverbal cues on emotion judgments even when attention is directed away from nonverbal signals and focused on verbal cues. Attentional biases toward nonverbal signals appeared to be reflected in increasing activation of the dorsolateral prefrontal cortex (DLPFC), assumed to reflect increasing difficulty in suppressing nonverbal cues during task conditions that required shifting attention away from nonverbal signals. Aside from the DLPFC, the results suggest that the right amygdala plays a role in attention control mechanisms related to the processing of emotional cues. Analyses conducted to determine the cerebral correlates of the individual ability to shift attention between verbal and nonverbal sources of information indicated that higher task-switching abilities seem to be associated with the up-regulation of right amygdala activation during explicit judgments of nonverbal cues, whereas difficulties in task-switching seem to be related to a down-regulation.
Affiliation(s)
- Heike Jacob
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany