1
Yu L, Wang W, Li Z, Ren Y, Liu J, Jiao L, Xu Q. Alexithymia modulates emotion concept activation during facial expression processing. Cereb Cortex 2024; 34:bhae071. PMID: 38466112. DOI: 10.1093/cercor/bhae071.
Abstract
Alexithymia is characterized by difficulties in emotional information processing, but the underlying reasons for these deficits are not fully understood. The present study investigated the mechanism underlying emotional deficits in alexithymia. Using the Toronto Alexithymia Scale-20, we recruited college students with high alexithymia (n = 24) or low alexithymia (n = 24). Participants judged the emotional consistency of facial expressions and contextual sentences while their event-related potentials (ERPs) were recorded. Behaviorally, the high alexithymia group showed longer response times than the low alexithymia group when processing facial expressions. The ERP results showed that the high alexithymia group had more negative-going N400 amplitudes than the low alexithymia group in the incongruent condition. More negative N400 amplitudes were also associated with slower responses to facial expressions. Furthermore, machine learning analyses based on N400 amplitudes could distinguish the high alexithymia group from the low alexithymia group in the incongruent condition. Overall, these findings suggest poorer facial emotion perception in the high alexithymia group, potentially due to difficulty in spontaneously activating emotion concepts. The findings have important implications for affective science and for clinical intervention in alexithymia-related affective disorders.
Affiliation(s)
- Linwei Yu
- Department of Psychology, Ningbo University, Ningbo 315211, China
- Weihan Wang
- Department of Psychology, Ningbo University, Ningbo 315211, China
- Zhiwei Li
- Department of Psychology, Ningbo University, Ningbo 315211, China
- Yi Ren
- Department of Psychology, Ningbo University, Ningbo 315211, China
- Jiabin Liu
- Beijing Key Laboratory of Applied Experimental Psychology, National Demonstration Center for Experimental Psychology Education (Beijing Normal University), Faculty of Psychology, Beijing Normal University, Beijing 100875, China
- Lan Jiao
- Department of Psychology, Ningbo University, Ningbo 315211, China
- Qiang Xu
- Department of Psychology, Ningbo University, Ningbo 315211, China
2
Xu Q, Sommer W, Recio G. Control over emotional facial expressions: Evidence from facial EMG and ERPs in a Stroop-like task. Biol Psychol 2023; 181:108611. PMID: 37302517. DOI: 10.1016/j.biopsycho.2023.108611.
Abstract
Facial expressions carry important social signals that must be precisely regulated despite potentially conflicting demands of veridicality, communicative intent, and the social situation. In a sample of 19 participants, we investigated how deliberate control over two facial expressions (smiles and frowns) is affected by their emotional congruency with the expressions of adult and infant counterparts. In a Stroop-like task requiring deliberate expressions of anger or happiness, we examined the impact of task-irrelevant background pictures of adults and infants showing negative, neutral, or positive facial expressions. Participants' deliberate expressions were measured with the electromyogram (EMG) of the M. zygomaticus major and M. corrugator supercilii. The latencies of EMG onsets revealed similar congruency effects for smiles and frowns, with significant facilitation and inhibition components relative to the neutral condition. Interestingly, the facilitation of frown responses by negative facial expressions was significantly smaller for infant than for adult background faces. This diminished facilitation of frowns by infants' expressions of distress may relate to the activation of caregiving behavior or empathy. We investigated the neural correlates of these performance effects by recording event-related potentials (ERPs). Increased ERP amplitudes were observed in incongruent relative to neutral conditions, revealing interference effects on both types of deliberate facial expressions at different processing stages, namely structural face encoding (N170), conflict monitoring (N2), and semantic analysis (N400).
Affiliation(s)
- Qiang Xu
- Department of Psychology, Ningbo University, Ningbo, China
- Werner Sommer
- Department of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
- Guillermo Recio
- Departament de Psicologia Clínica i Psicobiologia, Universitat de Barcelona, Barcelona, Spain; Institut de Neurociències, Universitat de Barcelona, Barcelona, Spain
3
Zhou Q, Du J, Gao R, Hu S, Yu T, Wang Y, Pan NC. Discriminative neural pathways for perception-cognition activity of color and face in the human brain. Cereb Cortex 2023; 33:1972-1984. PMID: 35580851. DOI: 10.1093/cercor/bhac186.
Abstract
Human performance can be examined through a visual lens. However, how the identification of psychophysical colors and emotional faces maps onto perceptual visual pathways may not be resolvable with simple detection tasks. In particular, how the dorsal and ventral visual processing streams handle discriminative visual perception and subsequent cognitive activity remains obscure. We explored these issues using stereoelectroencephalography recordings obtained from patients with pharmacologically resistant epilepsy. Delayed match-to-sample paradigms were used to analyze the processing of simple colors and complex emotional faces in the human brain. We showed that the angular-cuneus gyrus acts as a pioneer in discriminating the two features, and that dorsal regions, including the middle frontal gyrus (MFG) and postcentral gyrus, as well as ventral regions, such as the middle temporal gyrus (MTG) and posterior superior temporal sulcus (pSTS), were involved in processing incongruent colors and faces. Critically, beta- and gamma-band activities between the cuneus and MTG and between the cuneus and pSTS tuned a separate pathway of incongruency processing. In addition, the posterior insular gyrus, fusiform gyrus, and MFG supported attentional modulation of the two features via alpha-band activities. These findings suggest a neural basis for the discriminative pathways of perception-cognition activities in the human brain.
Affiliation(s)
- Qilin Zhou
- Department of Neurology, Xuanwu Hospital, Capital Medical University, No. 45, Changchun Street, Xicheng District, Beijing, 100053, China; Beijing Key Laboratory of Neuromodulation, No. 45, Changchun Street, Xicheng District, Beijing, 100053, China
- Jialin Du
- Department of Pharmacy, Phase I Clinical Trial Center, Xuanwu Hospital, Capital Medical University, No. 45, Changchun Street, Xicheng District, Beijing, 100053, China
- Runshi Gao
- Beijing Institute of Functional Neurosurgery, Xuanwu Hospital, Capital Medical University, No. 45, Changchun Street, Xicheng District, Beijing, 100053, China
- Shimin Hu
- Department of Neurology, Xuanwu Hospital, Capital Medical University, No. 45, Changchun Street, Xicheng District, Beijing, 100053, China; Beijing Key Laboratory of Neuromodulation, No. 45, Changchun Street, Xicheng District, Beijing, 100053, China
- Tao Yu
- Beijing Institute of Functional Neurosurgery, Xuanwu Hospital, Capital Medical University, No. 45, Changchun Street, Xicheng District, Beijing, 100053, China
- Yuping Wang
- Department of Neurology, Xuanwu Hospital, Capital Medical University, No. 45, Changchun Street, Xicheng District, Beijing, 100053, China; Beijing Key Laboratory of Neuromodulation, No. 45, Changchun Street, Xicheng District, Beijing, 100053, China; Institute of Sleep and Consciousness Disorders, Center of Epilepsy, Beijing Institute for Brain Disorders, Capital Medical University, No. 10, Xi Tou Tiao, Youanmenwai, Fengtai District, Beijing, 100069, China
- Na Clara Pan
- Department of Neurology, Xuanwu Hospital, Capital Medical University, No. 45, Changchun Street, Xicheng District, Beijing, 100053, China; Beijing Key Laboratory of Neuromodulation, No. 45, Changchun Street, Xicheng District, Beijing, 100053, China
4
Zhang R, Hu Y, Zhang J, Wu Y, Huang L. Event-related potential response to drivers' facial expressions in an online car-hailing scene. Psych J 2022; 12:195-201. PMID: 36336336. DOI: 10.1002/pchj.613.
Abstract
Recognizing facial expressions is crucial for adaptive social interaction. Prior empirical research on facial expression processing has primarily focused on isolated faces; in everyday life, however, facial expressions appear embedded in surrounding scenes. In this study, we examined how an online car-hailing scene affects the processing of facial expressions by recording event-related potentials while participants viewed online car-hailing orders in which neutral or happy driver faces were embedded (with type of vehicle, driver rating, driver surname, and level of reputation controlled). A total of 35 female volunteers participated and were asked to judge which facial expressions appearing in the online car-hailing scenes were more trustworthy. The results revealed an interaction between facial expression scene, brain area, and electrode site in the late positive potential (LPP): happy faces elicited larger amplitudes than neutral ones over parietal areas, and scenes with happy facial expressions yielded shorter latencies than those with neutral ones. As expected, the LPP evoked by happy facial expressions in a scene was larger than that evoked by neutral ones, reflecting motivated attention and motivational response processes. This study highlights the importance of scenes as context in the study of facial expression processing.
Affiliation(s)
- Ran-Ran Zhang
- Department of Psychology, School of Medical Humanitarians, Guizhou Medical University, Guiyang, China
- Yu-Wei Hu
- Department of Psychology, School of Medical Humanitarians, Guizhou Medical University, Guiyang, China
- Jia-Rui Zhang
- Department of Psychology, School of Medical Humanitarians, Guizhou Medical University, Guiyang, China
- Yi-Xun Wu
- Department of Psychology, School of Medical Humanitarians, Guizhou Medical University, Guiyang, China
- Lie-Yu Huang
- Department of Psychology, School of Medical Humanitarians, Guizhou Medical University, Guiyang, China
5
Song S, Wu M, Feng C. Early Influence of Emotional Scenes on the Encoding of Fearful Expressions With Different Intensities: An Event-Related Potential Study. Front Hum Neurosci 2022; 16:866253. PMID: 35652009. PMCID: PMC9150066. DOI: 10.3389/fnhum.2022.866253.
Abstract
Contextual affective information influences the processing of facial expressions at relatively early stages of face processing, but the effect of context on the processing of facial expressions of varying intensity remains unclear. In this study, we investigated the influence of emotional scenes (fearful, happy, and neutral) on the processing of fear expressions at different levels of intensity (high, medium, and low) during the early stages of facial recognition, using event-related potential (ERP) technology. EEG data were collected while participants performed a fearful facial expression recognition task. The results showed that (1) recognition of high-intensity fear expressions was better than that of medium- and low-intensity fear expressions, and facial expression recognition was best when faces appeared in fearful scenes; and (2) emotional scenes modulated N170 amplitudes for fear expressions of different intensities. Specifically, the N170 amplitude induced by high-intensity fear expressions was significantly larger than that induced by low-intensity fear expressions when faces appeared in neutral and fearful scenes. No significant differences were found between the N170 amplitudes induced by high-, medium-, and low-intensity fear expressions when faces appeared in happy scenes. These results suggest that individuals may tend to allocate attentional resources to the processing of face information when the valence of the emotional context and the expression do not conflict, i.e., when conflict is absent (fearful scene with fearful faces) or low (neutral scene with fearful faces).
Affiliation(s)
- Sutao Song
- School of Information Science and Engineering, Shandong Normal University, Jinan, China
- School of Education and Psychology, University of Jinan, Jinan, China
- Meiyun Wu
- State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China
- Chunliang Feng
- Key Laboratory of Brain, Cognition and Education Sciences, Ministry of Education, School of Psychology, Center for Studies of Psychological Application, Guangdong Key Laboratory of Mental Health and Cognitive Science, South China Normal University, Guangzhou, China
6
Calbi M, Siri F, Heimann K, Barratt D, Gallese V, Kolesnikov A, Umiltà MA. How context influences the interpretation of facial expressions: a source localization high-density EEG study on the "Kuleshov effect". Sci Rep 2019; 9:2107. PMID: 30765713. PMCID: PMC6376122. DOI: 10.1038/s41598-018-37786-y.
Abstract
Few studies have explored the specifics of contextual modulation of facial expression processing at the neuronal level. This study fills this gap by employing an original paradigm based on a version of the filmic "Kuleshov effect". High-density EEG was recorded while participants watched film sequences consisting of three shots: a close-up of a target person's neutral face (Face_1), the scene that the target person was looking at (happy, fearful, or neutral), and another close-up of the same target person's neutral face (Face_2). The participants' task was to rate both valence and arousal, and subsequently to categorize the target person's emotional state. Despite a significant behavioural 'context' effect, the electrophysiological indexes still indicate that the face is evaluated as neutral. Specifically, Face_2 elicited a high-amplitude N170 when preceded by neutral contexts, and a high-amplitude Late Positive Potential (LPP) when preceded by emotional contexts, thus showing sensitivity to the evaluative congruence (N170) and incongruence (LPP) between context and Face_2. The LPP activity was mainly underpinned by brain regions involved in processing facial expressions and emotion recognition. Our results shed new light on the temporal and neural correlates of context-sensitivity in the interpretation of facial expressions.
Affiliation(s)
- Marta Calbi
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Francesca Siri
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Katrin Heimann
- Interacting Minds Center, University of Aarhus, Aarhus, Denmark
- Daniel Barratt
- Department of Management, Society and Communication, Copenhagen Business School, Copenhagen, Denmark
- Vittorio Gallese
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy; Institute of Philosophy, School of Advanced Study, University of London, London, UK
- Anna Kolesnikov
- Department of Humanities, Social Sciences and Cultural Industries, University of Parma, Parma, Italy
7
Schoth DE, Wu J, Zhang J, Guo X, Liossi C. Eye-movement behaviours when viewing real-world pain-related images. Eur J Pain 2019; 23:945-956. PMID: 30629782. DOI: 10.1002/ejp.1363.
Abstract
BACKGROUND: Pain-related cues are evolutionarily primed to capture attention, although evidence of attentional biases towards pain-related information is mixed in healthy individuals. The present study explored whether healthy individuals show significantly different eye-movement behaviours when viewing real-world pain-related scenes compared to neutral scenes. The effect of manipulating the threat value of the pain-related scenes via written information was also assessed.
METHODS: Participants were randomized to threatening (n = 28) and non-threatening (n = 27) information conditions. All completed a free-viewing task with real-world pain-related and neutral images while their eye movements were recorded.
RESULTS: Participants made significantly fewer fixations of significantly longer duration when viewing pain-related images compared to neutral images. No significant differences were found between the threatening and non-threatening information groups in their pattern of eye movements.
CONCLUSIONS: This study shows that healthy individuals demonstrate attentional biases to pain-related real-world complex images compared to neutral images. Future research is needed to establish the implications of these biases, particularly in the context of acute pain, for the onset and/or subsequent maintenance of chronic pain conditions.
SIGNIFICANCE: Healthy individuals show different eye-movement behaviours when viewing pain-related scenes than neutral scenes, supporting evolutionary accounts of pain. Implications for the onset and/or maintenance of chronic pain need to be explored.
Affiliation(s)
- Daniel E Schoth
- Pain Research Laboratory, Department of Psychology, University of Southampton, Southampton, UK
- Jun Wu
- School of Psychology, South China Normal University, Guangzhou, China
- Jin Zhang
- Pain Research Laboratory, Department of Psychology, University of Southampton, Southampton, UK
- Xiaoying Guo
- School of Software Engineering, Shanxi University, Taiyuan, China
- Christina Liossi
- Pain Research Laboratory, Department of Psychology, University of Southampton, Southampton, UK
8
Xu Q, Yang Y, Tan Q, Zhang L. Facial Expressions in Context: Electrophysiological Correlates of the Emotional Congruency of Facial Expressions and Background Scenes. Front Psychol 2017; 8:2175. PMID: 29312049. PMCID: PMC5733078. DOI: 10.3389/fpsyg.2017.02175.
Abstract
Facial expressions display personal emotions and indicate an individual's intentions within a social situation, making them extremely important for social interaction. The background scenes in which faces are perceived provide important contextual information for facial expression processing. The purpose of this study was to explore the time course of emotional congruency effects when faces and scenes are processed simultaneously, by recording event-related potentials (ERPs). The behavioral results showed that categorization of facial expressions was faster and more accurate when the face was emotionally congruent rather than incongruent with the emotion displayed by the scene. In the ERPs, late positive potential (LPP) amplitudes were modulated by the emotional congruency between faces and scenes: happy faces elicited larger LPP amplitudes within positive than within negative scenes, and fearful faces elicited larger LPP amplitudes within negative than within positive scenes. No scene effects were found on the P1 and N170 components. These findings indicate that emotional congruency effects can occur in late stages of facial expression processing, reflecting motivated attention allocation.
Affiliation(s)
- Qiang Xu
- Department of Psychology, Ningbo University, Ningbo, China
- Yaping Yang
- Department of Psychology, Ningbo University, Ningbo, China
- Qun Tan
- Department of Psychology, Ningbo University, Ningbo, China
- Lin Zhang
- Department of Psychology, Ningbo University, Ningbo, China
9
Morioka S, Osumi M, Shiotani M, Nobusako S, Maeoka H, Okada Y, Hiyamizu M, Matsuo A. Incongruence between Verbal and Non-Verbal Information Enhances the Late Positive Potential. PLoS One 2016; 11:e0164633. PMID: 27736931. PMCID: PMC5063471. DOI: 10.1371/journal.pone.0164633.
Abstract
Smooth social communication relies on both verbal and non-verbal information. However, when verbal and non-verbal information are incongruent, it remains unclear how observers judge the trustworthiness of the person presenting the incongruence, and which brain activities accompany such judgments. In the present study, we attempted to identify the impact of incongruence between verbal information and facial expression on judged trustworthiness and on brain activity, using event-related potentials (ERPs). Combinations of verbal information [positive/negative] and facial expressions [smile/angry] were presented randomly on a computer screen to 17 healthy volunteers. The trustworthiness of the presented facial expression was evaluated by the amount of donation the observer offered to the person depicted on the screen. In addition, the time required to judge trustworthiness was recorded for each trial. Using electroencephalography, ERPs were obtained by averaging the waveforms recorded while participants judged trustworthiness. The amount donated was significantly lower when the verbal information and facial expression were incongruent, particularly for [negative × smile]. The amplitude of the early posterior negativity (EPN) at temporal electrodes showed no significant differences between conditions. However, the amplitude of the late positive potential (LPP) at parietal electrodes was higher for the incongruent condition [negative × smile] than for the congruent condition [positive × smile]. These results suggest that the LPP amplitude observed over the parietal cortex is involved in processing incongruence between verbal information and facial expression.
Affiliation(s)
- Shu Morioka
- Neurorehabilitation Research Center, Kio University, 4-2-2 Umaminaka, Koryo, Kitakatsuragi-gun, Nara, 635-0832, Japan
- Michihiro Osumi
- Neurorehabilitation Research Center, Kio University, 4-2-2 Umaminaka, Koryo, Kitakatsuragi-gun, Nara, 635-0832, Japan
- Mayu Shiotani
- Department of Rehabilitation, Higashisumiyoshi Morimoto Hospital, 3-2-66 Takaai, Higashisumiyoshi, Osaka-city, Osaka, 546-0014, Japan
- Satoshi Nobusako
- Neurorehabilitation Research Center, Kio University, 4-2-2 Umaminaka, Koryo, Kitakatsuragi-gun, Nara, 635-0832, Japan
- Hiroshi Maeoka
- Neurorehabilitation Research Center, Kio University, 4-2-2 Umaminaka, Koryo, Kitakatsuragi-gun, Nara, 635-0832, Japan
- Yohei Okada
- Neurorehabilitation Research Center, Kio University, 4-2-2 Umaminaka, Koryo, Kitakatsuragi-gun, Nara, 635-0832, Japan
- Makoto Hiyamizu
- Neurorehabilitation Research Center, Kio University, 4-2-2 Umaminaka, Koryo, Kitakatsuragi-gun, Nara, 635-0832, Japan
- Atsushi Matsuo
- Neurorehabilitation Research Center, Kio University, 4-2-2 Umaminaka, Koryo, Kitakatsuragi-gun, Nara, 635-0832, Japan