1.
Edmiston EK, Chase HW, Jones N, Nhan TJ, Phillips ML, Fournier JC. Differential role of fusiform gyrus coupling in depressive and anxiety symptoms during emotion perception. Soc Cogn Affect Neurosci 2024; 19:nsae009. PMID: 38334745; PMCID: PMC10908550; DOI: 10.1093/scan/nsae009. Received 13 Apr 2023; revised 6 Dec 2023; accepted 1 Feb 2024. Open access.
Abstract
Anxiety and depression co-occur; the neural substrates of shared and unique components of these symptoms are not understood. Given emotional alterations in internalizing disorders, we hypothesized that function of regions associated with emotion processing/regulation, including the anterior cingulate cortex (ACC), amygdala and fusiform gyrus (FG), would differentiate these symptoms. Forty-three adults with depression completed an emotional functional magnetic resonance imaging task and the Hamilton Depression and Anxiety Scales. We transformed these scales to examine two orthogonal components, one representing internalizing symptom severity and the other the type of internalizing symptoms (anxiety vs depression). We extracted blood oxygen level dependent signal from FG subregions, ACC, and amygdala and performed generalized psychophysiological interaction analyses to assess relationships between symptoms and brain function. Type of internalizing symptoms was associated with FG3-FG1 coupling (F = 8.14, P = 0.007). Greater coupling was associated with a relatively larger depressive component, demonstrating that intra-fusiform coupling is differentially associated with internalizing symptom type (anxiety vs depression). We also found interactions between task condition and internalizing symptom severity for dorsal (F = 4.51, P = 0.014) and rostral ACC activity (F = 4.27, P = 0.012). Post hoc comparisons revealed that less activity was associated with greater symptom severity during emotion regulation. Functional coupling differences during emotion processing are associated with depressive relative to anxiety symptoms and with internalizing symptom severity. These findings could inform future treatments for depression.
Affiliation(s)
- Elliot Kale Edmiston
  - Department of Psychiatry, University of Massachusetts Chan Medical School, Worcester, MA 01605, United States
- Henry W Chase
  - Department of Psychiatry, University of Pittsburgh School of Medicine, Pittsburgh, PA 15213, United States
- Neil Jones
  - Department of Psychiatry, University of Pittsburgh School of Medicine, Pittsburgh, PA 15213, United States
- Tiffany J Nhan
  - Department of Psychiatry, University of Massachusetts Chan Medical School, Worcester, MA 01605, United States
- Mary L Phillips
  - Department of Psychiatry, University of Pittsburgh School of Medicine, Pittsburgh, PA 15213, United States
- Jay C Fournier
  - Department of Psychiatry and Behavioral Health, The Ohio State University College of Medicine, Columbus, OH 43210, United States
2.
Ping Y, Ouyang Y, Zhang M, Zheng W. Perceiving the outlier in the crowd: The influence of facial identity. Perception 2024; 53:163-179. PMID: 38158215; DOI: 10.1177/03010066231218519.
Abstract
The accurate perception of groups with outliers can help us identify potential risks. However, it is unclear how outliers affect the perception of group emotion. To address this question, we conducted a study on group emotion perception in the context of facial identity. We presented 74 participants with pictures of crowds, and asked them to evaluate the valence ratios and intensity of the crowd by means of the Emotional Aperture Measure. The results revealed that outlier emotions were often overestimated within crowds. Moreover, we found that the emotional expression of a close friend modulated the perception of outliers. Specifically, when a close friend expressed the group emotion, participants overestimated the outlier less than when a close friend expressed the outlier emotion. These results suggest that people can detect outliers within groups, and that their perception of group emotion is influenced by close friends. Thus, we provide evidence that facial identity affects group emotion perception.
Affiliation(s)
- Yuting Ping
  - Capital Medical University, People's Republic of China
- Yiyun Ouyang
  - Capital Medical University, People's Republic of China
- Manhua Zhang
  - Capital Medical University, People's Republic of China
- Wen Zheng
  - Capital Medical University, People's Republic of China
3.
Duits AA, de Ronde EM, Vinke RS, Vos SH, Esselink RAJ, Kessels RPC. The impact of deep brain stimulation of the subthalamic nucleus on facial emotion recognition in patients with Parkinson's disease. J Neuropsychol 2024; 18 Suppl 1:134-141. PMID: 37353988; DOI: 10.1111/jnp.12336. Received 2 Feb 2023; revised 1 Jun 2023; accepted 14 Jun 2023.
Abstract
Deep brain stimulation (DBS) of the subthalamic nucleus (STN) is successful in patients with advanced Parkinson's disease (PD) but may worsen cognitive outcome, including facial emotion recognition (FER). Data analyses of 59 consecutive patients with PD with complete pre- and postoperative assessments, using a sensitive FER test, showed no changes in FER 1 year after STN-DBS surgery in both group-level and individual-level analyses. These findings do not, however, exclude an impact of FER in and of itself on the outcome after STN-DBS.
Affiliation(s)
- Annelien A Duits
  - Department of Medical Psychology, Radboud University Medical Centre, Nijmegen, The Netherlands
  - Department of Psychiatry and Neuropsychology, Maastricht University, Maastricht, The Netherlands
- Eva M de Ronde
  - Donders Institute for Brain, Cognition and Behaviour, Department of Neurosurgery, Radboud University Medical Centre, Nijmegen, The Netherlands
- R Saman Vinke
  - Donders Institute for Brain, Cognition and Behaviour, Department of Neurosurgery, Radboud University Medical Centre, Nijmegen, The Netherlands
- Sandra H Vos
  - Vincent van Gogh Institute for Psychiatry, Centre of Excellence for Korsakoff and Alcohol-related Cognitive Disorders, Venray, The Netherlands
- Rianne A J Esselink
  - Donders Institute for Brain, Cognition and Behaviour, Department of Neurosurgery, Radboud University Medical Centre, Nijmegen, The Netherlands
  - Department of Neurology, Radboud University Medical Centre, Nijmegen, The Netherlands
- Roy P C Kessels
  - Department of Medical Psychology, Radboud University Medical Centre, Nijmegen, The Netherlands
  - Vincent van Gogh Institute for Psychiatry, Centre of Excellence for Korsakoff and Alcohol-related Cognitive Disorders, Venray, The Netherlands
  - Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
4.
Xia R, Heise MJ, Bowman LC. Parental emotionality is related to preschool children's neural responses to emotional faces. Soc Cogn Affect Neurosci 2024; 19:nsad078. PMID: 38123451; PMCID: PMC10868131; DOI: 10.1093/scan/nsad078. Received 10 Feb 2023; revised 28 Nov 2023; accepted 20 Dec 2023. Open access.
Abstract
The ability to accurately decode others' facial expressions is essential for successful social interaction. Previous theories suggest that aspects of parental emotionality-the frequency, persistence and intensity of parents' own emotions-can influence children's emotion perception. Through a combination of mechanisms, parental emotionality may shape how children's brains specialize to respond to emotional expressions, but empirical data are lacking. The present study provides a direct empirical test of the relation between the intensity, persistence and frequency of parents' own emotions and children's neural responses to perceiving emotional expressions. Event-related potentials (ERPs) were recorded as typically developing 3- to 5-year-old children (final Ns = 59 and 50) passively viewed faces expressing different emotional valences (happy, angry and fearful) at full and reduced intensity (100% intense expression and 40% intense expression). We examined relations between parental emotionality and the mean amplitude of children's N170 and negative central ERP responses. The findings demonstrate a clear relation between parental emotionality and children's neural responses (in the N170 mean amplitude and latency) to emotional expressions and suggest that parents may influence children's emotion-processing neural circuitry.
Affiliation(s)
- Ruohan Xia
  - Department of Psychology, University of California, Davis, 1 Shields Ave, Davis, CA 95616, USA
  - Center for Mind and Brain, 202 Cousteau Pl, Davis, CA 95616, USA
- Megan J Heise
  - Division of HIV, Infectious Disease, and Global Medicine, University of California, San Francisco, 505 Parnassus Ave, San Francisco, CA 94143, USA
- Lindsay C Bowman
  - Department of Psychology, University of California, Davis, 1 Shields Ave, Davis, CA 95616, USA
  - Center for Mind and Brain, 202 Cousteau Pl, Davis, CA 95616, USA
5.
Chen C, Messinger DS, Chen C, Yan H, Duan Y, Ince RAA, Garrod OGB, Schyns PG, Jack RE. Cultural facial expressions dynamically convey emotion category and intensity information. Curr Biol 2024; 34:213-223.e5. PMID: 38141619; PMCID: PMC10831323; DOI: 10.1016/j.cub.2023.12.001. Received 21 Jun 2023; revised 27 Oct 2023; accepted 1 Dec 2023.
Abstract
Communicating emotional intensity plays a vital ecological role because it provides valuable information about the nature and likelihood of the sender's behavior [1-3]. For example, attack often follows signals of intense aggression if receivers fail to retreat [4, 5]. Humans regularly use facial expressions to communicate such information [6-11]. Yet how this complex signaling task is achieved remains unknown. We addressed this question using a perception-based, data-driven method to mathematically model the specific facial movements that receivers use to classify the six basic emotions-"happy," "surprise," "fear," "disgust," "anger," and "sad"-and judge their intensity in two distinct cultures (East Asian, Western European; total n = 120). In both cultures, receivers expected facial expressions to dynamically represent emotion category and intensity information over time, using a multi-component compositional signaling structure. Specifically, emotion intensifiers peaked earlier or later than emotion classifiers and represented intensity using amplitude variations. Emotion intensifiers were also more similar across emotions than classifiers were, suggesting a latent broad-plus-specific signaling structure. Cross-cultural analysis further revealed similarities and differences in expectations that could impact cross-cultural communication. Specifically, East Asian and Western European receivers have similar expectations about which facial movements represent high intensity for threat-related emotions, such as "anger," "disgust," and "fear," but differ on those that represent low threat emotions, such as happiness and sadness. Together, our results provide new insights into the intricate processes by which facial expressions can achieve complex dynamic signaling tasks by revealing the rich information embedded in facial expressions.
Affiliation(s)
- Chaona Chen
  - School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Daniel S Messinger
  - Departments of Psychology, Pediatrics, and Electrical & Computer Engineering, University of Miami, 5665 Ponce De Leon Blvd, Coral Gables, FL 33146, USA
- Cheng Chen
  - Foreign Language Department, Teaching Centre for General Courses, Chengdu Medical College, 601 Tianhui Street, Chengdu 610083, China
- Hongmei Yan
  - The MOE Key Lab for Neuroinformation, University of Electronic Science and Technology of China, North Jianshe Road, Chengdu 611731, China
- Yaocong Duan
  - School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Robin A A Ince
  - School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Oliver G B Garrod
  - School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Philippe G Schyns
  - School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Rachael E Jack
  - School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
6.
Rainer LJ, Kuchukhidze G, Trinka E, Braun M, Kronbichler M, Langthaler P, Zimmermann G, Kronbichler L, Said-Yürekli S, Kirschner M, Zamarian L, Schmid E, Jokeit H, Höfler J. Recognition and perception of emotions in juvenile myoclonic epilepsy. Epilepsia 2023; 64:3319-3330. PMID: 37795683; DOI: 10.1111/epi.17783. Received 12 Apr 2023; revised 26 Sep 2023; accepted 28 Sep 2023.
Abstract
OBJECTIVE Perception and recognition of emotions are fundamental prerequisites of human life. Patients with juvenile myoclonic epilepsy (JME) may have emotional and behavioral impairments that might influence socially desirable interactions. We aimed to investigate perception and recognition of emotions in patients with JME by means of neuropsychological tests and functional magnetic resonance imaging (fMRI). METHODS Sixty-five patients with JME (median age = 27 years, interquartile range [IQR] = 23-34) were prospectively recruited at the Department of Neurology, Christian Doppler University Hospital, Paracelsus Medical University, Salzburg, Austria. Patients were compared to 68 healthy controls (median age = 24 years, IQR = 21-31), matched for sex, age, and education. All study participants underwent the Networks of Emotion Processing test battery (NEmo), an fMRI paradigm of "dynamic fearful faces," a structured interview for psychiatric and personality disorders, and comprehensive neuropsychological testing. RESULTS Compared with healthy controls, patients with JME demonstrated significant deficits in recognition of all emotions, especially fear, in both facial and verbal tasks. fMRI revealed decreased amygdala activation in JME patients as compared to healthy controls. Patients were at a higher risk of experiencing psychiatric disorders as compared to healthy controls. Cognitive evaluation revealed impaired attentional and executive functioning, namely psychomotor speed, tonic alertness, divided attention, mental flexibility, and inhibition of automated reactions. Duration of epilepsy correlated negatively with parallel prosodic and facial emotion recognition in NEmo. Deficits in emotion recognition were not associated with psychiatric comorbidities, impaired attention and executive functions, types of seizures, or treatment.
SIGNIFICANCE This prospective study demonstrated that, compared with healthy subjects, patients with JME had significant deficits in recognition and perception of emotions as shown by neuropsychological tests and fMRI. The results of this study may have importance for psychological/psychotherapeutic interventions in the management of patients with JME.
Affiliation(s)
- Lucas Johannes Rainer
  - Department of Neurology, Christian Doppler University Hospital, Paracelsus Medical University, Center for Cognitive Neuroscience Salzburg, member of the European Reference Network EpiCARE, Salzburg, Austria
  - Neuroscience Institute, Christian Doppler University Hospital, Center for Cognitive Neuroscience, Salzburg, Austria
  - Department of Child and Adolescent Psychiatry, Christian Doppler University Hospital, Paracelsus Medical University, Salzburg, Austria
- Giorgi Kuchukhidze
  - Department of Neurology, Christian Doppler University Hospital, Paracelsus Medical University, Center for Cognitive Neuroscience Salzburg, member of the European Reference Network EpiCARE, Salzburg, Austria
  - Neuroscience Institute, Christian Doppler University Hospital, Center for Cognitive Neuroscience, Salzburg, Austria
- Eugen Trinka
  - Department of Neurology, Christian Doppler University Hospital, Paracelsus Medical University, Center for Cognitive Neuroscience Salzburg, member of the European Reference Network EpiCARE, Salzburg, Austria
  - Neuroscience Institute, Christian Doppler University Hospital, Center for Cognitive Neuroscience, Salzburg, Austria
  - Department of Public Health, Health Services Research and Health Technology Assessment, University for Health Sciences, Medical Informatics, and Technology, Hall in Tirol, Austria
  - Karl-Landsteiner Institute for Neurorehabilitation and Space Neurology, Salzburg, Austria
- Mario Braun
  - Center for Cognitive Neuroscience/Department of Psychology, Faculty of Natural Sciences, Paris Lodron University, Salzburg, Austria
- Martin Kronbichler
  - Neuroscience Institute, Christian Doppler University Hospital, Center for Cognitive Neuroscience, Salzburg, Austria
  - Center for Cognitive Neuroscience/Department of Psychology, Faculty of Natural Sciences, Paris Lodron University, Salzburg, Austria
- Patrick Langthaler
  - Department of Neurology, Christian Doppler University Hospital, Paracelsus Medical University, Center for Cognitive Neuroscience Salzburg, member of the European Reference Network EpiCARE, Salzburg, Austria
  - Department of Mathematics, Faculty of Natural Sciences, Paris Lodron University, Salzburg, Austria
- Georg Zimmermann
  - Team Biostatistics and Big Medical Data, Lab for Intelligent Data Analytics Salzburg, Paracelsus Medical University, Salzburg, Austria
  - Research and Innovation Management, Paracelsus Medical University, Salzburg, Austria
- Lisa Kronbichler
  - Neuroscience Institute, Christian Doppler University Hospital, Center for Cognitive Neuroscience, Salzburg, Austria
  - Department of Child and Adolescent Psychiatry, Christian Doppler University Hospital, Paracelsus Medical University, Salzburg, Austria
- Sarah Said-Yürekli
  - Department of Neurology, Christian Doppler University Hospital, Paracelsus Medical University, Center for Cognitive Neuroscience Salzburg, member of the European Reference Network EpiCARE, Salzburg, Austria
  - Center for Cognitive Neuroscience/Department of Psychology, Faculty of Natural Sciences, Paris Lodron University, Salzburg, Austria
- Margarita Kirschner
  - Department of Neurology, Christian Doppler University Hospital, Paracelsus Medical University, Center for Cognitive Neuroscience Salzburg, member of the European Reference Network EpiCARE, Salzburg, Austria
- Laura Zamarian
  - Department of Neurology, Medical University of Innsbruck, Innsbruck, Austria
- Elisabeth Schmid
  - Department of Child and Adolescent Psychiatry, Christian Doppler University Hospital, Paracelsus Medical University, Salzburg, Austria
- Julia Höfler
  - Department of Neurology, Christian Doppler University Hospital, Paracelsus Medical University, Center for Cognitive Neuroscience Salzburg, member of the European Reference Network EpiCARE, Salzburg, Austria
7.
Sharman R, Kyle SD, Espie CA, Tamm S. Associations between self-reported sleep, overnight memory consolidation, and emotion perception: A large-scale online study in the general population. J Sleep Res 2023:e14094. PMID: 38009410; DOI: 10.1111/jsr.14094. Received 28 Jul 2023; revised 8 Oct 2023; accepted 24 Oct 2023.
Abstract
Experimental studies suggest that short or disrupted sleep impairs memory consolidation, mood, and perception of emotional stimuli. However, studies have chiefly relied on laboratory-based study designs and small sample sizes. The aim of this fully online and pre-registered study was to investigate the association between sleep and overnight memory consolidation, emotion perception, and affect in a large, self-selected UK sample. A total of 1646 participants (473 completed) took part in an online study, where they completed a declarative (word-pairs) memory task and an emotion perception task (valence ratings of images), and rated their affect within 2 h of bedtime. The following morning, participants reported on their state affect and sleep for the previous night, completed a cued recall task for the previously presented word-pairs, rated the valence of previously viewed images, and completed a surprise recognition task. Demographic data and habitual sleep quality and duration (sleep traits) were also recorded. Habitual sleep traits were associated with immediate recall for the word-pairs task, while self-reported sleep parameters for the specific night were not associated with overnight memory consolidation. Neither habitual sleep traits nor nightly sleep parameters were associated with unpleasantness ratings of negative stimuli or overnight habituation. Habitual poor sleep was associated with less positive and more negative affect, and morning affect was predicted by the specific night's sleep. This study suggests that overnight emotional processing and declarative memory may not be associated with self-reported sleep across individuals. More work is needed to understand how findings from laboratory-based studies extrapolate to real-world samples and contexts.
Affiliation(s)
- Rachel Sharman
  - Sleep and Circadian Neuroscience Institute (SCNi), Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, UK
- Simon D Kyle
  - Sleep and Circadian Neuroscience Institute (SCNi), Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, UK
- Colin A Espie
  - Sleep and Circadian Neuroscience Institute (SCNi), Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, UK
- Sandra Tamm
  - Centre for Psychiatry Research, Department of Clinical Neuroscience, Karolinska Institutet and Stockholm Health Care Services, Stockholm, Sweden
  - Department of Psychiatry, University of Oxford, Oxford, UK
8.
Vaessen M, Van der Heijden K, de Gelder B. Modality-specific brain representations during automatic processing of face, voice and body expressions. Front Neurosci 2023; 17:1132088. PMID: 37869514; PMCID: PMC10587395; DOI: 10.3389/fnins.2023.1132088. Received 26 Dec 2022; accepted 5 Sep 2023. Open access.
Abstract
A central question in affective science, and one that is relevant for its clinical applications, is how emotions provided by different stimuli are experienced and represented in the brain. Following the traditional view, emotional signals are recognized with the help of emotion concepts that are typically used in descriptions of mental states and emotional experiences, irrespective of the sensory modality. This perspective motivated the search for abstract representations of emotions in the brain, shared across variations in stimulus type (face, body, voice) and sensory origin (visual, auditory). On the other hand, emotion signals, for example an aggressive gesture, trigger rapid automatic behavioral responses, and this may take place before or independently of a full abstract representation of the emotion. This pleads in favor of specific emotion signals that may trigger rapid adaptive behavior only by mobilizing modality- and stimulus-specific brain representations, without relying on higher-order abstract emotion categories. To test this hypothesis, we presented participants with naturalistic dynamic emotion expressions of the face, the whole body, or the voice in a functional magnetic resonance imaging (fMRI) study. To focus on automatic emotion processing and sidestep explicit concept-based emotion recognition, participants performed an unrelated target detection task presented in a different sensory modality than the stimulus. By using multivariate analyses to assess neural activity patterns in response to the different stimulus types, we reveal a stimulus-category- and modality-specific brain organization of affective signals. Our findings are consistent with the notion that, under ecological conditions, emotion expressions of the face, body and voice may have different functional roles in triggering rapid adaptive behavior, even if, when viewed from an abstract conceptual vantage point, they may all exemplify the same emotion.
This has implications for a neuroethologically grounded emotion research program that should start from detailed behavioral observations of how face, body, and voice expressions function in naturalistic contexts.
9.
Omary A, Khalifeh N, Cotter DL, Kim MS, Choudhury F, Ahmadi H, Geffner ME, Herting MM. Altered Emotion Perception Linked to Structural Brain Differences in Youth With Congenital Adrenal Hyperplasia. J Clin Endocrinol Metab 2023; 108:e1134-e1146. PMID: 36930527; PMCID: PMC10505548; DOI: 10.1210/clinem/dgad158. Received 9 Sep 2022; revised 1 Mar 2023; accepted 15 Mar 2023.
Abstract
CONTEXT Congenital adrenal hyperplasia (CAH) is a genetic disorder that results in hormonal imbalances and decreased brain volumes in regions important for emotional processing. OBJECTIVE To examine whether emotion perception differs between youth with CAH and control youth, and if these differences relate to brain volumes. METHODS In this cross-sectional study of 27 youths with CAH (mean age = 12.63 years, 16 female) and 35 age- and sex-matched controls (mean age = 13.03 years, 20 female), each participant rated picture stimuli and completed a 3T structural brain scan. Valence and arousal ratings and reaction times for 61 affective images were assessed. Gray matter volumes were measured by MRI. RESULTS Youth with CAH had lower valence ratings for negative (P = .007) and neutral (P = .019) images. Controls showed differences in reaction times and arousal ratings across stimulus conditions, but youth with CAH did not. Brain volumes of the right amygdala (P = .025) and left hippocampus (P = .002) were associated with valence ratings. Left rostral middle frontal (P < .001) and right medial orbitofrontal cortex (P = .002) volumes were negatively related to valence scores only in youth with CAH, whereas left medial orbitofrontal cortex (P < .001) volumes were associated with valence scores positively in youth with CAH and negatively in controls. CONCLUSION Findings suggest that youth with CAH perceive emotive stimuli as more unpleasant. Decreased brain volumes in the amygdala, hippocampus, and prefrontal cortex are associated with these measures of altered emotion perception in youth with CAH.
Affiliation(s)
- Adam Omary
  - Department of Population and Public Health Sciences, Keck School of Medicine, University of Southern California, Los Angeles, CA 90032, USA
- Noor Khalifeh
  - Department of Population and Public Health Sciences, Keck School of Medicine, University of Southern California, Los Angeles, CA 90032, USA
- Devyn L Cotter
  - Department of Population and Public Health Sciences, Keck School of Medicine, University of Southern California, Los Angeles, CA 90032, USA
  - Neuroscience Graduate Program, University of Southern California, Los Angeles, CA 90089, USA
- Mimi S Kim
  - Center for Endocrinology, Diabetes, and Metabolism, and The Saban Research Institute at Children's Hospital Los Angeles; Keck School of Medicine, University of Southern California, Los Angeles, CA 90027, USA
- Farzana Choudhury
  - Department of Population and Public Health Sciences, Keck School of Medicine, University of Southern California, Los Angeles, CA 90032, USA
- Hedyeh Ahmadi
  - Department of Population and Public Health Sciences, Keck School of Medicine, University of Southern California, Los Angeles, CA 90032, USA
- Mitchell E Geffner
  - Center for Endocrinology, Diabetes, and Metabolism, and The Saban Research Institute at Children's Hospital Los Angeles; Keck School of Medicine, University of Southern California, Los Angeles, CA 90027, USA
- Megan M Herting
  - Department of Population and Public Health Sciences, Keck School of Medicine, University of Southern California, Los Angeles, CA 90032, USA
10.
Kafetsios K, Hess U. Reconceptualizing Emotion Recognition Ability. J Intell 2023; 11:123. PMID: 37367525; DOI: 10.3390/jintelligence11060123. Received 17 Apr 2023; revised 12 Jun 2023; accepted 14 Jun 2023. Open access.
Abstract
Emotion decoding accuracy (EDA) plays a central role within the emotional intelligence (EI) ability model. The EI-ability perspective typically assumes personality antecedents and social outcomes of EI abilities, yet there has traditionally been very limited research to support this contention. The present paper argues that the way in which EDA has been conceptualized and operationalized in EI research has ignored developments in social perception theory and research. These developments point, on the one hand, to the importance of embedding emotion expressions in a social context and, on the other, to reformulating the definitions of emotion decoding accuracy. The present paper outlines the importance of context for EI abilities within the framework of a truth-and-bias model of the social perception of emotions (Assessment of Contextualized Emotions, ACE).
Affiliation(s)
- Konstantinos Kafetsios
  - School of Psychology, Aristotle University of Thessaloniki, 54124 Thessaloniki, Greece
  - Psychology Department, Palacký University, 779 00 Olomouc, Czech Republic
- Ursula Hess
  - Institute of Psychology, Humboldt University, 10117 Berlin, Germany
Collapse
|
11
|
Held MJ, Fehn T, Gauglitz IK, Schütz A. Training Emotional Intelligence Online: An Evaluation of WEIT 2.0. J Intell 2023; 11:122. [PMID: 37367524] [DOI: 10.3390/jintelligence11060122]
Abstract
With the growing popularity of online courses, there is an increasing need for scientifically validated online interventions that can improve emotional competencies. We addressed this demand by evaluating an extended version of the Web-Based Emotional Intelligence Training (WEIT 2.0) program. Based on the four-branch model of emotional intelligence, WEIT 2.0 focuses on improving participants' emotion perception and emotion regulation skills. A total of 214 participants were randomly assigned to the training group (n = 91) or a waiting list control group (n = 123) to evaluate short-term (directly after WEIT 2.0) and long-term intervention effects (8 weeks later). Two-way MANOVAs and mixed ANOVAs showed significant treatment effects for self-reported emotion perception of the self, as well as emotion regulation of the self and others, after 8 weeks. No significant treatment effects were found for self-reported emotion perception in others or for performance-based emotion perception or emotion regulation. Moderator analyses revealed no significant effects of digital affinity on training success from the pretest to the posttest. The findings suggest that components of self-reported emotional intelligence can be enhanced through WEIT 2.0, but performance-based emotional intelligence cannot. Further research is needed on the online training of emotional intelligence and the mechanisms that underlie training success.
Affiliation(s)
- Marco Jürgen Held
- Institute for Psychology, University of Bamberg, 96047 Bamberg, Germany
- Theresa Fehn
- Institute for Psychology, University of Bamberg, 96047 Bamberg, Germany
- Astrid Schütz
- Institute for Psychology, University of Bamberg, 96047 Bamberg, Germany
12
Plate RC, Jones C, Zhao S, Flum MW, Steinberg J, Daley G, Corbett N, Neumann C, Waller R. "But not the music": psychopathic traits and difficulties recognising and resonating with the emotion in music. Cogn Emot 2023; 37:748-762. [PMID: 37104122] [DOI: 10.1080/02699931.2023.2205105]
Abstract
Recognising and responding appropriately to emotions is critical to adaptive psychological functioning. Psychopathic traits (e.g. callous, manipulative, impulsive, antisocial) are related to differences in recognition and response when emotion is conveyed through facial expressions and language. Use of emotional music stimuli represents a promising approach to improve our understanding of the specific emotion processing difficulties underlying psychopathic traits because it decouples recognition of emotion from cues directly conveyed by other people (e.g. facial signals). In Experiment 1, participants listened to clips of emotional music and identified the emotional content (Sample 1, N = 196) or reported on their feelings elicited by the music (Sample 2, N = 197). Participants accurately recognised (t(195) = 32.78, p < .001, d = 4.69) and reported feelings consistent with (t(196) = 7.84, p < .001, d = 1.12) the emotion conveyed in the music. However, psychopathic traits were associated with reduced emotion recognition accuracy (F(1, 191) = 19.39, p < .001) and reduced likelihood of feeling the emotion (F(1, 193) = 35.45, p < .001), particularly for fearful music. In Experiment 2, we replicated findings for broad difficulties with emotion recognition (Sample 3, N = 179) and emotional resonance (Sample 4, N = 199) associated with psychopathic traits. Results offer new insight into emotion recognition and response difficulties that are associated with psychopathic traits.
Affiliation(s)
- R C Plate
- Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA
- C Jones
- Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA
- S Zhao
- Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA
- M W Flum
- Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA
- J Steinberg
- Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA
- G Daley
- Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA
- N Corbett
- Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA
- C Neumann
- Department of Psychology, University of North Texas, Denton, TX, USA
- R Waller
- Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA
13
Abstract
OBJECTIVE Disorders of social cognition, such as difficulties with emotion perception, alexithymia, Theory of Mind (ToM), empathy and disorders of emotion regulation, are prevalent and pervasive problems across many neurological, neurodevelopmental and neuropsychiatric conditions. Clinicians are familiar with how these difficulties present, but their assessment and treatment have lagged behind those of other traditional cognitive domains, such as memory, language and executive functioning. METHOD In this paper, we review the prevalence and degree of impairment associated with disorders of social cognition and emotion regulation across a range of clinical conditions, with particular emphasis on their relationship to cognitive deficits and to real-world functioning. We report effect sizes from published meta-analyses for a range of clinical disorders and also review test usage and available tests. RESULTS In general, many clinical conditions are associated with impairments in social cognition and emotion regulation. Effect sizes range from small to very large and are comparable to effect sizes for impairments in nonsocial cognition. Socio-emotional impairments are also associated with social and adaptive functioning. In reviewing prior research, it is apparent that the standardized assessment of social cognition, in particular, is not routine in clinical practice. This is despite the fact that a range of tools is available and evidence for the efficacy of interventions for social cognitive impairments is accruing. CONCLUSION We use this information to urge clinicians to factor social cognition into their clinical assessments and treatment planning, so as to provide rigorous, holistic and comprehensive person-centred care.
Affiliation(s)
- Skye McDonald
- School of Psychology, University of New South Wales, Sydney, Australia
- Travis Wearne
- School of Psychology, University of Western Sydney, Penrith South, Australia
- Michelle Kelly
- School of Psychological Sciences, University of Newcastle, Callaghan, Australia
14
Nijman SA, Veling W, Timmerman ME, Pijnenborg GHM. Trajectories of Emotion Recognition Training in Virtual Reality and Predictors of Improvement for People with a Psychotic Disorder. Cyberpsychol Behav Soc Netw 2023; 26:288-299. [PMID: 37071641] [PMCID: PMC10125400] [DOI: 10.1089/cyber.2022.0228]
Abstract
Meta-analyses have found that social cognition training (SCT) has large effects on the emotion recognition ability of people with a psychotic disorder. Virtual reality (VR) could be a promising tool for delivering SCT. Presently, it is unknown how improvements in emotion recognition develop during (VR-)SCT, which factors impact improvement, and how improvements in VR relate to improvement outside VR. Data were extracted from task logs from a pilot study and randomized controlled trials on VR-SCT (n = 55). Using mixed-effects generalized linear models, we examined the: (a) effect of treatment session (1-5) on VR accuracy and VR response time for correct answers; (b) main effects and moderation of participant and treatment characteristics on VR accuracy; and (c) the association between baseline performance on the Ekman 60 Faces task and accuracy in VR, and the interaction of Ekman 60 Faces change scores (i.e., post-treatment - baseline) with treatment session. Accounting for the task difficulty level and the type of presented emotion, participants became more accurate at the VR task (b = 0.20, p < 0.001) and faster (b = -0.10, p < 0.001) at providing correct answers as treatment sessions progressed. Overall emotion recognition accuracy in VR decreased with age (b = -0.34, p = 0.009); however, no significant interactions between any of the moderator variables and treatment session were found. An association between baseline Ekman 60 Faces and VR accuracy was found (b = 0.04, p = 0.006), but no significant interaction between difference scores and treatment session. Emotion recognition accuracy improved during VR-SCT, but improvements in VR may not generalize to non-VR tasks and daily life.
Affiliation(s)
- Saskia A Nijman
- Department of Long-Term Care, GGZ Drenthe, Assen, Netherlands
- University Medical Center Groningen, University of Groningen, Groningen, Netherlands
- Department of Clinical & Developmental Neuropsychology and Faculty of Behavioral and Social Sciences, University of Groningen, Groningen, Netherlands
- Wim Veling
- University Medical Center Groningen, University of Groningen, Groningen, Netherlands
- Marieke E Timmerman
- Department of Psychometrics and Statistics, Faculty of Behavioral and Social Sciences, University of Groningen, Groningen, Netherlands
- Gerdina H M Pijnenborg
- Department of Long-Term Care, GGZ Drenthe, Assen, Netherlands
- Department of Clinical & Developmental Neuropsychology and Faculty of Behavioral and Social Sciences, University of Groningen, Groningen, Netherlands
15
Ventura-Bort C, Panza D, Weymar M. Words matter when inferring emotions: a conceptual replication and extension. Cogn Emot 2023:1-15. [PMID: 36856025] [DOI: 10.1080/02699931.2023.2183491]
Abstract
It has long been known that facial configurations play a critical role when we infer mental and emotional states in others. Nevertheless, there is still scientific debate about how we infer emotions from facial configurations. The theory of constructed emotion (TCE) suggests that we may infer different emotions from the same facial configuration, depending on the context (e.g. provided by visual and lexical cues) in which it is perceived. For instance, a recent study found that participants were more accurate in inferring mental and emotional states across three different datasets (i.e. RMET, static and dynamic emojis) when words were provided (i.e. a forced-choice task) than when they were not (i.e. a free-labelling task), suggesting that words serve as contexts that modulate inference from facial configurations. The goal of the current within-subject study was to replicate and extend these findings by adding a fourth dataset (KDEF-dyn), consisting of morphed human faces, to increase ecological validity. Replicating previous findings, we observed that words increased accuracy across the three previously used datasets, an effect that was also observed for the morphed facial stimuli. Our findings are in line with the TCE, providing support for the importance of contextual verbal cues in emotion perception.
Affiliation(s)
- C Ventura-Bort
- Department of Biological Psychology and Affective Science, Faculty of Human Sciences, University of Potsdam, Potsdam, Germany
- D Panza
- Department of Biological Psychology and Affective Science, Faculty of Human Sciences, University of Potsdam, Potsdam, Germany
- M Weymar
- Department of Biological Psychology and Affective Science, Faculty of Human Sciences, University of Potsdam, Potsdam, Germany
- Research Focus Cognitive Sciences, University of Potsdam, Potsdam, Germany
- Faculty of Health Sciences Brandenburg, University of Potsdam, Potsdam, Germany
16
Chiang SK, Liu WY, Hu TM. The effect of computerized working memory training on working memory and emotion perception for patients with chronic schizophrenia and normal cognition. Appl Neuropsychol Adult 2022:1-9. [PMID: 36576049] [DOI: 10.1080/23279095.2022.2159825]
Abstract
BACKGROUND Cognitive impairment and affective symptoms are hallmark features of schizophrenia. This study examined whether a computerized working memory training program improves patients' working memory and affective perception. METHODS Thirty-nine male patients with schizophrenia, aged 25-65, participated in this single-blind randomized controlled study. Twenty subjects were assigned to the experimental group and received an eight-week computerized working memory training course comprising four modules of the CogniPlus system. Nineteen subjects were assigned to the control group and received treatment as usual. All subjects received the same assessments twice, including the Mini-Mental Status Examination (MMSE), the Working Memory Index (WMI) of the Wechsler Adult Intelligence Scale-Third Edition, and subjective ratings of pictures from the International Affective Picture System using the Self-Assessment Manikin (SAM). RESULTS Computerized working memory training improved WMI and MMSE scores and produced a significant increase in the SAM pleasure score for negative pictures between pretest and post-test in the experimental group. CONCLUSIONS Working memory training improves working memory and emotion perception for patients with chronic schizophrenia and normal cognition. The limitations of this study and suggestions for future study are also discussed.
Affiliation(s)
- Wan-Yu Liu
- Chung Shan Medical Rehabilitative Hospital, Taichung, Taiwan
17
Leitner MC, Meurer V, Hutzler F, Schuster S, Hawelka S. The effect of masks on the recognition of facial expressions: A true-to-life study on the perception of basic emotions. Front Psychol 2022; 13:933438. [PMID: 36619058] [PMCID: PMC9815612] [DOI: 10.3389/fpsyg.2022.933438]
Abstract
Mouth-to-nose face masks became ubiquitous during the COVID-19 pandemic, which ignited studies on the perception of emotions in masked faces. Most of these studies presented still images of an emotional face with a face mask digitally superimposed on the nose-mouth region. A common finding is that smiles become less perceivable. The present study investigated the recognition of basic emotions in video sequences of faces. We replicated much of the evidence gathered from still images with digitally superimposed masks, but we also found fundamental differences from existing studies with regard to the perception of smiles, which is less impeded than previous studies implied.
Affiliation(s)
- Michael Christian Leitner
- Salzburg University of Applied Sciences, Salzburg, Austria
- Centre for Cognitive Neuroscience (CCNS), University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
- Verena Meurer
- Centre for Cognitive Neuroscience (CCNS), University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
- Florian Hutzler
- Centre for Cognitive Neuroscience (CCNS), University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
- Sarah Schuster
- Centre for Cognitive Neuroscience (CCNS), University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
- Stefan Hawelka
- Centre for Cognitive Neuroscience (CCNS), University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
18
Albuquerque N, Resende B. Dogs functionally respond to and use emotional information from human expressions. Evol Hum Sci 2022; 5:e2. [PMID: 37587944] [PMCID: PMC10426098] [DOI: 10.1017/ehs.2022.57]
Abstract
Emotions are critical for humans: not only feeling and expressing them, but also reading the emotional expressions of others. For a long time, this ability was thought to be exclusive to people; however, there is now evidence that other animals also rely on emotion perception to guide their behaviour and to adjust their actions in such a way as to guarantee success in their social groups. This is the case for domestic dogs, which have tremendously complex abilities to perceive the emotional expressions not only of their conspecifics but also of human beings. In this paper we discuss dogs' capacities to read human emotions. More than perception, though, are dogs able to use this emotional information in a functional way? Does reading emotional expressions allow them to live functional social lives? Dogs can respond functionally to emotional expressions and can use the emotional information they obtain from others during problem-solving; that is, acquiring information from faces and body postures allows them to make decisions. Here, we tackle questions related to the abilities of responding to and using emotional information from human expressions in a functional way, and discuss how far dogs can go when reading our emotions.
Affiliation(s)
- Briseida Resende
- Institute of Psychology, University of São Paulo, São Paulo, Brazil
19
Wu Q, Peng K, Xie Y, Lai Y, Liu X, Zhao Z. An ingroup disadvantage in recognizing micro-expressions. Front Psychol 2022; 13:1050068. [PMID: 36507018] [PMCID: PMC9732534] [DOI: 10.3389/fpsyg.2022.1050068]
Abstract
A micro-expression is a fleeting facial expression of emotion that usually occurs in high-stakes situations and reveals a true emotion that a person tries to conceal. Because of its unique nature, recognizing micro-expressions has important applications in fields such as law enforcement, medical treatment, and national security. However, the psychological mechanism of micro-expression recognition is still poorly understood. In the present research, we sought to expand upon previous research by investigating whether the group membership of the expresser influences the recognition of micro-expressions. In two behavioral studies, we found that, contrary to the widespread ingroup advantage found in macro-expression recognition, there was a robust ingroup disadvantage in micro-expression recognition. Specifically, in Studies 1A and 1B, we found that participants were more accurate at recognizing the intense and subtle micro-expressions of their racial outgroups than those of their racial ingroups, and neither training experience nor the duration of the micro-expressions moderated this ingroup disadvantage. In Studies 2A and 2B, we further found that mere social categorization alone was sufficient to elicit the ingroup disadvantage for the recognition of intense and subtle micro-expressions, and this effect was also unaffected by the duration of the micro-expressions. These results suggest that individuals spontaneously employ the social category information of others to recognize micro-expressions, and that the ingroup disadvantage in micro-expression recognition stems partly from motivated differential processing of ingroup micro-expressions.
Affiliation(s)
- Qi Wu
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Kunling Peng
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Yanni Xie
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Yeying Lai
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Xuanchen Liu
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Ziwei Zhao
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
20
Ong JH, Liu F. Frequent experience with face coverings for 10 months improves emotion perception among individuals with high autistic traits: A repeated cross-sectional study. Q J Exp Psychol (Hove) 2022:17470218221135585. [PMID: 36250598] [DOI: 10.1177/17470218221135585]
Abstract
Face coverings pose difficulties for emotion recognition, but it is unclear whether people can improve at recognising emotions from the eyes with experience, and whether any improvement depends on one's autistic traits, given the associations between high autistic traits, poorer emotion perception, and reduced gaze to the eye region. In this preregistered study, participants completed a forced-choice emotion recognition task with photographs of eyes, along with demographic questionnaires measuring their autistic traits and how frequently they interacted with others wearing face coverings, at two time points: once at the start of the face covering mandate and again 10 months later. We found that after 10 months, individuals with high autistic traits, as a cohort, recognised emotions from just the eyes better as a function of their experience with others wearing face coverings, suggesting that emotion perception is malleable even for those who have difficulties with it.
Affiliation(s)
- Jia Hoong Ong
- School of Psychology and Clinical Language Sciences, University of Reading, Reading, United Kingdom
- Fang Liu
- School of Psychology and Clinical Language Sciences, University of Reading, Reading, United Kingdom
21
Chung-Fat-Yim A, Chen P, Chan AHD, Marian V. Audio-Visual Interactions During Emotion Processing in Bicultural Bilinguals. Motiv Emot 2022; 46:719-734. [PMID: 36299445] [PMCID: PMC9590621] [DOI: 10.1007/s11031-022-09953-2]
Abstract
Despite the growing number of bicultural bilinguals in the world, the way in which multisensory emotions are evaluated by bilinguals who identify with two or more cultures remains unknown. In the present study, Chinese-English bicultural bilinguals from Singapore viewed Asian or Caucasian faces and heard Mandarin or English speech, and evaluated the emotion from one of the two simultaneously-presented modalities. Reliance on the visual modality was greater when bicultural bilinguals processed Western audio-visual emotion information. Although no differences between modalities emerged when processing East-Asian audio-visual emotion information, correlations revealed that bicultural bilinguals increased their reliance on the auditory modality with more daily exposure to East-Asian cultures. Greater interference from the irrelevant modality was observed for Asian faces paired with English speech than for Caucasian faces paired with Mandarin speech. We conclude that processing of emotion in bicultural bilinguals is guided by culture-specific norms, and that familiarity influences how the emotions of those who speak a foreign language are perceived and evaluated.
Affiliation(s)
- Peiyao Chen
- Swarthmore College, Swarthmore, Pennsylvania
- Alice H. D. Chan
- Linguistics and Multilingual Studies, School of Humanities, Nanyang Technological University, Singapore
22
Masuda T, Shi S, Varma P, Fisher D, Shirazi S. Do Surrounding People's Emotions Affect Judgment of the Central Person's Emotion? Comparing Within Cultural Variation in Holistic Patterns of Emotion Perception in the Multicultural Canadian Society. Front Hum Neurosci 2022; 16:886971. [PMID: 35874162] [PMCID: PMC9300416] [DOI: 10.3389/fnhum.2022.886971]
Abstract
Previous studies in cultural psychology have suggested that when assessing a target person's emotion, East Asians are more likely than North Americans to incorporate the background figures' emotions into their judgment of the target's emotion. The objective of this study was to further examine cultural variation in emotion perception within a culturally diverse population that is representative of Canada's multicultural society. We aimed to see whether East Asian Canadians tended to keep the holistic tendencies of their heritage culture regarding emotion perception. Participants were presented with 60 cartoon images consisting of a central figure and four surrounding figures and were asked to rate the central figure's emotion; of the four surrounding figures, two were female and two were male. Each character was prepared in 5 different emotional settings with corresponding facial expressions: extremely sad, moderately sad, neutral, moderately happy, and extremely happy. As a group, the four background figures surrounding each central figure displayed either a sad, happy, or neutral expression. The participant's task was to judge the intensity of the central figure's happiness or sadness on a 10-point Likert scale ranging from 0 (not at all) to 9 (extremely). For analysis, we divided the participants into three groups: European Canadians (N = 105), East Asian Canadians (N = 104), and Non-East Asian/Non-European Canadians (N = 161). The breakdown for the Non-East Asian/Non-European Canadian group is as follows: 94 South Asian Canadians, 25 Middle Eastern Canadians, 23 African Canadians, 9 Indigenous Canadians, and 10 Latin/Central/South American Canadians.
Results comparing European Canadians and East Asian Canadians demonstrated cultural variation in emotion judgment, indicating that East Asian Canadians were in general more likely than their European Canadian counterparts to be affected by the background figures' emotions. The study highlights important cultural variations in holistic and analytic patterns of emotional attention in the ethnically diverse Canadian society. We discuss future studies that broaden the scope of research to incorporate a variety of diverse cultural backgrounds outside the Western educational context, so as to fully comprehend cultural variations in context-related attentional patterns.
Affiliation(s)
- Takahiko Masuda
- Department of Psychology, University of Alberta, Edmonton, AB, Canada
- Shuwei Shi
- Department of Psychology, University of Alberta, Edmonton, AB, Canada
- Pragya Varma
- Department of Psychology, University of Alberta, Edmonton, AB, Canada
- Delaney Fisher
- Department of Psychology, University of Alberta, Edmonton, AB, Canada
- Safi Shirazi
- Department of Psychology, University of Alberta, Edmonton, AB, Canada
23
Kulke L, Langer T, Valuch C. The Emotional Lockdown: How Social Distancing and Mask Wearing Influence Mood and Emotion Recognition in Adolescents and Adults. Front Psychol 2022; 13:878002. [PMID: 35756255] [PMCID: PMC9226820] [DOI: 10.3389/fpsyg.2022.878002]
Abstract
During the COVID-19 pandemic, government-mandated protection measures such as contact restrictions and mask wearing significantly affected social interactions. In the current preregistered studies we hypothesized that such measures could influence self-reported mood in adults and in adolescents between 12 and 13 years of age, who are in a critical phase of social development. We found that mood was positively related to face-to-face but not to virtual interactions in adults and that virtual interactions were associated with negative mood in adolescents. This suggests that contact restrictions leading to a decrease in face-to-face relative to virtual interactions may be related to negative mood. To understand whether prolonged exposure to people wearing masks during the pandemic might be related to increased sensitivity to subtle visual cues to others' emotions from the eye region of the face, we also presented both age groups with the same standardized emotion recognition test. We found slightly better performance in emotion recognition from the eyes in our student sample tested during the pandemic relative to a comparable sample tested prior to the pandemic, although these differences were restricted to female participants. Adolescents were also better at classifying emotions from the eyes in the current study than in a pre-pandemic sample, with no gender effects in this age group. In conclusion, while social distancing might have detrimental effects on self-reported mood, the ability to recognize others' emotions from subtle visual cues around the eye region remained comparable or might even have improved during the COVID-19 pandemic.
Affiliation(s)
- Louisa Kulke: Neurocognitive Developmental Psychology, Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany
- Theresia Langer: Neurocognitive Developmental Psychology, Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany
- Christian Valuch: Department of Cognition, Emotion, and Methods in Psychology, Faculty of Psychology, University of Vienna, Vienna, Austria
24
Abstract
Considering the widespread use of face masks during the COVID-19 pandemic, the goal of the current study was to examine how occlusion of the lower half of the face may impact first impression formation. We conducted three experiments, each building on previous research, investigating the effect of face masks on first impressions of faces across the lifespan (children, young and older adults). Experiment 1 examined whether the mandatory influence of happy facial expressions on perceived trustworthiness in young adult faces is influenced by face masks. Experiment 2 examined behavioural consequences of adults' first impressions of child faces to determine whether masks reduce the effect of facial niceness on interpretations of ambiguous behaviour. Experiment 3 investigated consensus for first impressions of trustworthiness and competence in older adult faces with and without masks, as well as consensus on underlying facial cues. The results of all three experiments present converging evidence that masks do not have a significant impact on first impressions and their behavioural consequences.
25
Carbon CC, Held MJ, Schütz A. Reading Emotions in Faces With and Without Masks Is Relatively Independent of Extended Exposure and Individual Difference Variables. Front Psychol 2022; 13:856971. [PMID: 35369259] [PMCID: PMC8967961] [DOI: 10.3389/fpsyg.2022.856971]
Abstract
The ability to read emotions in faces helps humans efficiently assess social situations. We tested how this ability is affected by aspects of familiarization with face masks and personality, with a focus on emotional intelligence (measured with an ability test, the MSCEIT, and a self-report scale, the SREIS). To address aspects of the current pandemic situation, we used photos of not only faces per se but also of faces that were partially covered with face masks. The sample (N = 49), the size of which was determined by an a priori power test, was recruited in Germany and consisted of healthy individuals of different ages [M = 24.8 (18-64) years]. Participants assessed the emotional expressions displayed by six different faces determined by a 2 (sex) × 3 (age group: young, medium, and old) design. Each person was presented with six different emotional displays (angry, disgusted, fearful, happy, neutral, and sad) with or without a face mask. Accuracy and confidence were lower with masks-in particular for the emotion disgust (very often misinterpreted as anger) but also for happiness, anger, and sadness. When comparing the present data collected in July 2021 with data from a different sample collected in May 2020, when people first started to familiarize themselves with face masks in Western countries during the first wave of the COVID-19 pandemic, we did not detect an improvement in performance. There were no effects of participants' emotional intelligence, sex, or age regarding their accuracy in assessing emotional states in faces for unmasked or masked faces.
Affiliation(s)
- Claus-Christian Carbon: Department of Psychology, University of Bamberg, Bamberg, Germany; Bamberg Graduate School of Affective and Cognitive Sciences (BaGrACS), Bamberg, Germany
- Marco Jürgen Held: Department of Psychology, University of Bamberg, Bamberg, Germany; Bamberg Graduate School of Affective and Cognitive Sciences (BaGrACS), Bamberg, Germany
- Astrid Schütz: Department of Psychology, University of Bamberg, Bamberg, Germany; Bamberg Graduate School of Affective and Cognitive Sciences (BaGrACS), Bamberg, Germany
26
Abstract
Ambiguous figures (aka bistable, multistable, or reversible images) have fascinated scientists as well as laypersons for centuries. It may be surprising indeed how one and the same physical depiction can be experienced in perceptual awareness in cardinally different ways. In the most well-known examples of such illusions of multistability, the phenomenal change relates just to visual organization. Much less common are perceptions of alternating emotional content in the ambiguous visual image. Here, I introduce one such example.
Affiliation(s)
- Talis Bachmann: Institute of Psychology and School of Law (Tallinn branch), Tallinn, Estonia
27
Abstract
The accurate decoding of facial emotion expressions lies at the center of many research traditions in psychology. Much of this research, while paying lip service to the importance of context in emotion perception, has used stimuli that were carefully created to be deprived of contextual information. The participants' task is to associate the expression shown in the face with a correct label, essentially changing a social perception task into a cognitive task. In fact, in many cases, the task can be carried out correctly without engaging emotion recognition at all. The present article argues that infusing context in emotion perception does not only add an additional source of information but changes the way that participants approach the task by rendering it a social perception task rather than a cognitive task. Importantly, distinguishing between accuracy (perceiving the intended emotions) and bias (perceiving additional emotions to those intended) leads to a more nuanced understanding of social emotion perception. Results from several studies that use the Assessment of Contextual Emotions demonstrate the significance and social functionality of simultaneously considering emotion decoding accuracy and bias for social interaction in different cultures, their key personality and societal correlates, and their function for close relationships processes.
Affiliation(s)
- Ursula Hess: Department of Psychology, Humboldt-Universität zu Berlin, Germany
- Konstantinos Kafetsios: School of Film, Aristotle University of Thessaloniki, Greece; Department of Psychology, Palacký University in Olomouc, Czech Republic
28
Abstract
Emotion understanding facilitates the development of healthy social interactions. To develop emotion knowledge, infants and young children must learn to make inferences about people's dynamically changing facial and vocal expressions in the context of their everyday lives. Given that emotional information varies so widely, the emotional input that children receive might particularly shape their emotion understanding over time. This review explores how variation in children's received emotional input shapes their emotion understanding and their emotional behavior over the course of development. Variation in emotional input from caregivers shapes individual differences in infants' emotion perception and understanding, as well as older children's emotional behavior. Finally, this work can inform policy and focus interventions designed to help infants and young children with social-emotional development.
29
Atanasova K, Lotter T, Bekrater-Bodmann R, Kleindienst N, Reindl W, Lis S. Is It a Gut Feeling? Bodily Sensations Associated With the Experience of Valence and Arousal in Patients With Inflammatory Bowel Disease. Front Psychiatry 2022; 13:833423. [PMID: 35530019] [PMCID: PMC9072626] [DOI: 10.3389/fpsyt.2022.833423]
Abstract
BACKGROUND Previous studies have shown dysfunctional emotion processing in patients with inflammatory bowel diseases (IBD), characterized by a hypersensitivity to negative emotions and a hyposensitivity to positive emotions. Models of emotion processing emphasize the importance of bodily sensations to the experience of emotions. Since there have been no studies on whether emotion-associated bodily sensations are changed in IBD, we investigated the experience of bodily sensations related to valence and arousal, together with their links to emotional awareness, as one domain of interoceptive sensibility relevant to emotion processing. METHODS Using a topographical self-report measure, 41 IBD patients in clinical remission and 44 healthy control (HC) participants were asked to indicate where and how intensely in their body they perceive changes when experiencing emotions of positive and negative valence, as well as relaxation and tension. Additionally, we used self-report questionnaires to assess emotional awareness as one domain of an individual's interoceptive sensibility, gastrointestinal-specific anxiety (GSA), and psychological distress. RESULTS Patients with IBD reported higher emotional awareness but lower intensities of perceived changes in their bodily sensations related to valence and arousal of emotional processing. IBD patients reported less intense bodily activation during positive emotions and less intense bodily deactivation during negative emotional states in comparison to HC participants. Higher emotional awareness and psychological distress were linked to stronger experiences of emotion-related bodily sensations in IBD patients. CONCLUSION Patients with IBD exhibited alterations in how they link bodily sensations to their emotional experience. Such persistent changes can affect a patient's wellbeing and are related to higher levels of anxiety and depression among IBD patients, even in remission.
Affiliation(s)
- Konstantina Atanasova: Department of Clinical Psychology, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany; Department of Medicine II, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Tobias Lotter: Department of Medicine II, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany; Department of Psychosomatic Medicine, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Robin Bekrater-Bodmann: Department of Psychosomatic Medicine, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Nikolaus Kleindienst: Department of Psychosomatic Medicine, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Wolfgang Reindl: Department of Medicine II, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Stefanie Lis: Department of Clinical Psychology, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
30
Wolf D, Leder J, Röseler L, Schütz A. Does facial redness really affect emotion perception? Evidence for limited generalisability of effects of facial redness on emotion perception in a large sample. Cogn Emot 2021; 35:1607-1617. [PMID: 34590539] [DOI: 10.1080/02699931.2021.1979473]
Abstract
We conducted a preregistered study (N = 609) to conceptually replicate and extend prior research regarding the effects of facial redness on emotion perception. In a within-subjects design, participants saw emotion faces (anger, happiness, fear, neutral) of a random female and a random male target with default facial colouration and increased facial redness and were asked to simultaneously rate the intensity of six emotions (happiness, surprise, sadness, fear, disgust, anger) for each emotion face. The emotion intensity was rated higher when the emotion face and the rated emotion matched than when they did not match. However, increased facial redness did not influence the intensity of the rated emotion. The results of this conceptual replication limit the generalisability of previous findings, challenge the assumption that facial redness is used as a cue to infer emotions, and point to the necessity of developing a more nuanced theoretical account of contextual boundaries.
Affiliation(s)
- Daniel Wolf: Department of Psychology, University of Bamberg, Bamberg, Germany
- Johannes Leder: Department of Psychology, University of Bamberg, Bamberg, Germany
- Lukas Röseler: Department of Psychology, University of Bamberg, Bamberg, Germany
- Astrid Schütz: Department of Psychology, University of Bamberg, Bamberg, Germany
31
Dror C, Portnoy V, Dayan-Rosenblum S, Gvion Y, Bloch Y, Boyle D, Maoz H. Emotion perception and theory of mind in adolescents with major depression. Acta Neuropsychiatr 2021; 33:261-6. [PMID: 34477049] [DOI: 10.1017/neu.2021.16]
Abstract
BACKGROUND The research of theory of mind (ToM) and emotion perception (EP) in adolescents with major depressive disorder (MDD) is scarce, and no study to date has investigated the association between EP and long-term outcomes of adolescents with MDD. The aim of the current study was to evaluate ToM and EP in adolescents with MDD, as compared to healthy controls (HCs). In addition, we aimed to assess the association between impairment in ToM and EP, depressive symptom severity, and long-term outcome in the MDD group. METHODS We compared the performance of 14 adolescents with MDD and 25 HC in the Facial Expression Recognition Task (FERT) and the Interpersonal Perception Task. We followed up with the MDD group 2 years later to assess the level of their depressive symptoms using the Children's Depression Rating Scale-Revised (CDRS-R). RESULTS No differences were found between adolescents with MDD and HC in the ToM and FERT tasks. Also, within the MDD group, there was no association between the severity of depressive symptoms and task performance. In the MDD group, there was a significant correlation between lower levels of accuracy in the FERT during the index depressive episode and lower CDRS-R scores on follow-up 2 years later (r2 = 0.35, p = 0.021). CONCLUSIONS EP impairments in adolescents with MDD might predict worse long-term outcome. Further research is needed to verify our findings and to assess for a possible neurobiological underpinning for the state and trait impairments in EP in adolescents with MDD.
32
Taherian T, Fazilatfar AM, Mazdayasna G. Joint Growth Trajectories of Trait Emotional Intelligence Subdomains Among L2 Language Learners: Estimating a Second-Order Factor-of-Curves Model With Emotion Perception. Front Psychol 2021; 12:720945. [PMID: 34589027] [PMCID: PMC8473697] [DOI: 10.3389/fpsyg.2021.720945]
Abstract
The present study assessed the developmental dynamics of trait emotional intelligence (TEI) and its subdomains during English as a foreign language (EFL) learning in a longitudinal study. A sample of 309 EFL learners (217 females, 92 males) was used to assess the trajectories of the global factor of TEI and the parallel development of the TEI subdomains over 1 year in the context of the EFL classroom using parallel process modeling (PPM) and factor of curve modeling (FCM). Additionally, emotion perception (EP) was used as a distal outcome to investigate how growth parameters, including intercept and slope factors in a TEI-FCM, influence the distal outcome of EP. The results revealed that there was sufficient inter-individual variation and intra-individual trends within each subdomain and a significant increase over time across the four subdomains. Additionally, concerning the covariances within and among the subdomains of TEI, the PPM results revealed moderate to high associations between the intercept and slope growth factors within and between these subdomains. Finally, regarding the direct association of the global growth factors (intercept and slope) of TEI with EP, the results indicated that the intercept and slope of global TEI were associated with EP (γ0 = 1.127, p < 0.001; γ1 = 0.321, p < 0.001). Specifically, the intercepts and slopes of emotionality and sociability turned out to be significantly linked to EP (γ03 = 1.311, p < 0.001; γ13 = 0.684, p < 0.001; γ04 = 0.497, p < 0.001; γ14 = 0.127, p < 0.001). These results, which suggest the dynamicity of TEI during foreign language learning, are discussed in light of the potential variables associated with TEI and its related literature.
Affiliation(s)
- Tahereh Taherian: Department of English Language and Literature, Yazd University, Yazd, Iran
- Golnar Mazdayasna: Department of English Language and Literature, Yazd University, Yazd, Iran
33
Yamamoto HW, Kawahara M, Tanaka A. A Web-Based Auditory and Visual Emotion Perception Task Experiment With Children and a Comparison of Lab Data and Web Data. Front Psychol 2021; 12:702106. [PMID: 34484051] [PMCID: PMC8416272] [DOI: 10.3389/fpsyg.2021.702106]
Abstract
Due to the COVID-19 pandemic, the significance of online research has been rising in the field of psychology. However, online experiments with child participants are rare compared to those with adults. In this study, we investigated the validity of web-based experiments with child participants 4–12 years old and adult participants. They performed simple emotional perception tasks in an experiment designed and conducted on the Gorilla Experiment Builder platform. After short communication with each participant via Zoom videoconferencing software, participants performed the auditory task (judging emotion from vocal expression) and the visual task (judging emotion from facial expression). The data collected were compared with data collected in our previous similar laboratory experiment, and similar tendencies were found. For the auditory task in particular, we replicated differences in accuracy perceiving vocal expressions between age groups and also found the same native language advantage. Furthermore, we discuss the possibility of using online cognitive studies for future developmental studies.
Affiliation(s)
- Hisako W Yamamoto: Tokyo Woman's Christian University, Tokyo, Japan; Japan Society for the Promotion of Science, Tokyo, Japan
- Misako Kawahara: Tokyo Woman's Christian University, Tokyo, Japan; Japan Society for the Promotion of Science, Tokyo, Japan
34
Nagy E, Prentice L, Wakeling T. Atypical Facial Emotion Recognition in Children with Autism Spectrum Disorders: Exploratory Analysis on the Role of Task Demands. Perception 2021; 50:819-833. [PMID: 34428977] [PMCID: PMC8438782] [DOI: 10.1177/03010066211038154]
Abstract
People with autism spectrum disorders (ASDs) have difficulty with socio-emotional functioning; however, research on facial emotion recognition (FER) remains inconclusive. Individuals with ASD might be using atypical compensatory mechanisms that are exhausted in more complex tasks. This study compared response accuracy and speed on a forced-choice FER task using neutral, happy, sad, disgust, anger, fear and surprise expressions under both timed and non-timed conditions in children with and without ASD (n = 18). The results showed that emotion recognition accuracy was comparable in the two groups in the non-timed condition. However, in the timed condition, children with ASD were less accurate in identifying anger and surprise compared to children without ASD. This suggests that people with ASD have atypical processing of anger and surprise that might become challenged under time pressure. Understanding these atypical processes, and the environmental factors that challenge them, could be beneficial in supporting socio-emotional functioning in people with ASD.
Affiliation(s)
- Emese Nagy: Psychology, School of Social Sciences, The University of Dundee, UK
- Louise Prentice: Psychology, School of Social Sciences, The University of Dundee, UK
- Tess Wakeling: Psychology, School of Social Sciences, The University of Dundee, UK
35
Abstract
Face masks have been said to impact face-to-face interaction negatively. Yet, there is limited evidence on the degree to which partial face occlusion is detrimental to empathic processes such as emotion perception and facial mimicry. To address this question, we conducted an online experiment (N=200, U.K. sample) that assessed subjective ratings and facial expressions (mimicry) in response to masked and unmasked faces. Perceivers were able to recognise happiness and sadness in dynamic emotion expressions independent of (surgical) face masks. However, perceived emotion intensity and interpersonal closeness were reduced for masked faces. Facial mimicry, the perceiver's imitation of the expresser's emotional display, was reduced or absent in response to happy but preserved for sad mask-covered expressions. For happy target expressions, the face-mimicry link was partially mediated by perceived emotion intensity, supporting the idea that mimicry is influenced by context effects. Thus, these findings suggest that whether face masks impede emotion communication depends on the emotion expressed and the emotion-communication aspect of interest. With unprecedented changes in nonverbal communication brought about by the COVID-19 pandemic, this research marks a first contribution to our understanding of facial mimicry as an important social regulator during these times.
Affiliation(s)
- Till Kastendieck: Humboldt-Universität zu Berlin, Institute of Psychology, Berlin, Germany
- Stephan Zillmer: Humboldt-Universität zu Berlin, Institute of Psychology, Berlin, Germany
- Ursula Hess: Humboldt-Universität zu Berlin, Institute of Psychology, Berlin, Germany
36
Ikeda S. Dual Development of Affective-Speech-Based Emotion Perception. J Genet Psychol 2021; 182:462-470. [PMID: 34424134] [DOI: 10.1080/00221325.2021.1967270]
Abstract
Studies have shown that when interpreting emotions from speech, adults focus on prosody, while young children focus on lexical content. However, the kind of socio-emotional processing implemented in such emotion perception, as well as how it is developed, remains unclear. The present study examined the development of a dual process in affective-speech-induced emotion perception in 3- and 5-year-old children. Previous studies have suggested that unconscious emotion perception at the gaze level and conscious emotion judgment in response to speakers' emotions develop differently. Children were presented with affective speech, which included inconsistent lexical content and prosody (e.g., saying 'thank you' in an angry tone), and asked to report the speaker's emotions by pointing to the corresponding facial expressions (happy or angry). Additionally, the duration for which children gazed at each facial expression was examined. The results showed that 3-year-old children judged the speaker's emotions based on lexical content more than the 5-year-olds, who used prosody. However, at the gaze level, both the 3- and 5-year-olds focused longer on the facial expressions that matched the prosody. The results suggest that two processes can be observed: unconscious emotion perception, which matches prosody and expression, and assessment of the speaker's emotions by weighting the lexical content and prosody.
Affiliation(s)
- Shinnosuke Ikeda: Faculty of Humanities, Kyoto University of Advanced Science, Kyoto, Japan
37
Liang J, Zou YQ, Liang SY, Wu YW, Yan WJ. Emotional Gaze: The Effects of Gaze Direction on the Perception of Facial Emotions. Front Psychol 2021; 12:684357. [PMID: 34408705] [PMCID: PMC8365180] [DOI: 10.3389/fpsyg.2021.684357]
Abstract
Previous research has found that when gaze direction matches the underlying behavioral intent communicated by the expression of a specific emotion, it enhances or facilitates the perception of that emotion; this is called the shared signal hypothesis (SSH). Specifically, a direct gaze shares an approach-orientated signal with the emotions of anger and joy, whereas an averted gaze shares an avoidance-orientated signal with fear and sadness. In this research, we attempted to verify the SSH by using different materials on Asian participants. In Experiment 1 we employed photos of models exhibiting direct and averted gazes for rating tasks, in order to study the effects of gaze direction on participants’ perception of emotion. In Experiment 2 we utilized smiling faces in a similar investigation. The results show that for neutral and smiling faces, a direct gaze (relative to a gaze of avoidance) increased the likelihood of a subject perceiving a happy mood; a gaze of avoidance increased the likelihood that anger and fear would be perceived. The effect of gaze direction on emotional expression perception was verified, but a “facilitating-impairing” pattern was not. The difference between our work and previous research may be attributable to the materials employed (which were more ecological), as well as the participants, who were from a different culture.
Affiliation(s)
- Jing Liang: School of Educational Science, Ludong University, Yantai, China
- Yu-Qing Zou: Faculty of Psychology, Southwest University, Chongqing, China
- Si-Yi Liang: College of Teacher Education, Wenzhou University, Wenzhou, China
- Yu-Wei Wu: Wenzhou Business College, Wenzhou, China
- Wen-Jing Yan: College of Teacher Education, Wenzhou University, Wenzhou, China; School of Mental Health, Wenzhou Medical University, Wenzhou, China
38
Wang Z, Goerlich KS, Luo Y, Xu P, Aleman A. Social-Specific Impairment of Negative Emotion Perception in Alexithymia. Soc Cogn Affect Neurosci 2021; 17:387-397. [PMID: 34406408] [PMCID: PMC8972281] [DOI: 10.1093/scan/nsab099]
Abstract
Alexithymia has been characterized as an impaired ability of emotion processing and regulation. The definition of alexithymia does not include a social component. However, there is some evidence that social cognition may be compromised in individuals with alexithymia. Hence, emotional impairments associated with alexithymia may extend to socially relevant information. Here, we recorded electrophysiological responses of individuals meeting the clinically relevant cutoff for alexithymia (ALEX; n = 24) and individuals without alexithymia (NonALEX; n = 23) while they viewed affective scenes that varied on the dimensions of sociality and emotional valence during a rapid serial visual presentation task. We found that ALEX exhibited lower accuracy and larger N2 than NonALEX in the perception of social negative scenes. Source reconstruction revealed that the group difference in N2 was localized at the dorsal anterior cingulate cortex. Irrespective of emotional valence, ALEX showed stronger alpha power than NonALEX in social but not non-social conditions. Our findings support the hypothesis of social processing being selectively affected by alexithymia, especially for stimuli with negative valence. Electrophysiological evidence suggests altered deployment of attentional resources in the perception of social-specific emotional information in alexithymia. This work sheds light on the neuropsychopathology of alexithymia and alexithymia-related disorders.
Affiliation(s)
- Zhihao Wang: Shenzhen Key Laboratory of Affective and Social Neuroscience, Center for Brain Disorders and Cognitive Sciences, Shenzhen University, Shenzhen 518060, China; Department of Biomedical Sciences of Cells & Systems, Section Cognitive Neuroscience, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- Katharina S Goerlich: Department of Biomedical Sciences of Cells & Systems, Section Cognitive Neuroscience, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- Yuejia Luo: Shenzhen Key Laboratory of Affective and Social Neuroscience, Center for Brain Disorders and Cognitive Sciences, Shenzhen University, Shenzhen 518060, China; Faculty of Psychology, Beijing Normal University, Beijing, China; Center for Neuroimaging, Shenzhen Institute of Neuroscience, Shenzhen, China
- Pengfei Xu: Faculty of Psychology, Beijing Normal University, Beijing, China; Center for Neuroimaging, Shenzhen Institute of Neuroscience, Shenzhen, China; Great Bay Neuroscience and Technology Research Institute (Hong Kong), Kwun Tong, Hong Kong
- André Aleman: Shenzhen Key Laboratory of Affective and Social Neuroscience, Center for Brain Disorders and Cognitive Sciences, Shenzhen University, Shenzhen 518060, China; Department of Biomedical Sciences of Cells & Systems, Section Cognitive Neuroscience, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
39
Suslow T, Kersting A. Beyond Face and Voice: A Review of Alexithymia and Emotion Perception in Music, Odor, Taste, and Touch. Front Psychol 2021; 12:707599. [PMID: 34393944] [PMCID: PMC8362879] [DOI: 10.3389/fpsyg.2021.707599]
Abstract
Alexithymia is a clinically relevant personality trait characterized by deficits in recognizing and verbalizing one's emotions. It has been shown that alexithymia is related to an impaired perception of external emotional stimuli, but previous research focused on emotion perception from faces and voices. Since sensory modalities represent rather distinct input channels, it is important to know whether alexithymia also affects emotion perception in other modalities and expressive domains. The objective of our review was to summarize and systematically assess the literature on the impact of alexithymia on the perception of emotional (or hedonic) stimuli in music, odor, taste, and touch. Eleven relevant studies were identified. On the basis of the reviewed research, it can be preliminarily concluded that alexithymia might be associated with deficits in the perception of primarily negative but also positive emotions in music and a reduced perception of aversive taste. The data available on olfaction and touch are inconsistent or ambiguous and do not allow conclusions to be drawn. Future investigations would benefit from a multimethod assessment of alexithymia and control of negative affect. Multimodal research seems necessary to advance our understanding of emotion perception deficits in alexithymia and clarify the contribution of modality-specific and supramodal processing impairments.
Affiliation(s)
- Thomas Suslow
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig Medical Center, Leipzig, Germany
- Anette Kersting
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig Medical Center, Leipzig, Germany
40. Lorette P. Investigating Emotion Perception via the Two-Dimensional Affect and Feeling Space: An Example of a Cross-Cultural Study Among Chinese and Non-Chinese Participants. Front Psychol 2021; 12:662610. PMID: 34366981; PMCID: PMC8343541; DOI: 10.3389/fpsyg.2021.662610.
Abstract
The categorical approach to cross-cultural emotion perception research has mainly relied on constrained experimental tasks, which have arguably biased previous findings and attenuated cross-cultural differences. On the other hand, in the constructionist approach, conclusions on the universal nature of valence and arousal have mainly been drawn indirectly from participants' word-matching or free-sorting behaviors, and studies based on participants' continuous valence and arousal ratings are very scarce. When it comes to self-reports of specific emotion perception, constructionists tend to rely on free labeling, which has its own limitations. In an attempt to move beyond the limitations of previous methods, a new instrument called the Two-Dimensional Affect and Feeling Space (2DAFS) has been developed. The 2DAFS is a useful, innovative, and user-friendly instrument that can easily be integrated in online surveys and allows for the collection of both continuous valence and arousal ratings and categorical emotion perception data in a quick and flexible way. In order to illustrate the usefulness of this tool, a cross-cultural emotion perception study based on the 2DAFS is reported. The results indicate cross-cultural variation in valence and arousal perception, suggesting that the minimal universality hypothesis might need to be more nuanced.
Affiliation(s)
- Pernelle Lorette
- Department of English Studies, University of Mannheim, Mannheim, Germany
41. Kret ME, van Berlo E. Attentional Bias in Humans Toward Human and Bonobo Expressions of Emotion. Evol Psychol 2021; 19:14747049211032816. PMID: 34318723; PMCID: PMC10358346; DOI: 10.1177/14747049211032816.
Abstract
Correctly recognizing and efficiently attending to emotional situations are highly valuable skills for social species such as humans and bonobos, humans' closest living relatives. In the current study, we investigated whether humans perceive a range of emotional situations differently when these involved other humans compared to bonobos. A large group of children and adults participated in an emotion perception task and rated scenes showing either bonobos or humans in situations depicting distressed or aggressive behavior, yawning, scratching, grooming, playing, sex scenes or neutral situations. A new group of people performed a dot-probe task to assess attentional biases toward these materials. The main finding is that humans perceive emotional scenes showing people similarly to emotional scenes of bonobos, a result reflecting a shared evolutionary origin of emotional expressions. Other results show that children interpreted bonobos' bared teeth displays as a positive signal. This signal is related to the human smile, but is frequently seen in distressed situations, as was the case in the current experiment. Children may still need to learn to use contextual cues when judging an ambiguous expression as positive or negative. Further, the sex scenes were rated very positively, especially by male participants. Even though men rated these more positively than women, their attention was captured similarly, surpassing all other emotion categories. Finally, humans' attention was captured more by human yawns than by bonobo yawns, which may be related to the highly contagious nature of yawns, especially when shown by close others. The current research adds to earlier work showing morphological, behavioral and genetic parallels between humans and bonobos by showing that their emotional expressions have a common origin too.
Affiliation(s)
- Mariska E. Kret
- Leiden University, Institute of Psychology, Cognitive Psychology Unit, CoPAN lab, Wassenaarseweg, Leiden, Zuid-Holland, The Netherlands
- Evy van Berlo
- Leiden University, Institute of Psychology, Cognitive Psychology Unit, CoPAN lab, Wassenaarseweg, Leiden, Zuid-Holland, The Netherlands
42. Xu J, Dong H, Li N, Wang Z, Guo F, Wei J, Dang J. Weighted RSA: An Improved Framework on the Perception of Audio-visual Affective Speech in Left Insula and Superior Temporal Gyrus. Neuroscience 2021; 469:46-58. PMID: 34119576; DOI: 10.1016/j.neuroscience.2021.06.002.
Abstract
Being able to accurately perceive the emotion expressed by facial or verbal expressions from others is critical to successful social interaction. However, only a few studies have examined multimodal interactions in speech emotion, and findings on speech emotion perception are inconsistent. It remains unclear how the human brain perceives speech emotion of different valence in multimodal stimuli. In this paper, we conducted a functional magnetic resonance imaging (fMRI) study with an event-related design, using dynamic facial expressions and emotional speech stimuli to express different emotions, in order to explore the perception mechanism of speech emotion in the audio-visual modality. Representational similarity analysis (RSA), whole-brain searchlight analysis, and conjunction analysis of emotion were used to interpret the representation of speech emotion in different aspects. Notably, a weighted RSA approach was proposed to evaluate the contribution of each candidate model to the best-fitted model, providing a supplement to RSA. The results of weighted RSA indicated that the fitted models were superior to all candidate models and that the weights could be used to explain the representation of ROIs. The bilateral amygdala was associated with the processing of both positive and negative, but not neutral, emotions. The results indicate that the left posterior insula and the left anterior superior temporal gyrus (STG) play important roles in the perception of multimodal speech emotion.
Affiliation(s)
- Junhai Xu
- College of Intelligence and Computing, Tianjin Key Lab of Cognitive Computing and Application, Tianjin University, Tianjin, China
- Haibin Dong
- College of Intelligence and Computing, Tianjin Key Lab of Cognitive Computing and Application, Tianjin University, Tianjin, China; State Grid Tianjin Electric Power Company, China
- Na Li
- College of Intelligence and Computing, Tianjin Key Lab of Cognitive Computing and Application, Tianjin University, Tianjin, China
- Zeyu Wang
- College of Intelligence and Computing, Tianjin Key Lab of Cognitive Computing and Application, Tianjin University, Tianjin, China
- Fei Guo
- School of Computer Science and Engineering, Central South University, Changsha 410083, China
- Jianguo Wei
- College of Intelligence and Computing, Tianjin Key Lab of Cognitive Computing and Application, Tianjin University, Tianjin, China
- Jianwu Dang
- College of Intelligence and Computing, Tianjin Key Lab of Cognitive Computing and Application, Tianjin University, Tianjin, China; School of Information Science, Japan Advanced Institute of Science and Technology, Japan
43. Fugate JMB, Franco CL. Implications for Emotion: Using Anatomically Based Facial Coding to Compare Emoji Faces Across Platforms. Front Psychol 2021; 12:605928. PMID: 33716870; PMCID: PMC7947884; DOI: 10.3389/fpsyg.2021.605928.
Abstract
Emoji faces, which are ubiquitous in our everyday communication, are thought to resemble human faces and aid emotional communication. Yet, few studies examine whether emojis are perceived as a particular emotion and whether that perception changes based on rendering differences across electronic platforms. The current paper draws upon emotion theory to evaluate whether emoji faces depict anatomical differences that are proposed to differentiate human depictions of emotion (hereafter, “facial expressions”). We modified the existing Facial Action Coding System (FACS) (Ekman and Rosenberg, 1997) to apply to emoji faces. An equivalent “emoji FACS” rubric allowed us to evaluate two important questions: First, anatomically, does the same emoji face “look” the same across platforms and versions? Second, do emoji faces perceived as a particular emotion category resemble the proposed human facial expression for that emotion? To answer these questions, we compared the anatomically based codes for 31 emoji faces across three platforms and two version updates. We then compared those codes to the proposed human facial expression prototype for the emotion perceived within the emoji face. Overall, emoji faces across platforms and versions were not anatomically equivalent. Moreover, the majority of emoji faces did not conform to human facial expressions for an emotion, although the basic anatomical codes were shared among human and emoji faces. Some emotion categories were better predicted by the assortment of anatomical codes than others, with some individual differences among platforms. We discuss theories of emotion that help explain how emoji faces are perceived as an emotion, even when anatomical differences are not always consistent or specific to an emotion.
Affiliation(s)
- Jennifer M B Fugate
- Department of Psychology, University of Massachusetts Dartmouth, Dartmouth, MA, United States
- Courtny L Franco
- Department of Communication and Information Science, University of Alabama, Tuscaloosa, AL, United States
44. Kittel AFD, Olderbak S, Wilhelm O. Sty in the Mind's Eye: A Meta-Analytic Investigation of the Nomological Network and Internal Consistency of the "Reading the Mind in the Eyes" Test. Assessment 2021; 29:872-895. PMID: 33645295; DOI: 10.1177/1073191121996469.
Abstract
The Reading the Mind in the Eyes Test (RMET) is the most popular adult measure of individual differences in theory of mind. We present a meta-analytic investigation of the test's psychometric properties (k = 119 effect sizes, 61 studies, ntotal = 8,611 persons). Using random effects models, we found the internal consistency of the test was acceptable (α = .73). However, the RMET was more strongly related to emotion perception (r = .33, ρ = .48) than to alternative theory of mind measures (r = .29, ρ = .39), and weakly to moderately related to vocabulary (r = .25, ρ = .32), cognitive empathy (r = .14, ρ = .20), and affective empathy (r = .13, ρ = .19). Overall, we conclude that the RMET operates as an emotion perception measure rather than a theory of mind measure, challenging the interpretation of RMET results.
45. Radlak B, Cooper C, Summers F, Phillips LH. Multiple sclerosis, emotion perception and social functioning. J Neuropsychol 2021; 15:500-515. PMID: 33522134; DOI: 10.1111/jnp.12237.
Abstract
People with multiple sclerosis (MS) can experience problems in interpreting others' emotions from faces or voices. However, to date little is known about whether difficulties in emotion perception in MS are related to broader aspects of social functioning. Also, there are few studies reporting the effect of MS on more ecologically valid assessments of emotion perception using multimodal videos. The current study looks at (1) the effect of MS on perceiving emotions from faces, voices and multimodal videos; (2) the possible role of slowed processing and executive dysfunction in emotion perception problems in MS and (3) the relationship between emotion perception and broader social functioning in MS. 53 people with MS and 31 healthy controls completed tasks of emotion perception and cognition, and assessed their levels of social support and social participation. Participants with MS performed worse than demographically matched controls on all measures of emotion perception. Emotion perception performance was related to cognitive measures in those with MS. Also, significant associations were found between emotion perception difficulties in MS and poorer social function. In particular, people with MS who had poorer emotion perception also reported lower levels of social support from their friends, and regression analysis showed that this prediction was maintained even when disease severity and cognitive function were taken into account. These results show that problems with emotion perception in MS extend to more realistic tasks and may predict key aspects of social functioning.
Affiliation(s)
- Bogna Radlak
- School of Psychology, University of Aberdeen, UK; Department of Neuropsychology, Ninewells Hospital in Dundee, UK
- Clare Cooper
- Health Psychology Group, University of Aberdeen, UK
- Fiona Summers
- Department of Neuropsychology, Aberdeen Royal Infirmary, UK
46. Chen J, Zhang Y, Zhao G. The Qingdao Preschooler Facial Expression Set: Acquisition and Validation of Chinese Children's Facial Emotion Stimuli. Front Psychol 2021; 11:554821. PMID: 33551893; PMCID: PMC7858654; DOI: 10.3389/fpsyg.2020.554821.
Abstract
Traditional research on emotion-face processing has primarily focused on the expression of basic emotions using adult emotional face stimuli. Stimulus sets featuring child faces or emotions other than basic emotions are rare. The current study describes the acquisition and evaluation of the Qingdao Preschooler Facial Expression (QPFE) set, a facial stimulus set with images featuring 54 Chinese preschoolers' emotion expressions. The set includes 712 standardized color photographs of six basic emotions (joy, fear, anger, sadness, surprise, and disgust), five discrete positive emotions (interest, contentment, relief, pride, and amusement), and a neutral expression. The validity of the pictures was examined based on 43 adult raters' online evaluation, including agreement between designated emotions and raters' labels, as well as intensity and representativeness scores. Overall, these data should contribute to the developmental and cross-cultural research on children's emotion expressions and provide insights for future research on positive emotions.
Affiliation(s)
- Jie Chen
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Yulin Zhang
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Guozhen Zhao
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
47. Girard JM, Cohn JF, Yin L, Morency LP. Reconsidering the Duchenne Smile: Formalizing and Testing Hypotheses about Eye Constriction and Positive Emotion. Affect Sci 2021; 2:32-47. PMID: 34337430; DOI: 10.1007/s42761-020-00030-w.
Abstract
The common view of emotional expressions is that certain configurations of facial-muscle movements reliably reveal certain categories of emotion. The principal exemplar of this view is the Duchenne smile, a configuration of facial-muscle movements (i.e., smiling with eye constriction) that has been argued to reliably reveal genuine positive emotion. In this paper, we formalized a list of hypotheses that have been proposed regarding the Duchenne smile, briefly reviewed the literature bearing on these hypotheses, identified limitations and unanswered questions, and conducted two empirical studies to begin addressing these limitations and answering these questions. Both studies analyzed a database of 751 smiles observed while 136 participants completed experimental tasks designed to elicit amusement, embarrassment, fear, and physical pain. Study 1 focused on participants' self-reported positive emotion and Study 2 focused on how third-party observers would perceive videos of these smiles. Most of the hypotheses that have been proposed about the Duchenne smile were either contradicted by or only weakly supported by our data. Eye constriction did provide some information about experienced positive emotion, but this information was lacking in specificity, already provided by other smile characteristics, and highly dependent on context. Eye constriction provided more information about perceived positive emotion, including some unique information over other smile characteristics, but context was important here as well. Overall, our results suggest that accurately inferring positive emotion from a smile requires more sophisticated methods than simply looking for the presence/absence (or even the intensity) of eye constriction.
Affiliation(s)
- Lijun Yin
- Binghamton University, Binghamton, NY, USA
48. Jeong JW, Kim HT, Lee SH, Lee H. Effects of an Audiovisual Emotion Perception Training for Schizophrenia: A Preliminary Study. Front Psychiatry 2021; 12:522094. PMID: 34025462; PMCID: PMC8131526; DOI: 10.3389/fpsyt.2021.522094.
Abstract
Individuals with schizophrenia show a reduced ability to integrate facial and vocal information in emotion perception. Although emotion perception has been a target for treatment, no study has yet examined the effect of multimodal training on emotion perception in schizophrenia. In the present study, we developed an audiovisual emotion perception training and test in which a voice and a face were simultaneously presented, and subjects were asked to judge whether the emotions of the voice and the face matched. The voices were either angry or happy, and the faces were morphed on a continuum ranging from angry to happy. Sixteen patients with schizophrenia participated in six training sessions and three test sessions (i.e., pre-training, post-training, and generalization). Eighteen healthy controls participated only in the pre-training test session. Prior to training, the patients with schizophrenia performed significantly worse than the controls in the recognition of anger; however, following the training, the patients showed a significant improvement in recognizing anger, which was maintained and generalized to a new set of stimuli. The patients also improved in the recognition of happiness following the training, but this effect was not maintained or generalized. These results provide preliminary evidence that a multimodal, audiovisual training may yield improvements in anger perception for patients with schizophrenia.
Affiliation(s)
- Ji Woon Jeong
- Department of Psychology, Korea University, Seoul, South Korea
- Hyun Taek Kim
- Department of Psychology, Korea University, Seoul, South Korea
- Seung-Hwan Lee
- Department of Psychiatry, Ilsan-Paik Hospital, Inje University, Goyang, South Korea
- Hyejeen Lee
- Department of Psychology, Chonnam National University, Gwangju, South Korea
49. Rajananda S, Zhu J, Peters MAK. Normal observers show no evidence for blindsight in facial emotion perception. Neurosci Conscious 2020; 2020:niaa023. PMID: 33343928; PMCID: PMC7734439; DOI: 10.1093/nc/niaa023.
Abstract
Some researchers have argued that normal human observers can exhibit "blindsight-like" behavior: the ability to discriminate or identify a stimulus without being aware of it. However, we recently used a bias-free task to show that what looks like blindsight may in fact be an artifact of typical experimental paradigms' susceptibility to response bias. While those findings challenge previous reports of blindsight in normal observers, they do not rule out the possibility that different stimuli or techniques could still reveal perception without awareness. One intriguing candidate is emotion processing, since processing of emotional stimuli (e.g. fearful/happy faces) has been reported to potentially bypass conscious visual circuits. Here we used the bias-free blindsight paradigm to investigate whether emotion processing might reveal "featural blindsight," i.e. ability to identify a face's emotion without introspective access to the task-relevant features that led to the discrimination decision. However, we saw no evidence for emotion processing "featural blindsight": as before, whenever participants could identify a face's emotion they displayed introspective access to the task-relevant features, matching predictions of a Bayesian ideal observer. These results add to the growing body of evidence that perceptual discrimination ability without introspective access may not be possible for neurologically intact observers.
Affiliation(s)
- Sivananda Rajananda
- Department of Bioengineering, University of California Riverside, Riverside, CA 92521, USA
- Jeanette Zhu
- Department of Psychology, University of California Los Angeles, Los Angeles, CA 90095, USA
- Megan A K Peters
- Department of Bioengineering, University of California Riverside, Riverside, CA 92521, USA
- Department of Cognitive Science, University of California Irvine, Irvine, CA 92697, USA
50. Cassel A, McDonald S, Kelly M. Establishing 'proof of concept' for a social cognition group treatment program (SIFT IT) after traumatic brain injury: two case studies. Brain Inj 2020; 34:1781-93. PMID: 33180565; DOI: 10.1080/02699052.2020.1831072.
Abstract
OBJECTIVE: Social cognitive deficits are prevalent after traumatic brain injury (TBI). Despite this, few remediation studies exist. This study aimed to demonstrate 'proof of concept' for a novel group treatment that comprehensively targeted the core processes of social cognition. DESIGN: Pre-post case study with two participants, "Greg" and "Aaron", living with severe TBI, with three assessment time points. METHOD: Participants were screened at baseline to confirm social cognitive deficits: Greg exhibited difficulties with emotion perception and detecting hints; Aaron with detecting sarcasm and hints. Both reported everyday social problems. Participants then completed the 14-week group treatment program (SIFT IT). Feasibility and outcome measures were repeated post-group and at three-month follow-up. RESULTS: The study procedure was implemented with 100% assessment and 89% SIFT IT session attendance, albeit with a lack of proxy-report measures. Both participants described procedures as acceptable, although suggested more group participants could be beneficial. They both demonstrated reliable improvements (RCI > 1.96) on relevant social cognitive measures. Qualitative feedback corroborated findings: Greg reported generalization of therapeutic gains, Aaron reported increased self-awareness but nominal generalization. CONCLUSION: Feasibility and limited efficacy outcomes established 'proof of concept' of SIFT IT. Findings will inform the study protocol for a larger randomized-controlled trial.