1. Moosavi J, Resch A, Lecchi A, Sokolov AN, Fallgatter AJ, Pavlova MA. Reading language of the eyes in female depression. Cereb Cortex 2024; 34:bhae253. PMID: 38990517. DOI: 10.1093/cercor/bhae253.
Abstract
Aberrations in non-verbal social cognition have been reported to coincide with major depressive disorder, yet little is known about the role of the eyes. To fill this gap, the present study explores whether and, if so, how reading the language of the eyes is altered in depression. Patients and person-by-person matched typically developing individuals were administered the Emotions in Masked Faces task and a modified Reading the Mind in the Eyes Test, both of which make a comparable amount of visual information available. To achieve group homogeneity, we focused on females, as major depressive disorder displays a gender-specific profile. The findings show that facial masks selectively affect inferring emotions: recognition of sadness and anger is more heavily compromised in major depressive disorder than in typically developing controls, whereas recognition of fear, happiness, and neutral expressions remains unhindered. Disgust, the forgotten emotion of psychiatry, is the least recognizable emotion in both groups. On the Reading the Mind in the Eyes Test, patients exhibit lower accuracy on positive expressions than their typically developing peers but do not differ on negative items. In both depressive and typically developing individuals, the ability to recognize emotions behind a mask and performance on the Reading the Mind in the Eyes Test are linked in processing speed, but not recognition accuracy. The outcome provides a blueprint for understanding the complexities of reading the language of the eyes within and beyond the COVID-19 pandemic.
Affiliation(s)
- Jonas Moosavi, Annika Resch, Alessandro Lecchi, Alexander N Sokolov, Andreas J Fallgatter, Marina A Pavlova: Social Neuroscience Unit, Department of Psychiatry and Psychotherapy, Tübingen Center for Mental Health (TüCMH), Medical School and University Hospital, Eberhard Karls University of Tübingen, Calwerstr. 14, 72076, Tübingen, Germany
- Andreas J Fallgatter: also German Center for Mental Health (DZPG), Partner Site Tübingen, Tübingen, Germany
2. Schnitzler T, Korn C, Herpertz SC, Fuchs T. Emotion recognition in autism spectrum condition during the COVID-19 pandemic. Autism 2024; 28:1690-1702. PMID: 37882152. PMCID: PMC11191665. DOI: 10.1177/13623613231203306.
Abstract
LAY ABSTRACT: During the COVID-19 pandemic, wearing face masks became mandatory to prevent the spread of the virus. However, masks restrict emotion recognition to the upper part of the face. Since individuals with autism spectrum condition often tend to look at the lower half of the face, they may be particularly hindered when people wear masks, as they are forced to rely on the upper half of the face. The current study compared the recognition of facially expressed emotions between individuals with and without autism spectrum condition. Each photo was shown in three versions: uncovered, with a face mask, and with sunglasses. Our results revealed reduced accuracy in individuals with autism spectrum condition at recognizing emotions in all three stimulus types; they also exhibited more difficulty distinguishing anger, fear, pride, and embarrassment. During the emotion recognition task, there was no difference in which facial areas the groups looked at. We found no evidence that the disadvantages of individuals with autism spectrum condition in emotion recognition were due to looking at different areas of the face.
3. Fujihara Y, Guo K, Liu CH. Relationship between types of anxiety and the ability to recognize facial expressions. Acta Psychol (Amst) 2023; 241:104100. PMID: 38041913. DOI: 10.1016/j.actpsy.2023.104100.
Abstract
This study examined whether three subtypes of anxiety (trait anxiety, state anxiety, and social anxiety) have different effects on recognition of facial expressions. One hundred and thirty-eight participants matched facial expressions of three intensity levels (20 %, 40 %, 100 %) with one of six emotion labels ("happy", "sad", "fear", "angry", "disgust", and "surprise"). Using a conventional method of analysis, we replicated some significant correlations between each anxiety type and recognition performance reported in the literature. However, when we used partial correlation to isolate the effect of each anxiety type, most of these correlations were no longer significant, apart from the negative correlation between the Beck Anxiety Inventory and reaction time to fearful faces displayed at the 40 % intensity level, and the correlations between anxiety and categorisation errors. Specifically, social anxiety was positively correlated with misidentifying a happy face as a disgust face at the 40 % intensity level, and state anxiety was negatively correlated with misidentifying a happy face as a sad face at the 20 % intensity level. However, these partial correlation analyses became non-significant after p value adjustment for multiple comparisons. Our eye-tracking data also showed that state anxiety may be associated with reduced fixations on the eye regions of low-intensity sad or fearful faces. These analyses cast doubt on some effects reported in previous studies, because those effects are likely to reflect a mixture of influences from highly correlated anxiety subtypes.
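The partial-correlation logic this abstract relies on can be sketched in a few lines: the partial correlation of x and y controlling for z is the Pearson correlation between the residuals of x and y after each is regressed on z. This is an illustrative sketch with hypothetical toy data, not the authors' analysis pipeline.

```python
# Sketch: partial correlation via residuals (illustrative, hypothetical data).
# partial_corr(x, y | z) = Pearson correlation of the residuals of x and y
# after each has been linearly regressed on the shared covariate z.

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = sum((u - ma) ** 2 for u in a) ** 0.5
    sb = sum((v - mb) ** 2 for v in b) ** 0.5
    return cov / (sa * sb)

def residuals(y, z):
    # Residuals of a simple least-squares regression of y on z.
    n = len(y)
    mz, my = sum(z) / n, sum(y) / n
    slope = sum((u - mz) * (v - my) for u, v in zip(z, y)) / sum((u - mz) ** 2 for u in z)
    intercept = my - slope * mz
    return [v - (intercept + slope * u) for u, v in zip(z, y)]

def partial_corr(x, y, z):
    return pearson(residuals(x, z), residuals(y, z))

# Toy demonstration: x and y are both driven by the confound z, so their
# zero-order correlation is high, but the partial correlation vanishes.
z = [1, 2, 3, 4, 5, 6, 7, 8]
e1 = [1, -1, -1, 1, 1, -1, -1, 1]  # orthogonal to z and to e2
e2 = [1, 1, -1, -1, -1, -1, 1, 1]
x = [a + b for a, b in zip(z, e1)]
y = [a + b for a, b in zip(z, e2)]
print(pearson(x, y))        # high zero-order correlation (0.84 here)
print(partial_corr(x, y, z))  # ~0 once z is controlled for
```

This mirrors why the zero-order correlations reported in earlier studies can shrink to non-significance once the highly intercorrelated anxiety subtypes are partialled out.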
Affiliation(s)
- Yuya Fujihara: Department of Psychology, Yasuda Women's University, Japan
- Kun Guo: School of Psychology, University of Lincoln, Brayford Pool, Lincoln, Lincolnshire LN6 7TS, United Kingdom
- Chang Hong Liu: Department of Psychology, Bournemouth University, United Kingdom
4. Todd E, Subendran S, Wright G, Guo K. Emotion category-modulated interpretation bias in perceiving ambiguous facial expressions. Perception 2023; 52:695-711. PMID: 37427421. PMCID: PMC10510303. DOI: 10.1177/03010066231186936.
Abstract
In contrast to prototypical facial expressions, we show less perceptual tolerance in perceiving vague expressions, demonstrating an interpretation bias such as more frequent perception of anger or happiness when categorizing ambiguous expressions of angry and happy faces that are morphed in different proportions and displayed under high- or low-quality conditions. However, it remains unclear whether this interpretation bias is specific to emotion categories or reflects a general negativity versus positivity bias, and whether the degree of this bias is affected by the valence or category of the two morphed expressions. These questions were examined in two eye-tracking experiments by systematically manipulating expression ambiguity and image quality in fear- and sad-happiness faces (Experiment 1) and by directly comparing anger-, fear-, sadness-, and disgust-happiness expressions (Experiment 2). We found that increasing expression ambiguity and degrading image quality induced a general negativity versus positivity bias in expression categorization. The degree of negativity bias, the associated reaction time, and face-viewing gaze allocation were further modulated by the different expression combinations. Although we show a viewing condition-dependent bias in interpreting vague facial expressions that display valence-contradicting expressive cues, the perception of these ambiguous expressions appears to be guided by a categorical process similar to that involved in perceiving prototypical expressions.
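The "morphed in different proportions" manipulation can be illustrated in its simplest form: a pixel-wise linear blend of two aligned face images, where the mixing proportion p controls expression ambiguity (p = 0.5 is maximally ambiguous). This is a hedged sketch, not the authors' stimulus pipeline; real morphing software also warps facial geometry, and the "images" below are hypothetical toy intensity lists.

```python
# Sketch: linear expression morphing (illustrative only; real morphs also
# warp facial landmarks, this shows only the intensity blend).

def morph(img_a, img_b, p):
    """Blend two equal-sized grayscale images: p = 0 gives img_a, p = 1 gives img_b."""
    assert len(img_a) == len(img_b) and 0.0 <= p <= 1.0
    return [(1.0 - p) * a + p * b for a, b in zip(img_a, img_b)]

# Toy 2x2 "images" flattened to lists of pixel intensities (hypothetical).
happy = [0.0, 0.2, 0.8, 1.0]
angry = [1.0, 0.8, 0.2, 0.0]
print(morph(happy, angry, 0.3))  # mostly-happy morph
print(morph(happy, angry, 0.5))  # maximally ambiguous halfway morph
```

Varying p in small steps yields the continuum of ambiguous expressions that participants then categorize, which is how the bias toward one emotion label can be measured as a function of morph proportion.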
5. Folz J, Akdağ R, Nikolić M, van Steenbergen H, Kret ME. Facial mimicry and metacognitive judgments in emotion recognition are distinctly modulated by social anxiety and autistic traits. Sci Rep 2023; 13:9730. PMID: 37322077. PMCID: PMC10272184. DOI: 10.1038/s41598-023-35773-6.
Abstract
Facial mimicry, as well as the accurate assessment of one's performance when judging others' emotional expressions, has been suggested to inform successful emotion recognition. Differences in the integration of these two information sources might explain alterations in the perception of others' emotions in individuals with Social Anxiety Disorder and individuals on the autism spectrum. Using a non-clinical sample (N = 57), we examined the role of social anxiety and autistic traits in the link between facial mimicry, or confidence in one's performance, and emotion recognition. While participants were presented with videos of spontaneous emotional facial expressions, we measured their facial muscle activity and asked them to label the expressions and to indicate their confidence in labelling them accurately. Our results showed that confidence in emotion recognition was lower with higher social anxiety traits, even though actual recognition was not related to social anxiety traits. Higher autistic traits, in contrast, were associated with worse recognition and a weakened link between facial mimicry and performance. Consequently, high social anxiety traits might not affect emotion recognition itself, but rather the top-down evaluation of one's own abilities in emotion recognition contexts. High autistic traits, in contrast, may be related to lower integration of sensorimotor simulations, which promote emotion recognition.
Affiliation(s)
- Julia Folz, Rüya Akdağ, Milica Nikolić, Henk van Steenbergen, Mariska E Kret: Department of Cognitive Psychology, Institute of Psychology, Leiden University, Leiden, The Netherlands; Leiden Institute for Brain and Cognition (LIBC), Leiden University, Leiden, The Netherlands
- Milica Nikolić: also Research Institute of Child Development and Education, University of Amsterdam, Amsterdam, The Netherlands
6. Akiyama T, Matsumoto K, Osaka K, Tanioka R, Betriana F, Zhao Y, Kai Y, Miyagawa M, Yasuhara Y, Ito H, Soriano G, Tanioka T. Comparison of subjective facial emotion recognition and "facial emotion recognition based on multi-task cascaded convolutional network face detection" between patients with schizophrenia and healthy participants. Healthcare (Basel) 2022; 10:2363. PMID: 36553887. PMCID: PMC9777528. DOI: 10.3390/healthcare10122363.
Abstract
Patients with schizophrenia may exhibit a flat affect and poor facial expressions. This study aimed to compare subjective facial emotion recognition (FER) and FER based on multi-task cascaded convolutional network (MTCNN) face detection in 31 patients with schizophrenia (patient group) and 40 healthy participants (healthy participant group). A Pepper robot was used to converse with the 71 participants; these conversations were recorded on video. Subjective FER (assigned by medical experts based on the video recordings) and FER based on MTCNN face detection were used to characterize facial expressions during the conversations. This study confirmed the discriminant accuracy of FER based on MTCNN face detection. The analysis of the smiles of healthy participants revealed that subjective FER (by six examiners) and FER based on MTCNN face detection concurred (κ = 0.63). The perfect agreement rate between subjective FER (by three medical experts) and FER based on MTCNN face detection in the patient and healthy participant groups was analyzed using Fisher's exact probability test, with no significant difference observed (p = 0.72). Validity and reliability were assessed by comparing subjective FER with FER based on MTCNN face detection. The reliability coefficient of FER based on MTCNN face detection was low for both the patient and healthy participant groups.
Affiliation(s)
- Toshiya Akiyama: Graduate School of Health Sciences, Tokushima University, Tokushima 770-8509, Japan
- Kazuyuki Matsumoto: Graduate School of Engineering, Tokushima University, Tokushima 770-8506, Japan
- Kyoko Osaka: Department of Psychiatric Nursing, Nursing Course of Kochi Medical School, Kochi University, Kochi 783-8505, Japan
- Ryuichi Tanioka: Department of Physical Therapy, Hiroshima Cosmopolitan University, Hiroshima 734-0014, Japan
- Yueren Zhao: Department of Psychiatry, Fujita Health University, Nagoya 470-1192, Japan
- Yoshihiro Kai: Department of Mechanical Engineering, Tokai University, Tokyo 151-8677, Japan
- Misao Miyagawa: Department of Nursing, Faculty of Health and Welfare, Tokushima Bunri University, Tokushima 770-8514, Japan
- Yuko Yasuhara: Institute of Biomedical Sciences, Tokushima University, Tokushima 770-8509, Japan
- Hirokazu Ito: Institute of Biomedical Sciences, Tokushima University, Tokushima 770-8509, Japan
- Gil Soriano: Department of Nursing, College of Allied Health, National University Philippines, Manila 1008, Philippines
- Tetsuya Tanioka (corresponding author): Institute of Biomedical Sciences, Tokushima University, Tokushima 770-8509, Japan
7. Kim M, Cho Y, Kim SY. Effects of diagnostic regions on facial emotion recognition: The moving window technique. Front Psychol 2022; 13:966623. PMID: 36186300. PMCID: PMC9518794. DOI: 10.3389/fpsyg.2022.966623.
Abstract
With regard to facial emotion recognition, previous studies have found that specific facial regions are attended to more in order to identify certain emotions. We investigated whether a preferential search for emotion-specific diagnostic regions contributes to the accurate recognition of facial emotions. Twenty-three neurotypical adults performed an emotion recognition task using six basic emotions: anger, disgust, fear, happiness, sadness, and surprise. The participants' exploration patterns for the faces were measured using the Moving Window Technique (MWT). This technique presents a small window on a blurred face, and the participants explored the face stimuli through a mouse-controlled window in order to recognize the emotions on the face. Our results revealed that when participants explored the diagnostic regions for each emotion more frequently, correct recognition of the emotions occurred at a faster rate. To the best of our knowledge, this study is the first to present evidence that exploration of emotion-specific diagnostic regions can predict the reaction time of accurate emotion recognition among neurotypical adults. Such findings can be further applied in the evaluation and/or training of emotion recognition in both typically and atypically developing children with emotion recognition difficulties.
Affiliation(s)
- Minhee Kim: Department of Psychology, Duksung Women's University, Seoul, South Korea
- Youngwug Cho: Department of Computer Science, Hanyang University, Seoul, South Korea
- So-Yeon Kim (corresponding author): Department of Psychology, Duksung Women's University, Seoul, South Korea
8. Guo K, Hare A, Liu CH. Impact of Face Masks and Viewers' Anxiety on Ratings of First Impressions from Faces. Perception 2021; 51:37-50. PMID: 34904869. PMCID: PMC8772253. DOI: 10.1177/03010066211065230.
Abstract
Face masks are now a common feature of our social environment. Although face covering reduces our ability to recognize others' facial identity and facial expressions, little is known about its impact on the formation of first impressions from faces. In two online experiments, we presented unfamiliar faces displaying neutral expressions with and without face masks, and participants rated the perceived approachableness, trustworthiness, attractiveness, and dominance of each face on a 9-point scale. Their anxiety levels were measured by the State-Trait Anxiety Inventory and the Social Interaction Anxiety Scale. In comparison with the mask-off condition, wearing face masks (mask-on) significantly increased perceived approachableness and trustworthiness ratings, but had little impact on increasing attractiveness or decreasing dominance ratings. Furthermore, both trait and state anxiety scores were negatively correlated with approachableness and trustworthiness ratings in both mask-off and mask-on conditions. Social anxiety scores, on the other hand, were negatively correlated with approachableness but not with trustworthiness ratings. It seems that the presence of a face mask can alter our first impressions of strangers. Although the ratings for approachableness, trustworthiness, attractiveness, and dominance were positively correlated, they appeared to be distinct constructs that were differentially influenced by face coverings and by participants' anxiety types and levels.
Affiliation(s)
- Kun Guo: School of Psychology, University of Lincoln, UK
9. Gehrer NA, Zajenkowska A, Bodecka M, Schönenberg M. Attention orienting to the eyes in violent female and male offenders: An eye-tracking study. Biol Psychol 2021; 163:108136. PMID: 34129874. DOI: 10.1016/j.biopsycho.2021.108136.
Abstract
Attention to the eyes and eye contact form an important basis for the development of empathy and social competences including prosocial behavior. Thus, impairments in attention to the eyes of an interaction partner might play a role in the etiology of antisocial behavior and violence. For the first time, the present study extends investigations of eye gaze to a large sample (N = 173) including not only male but also female violent offenders and a control group. We assessed viewing patterns during the categorization of emotional faces via eye tracking. Our results indicate a reduced frequency of initial attention shifts to the eyes in female and male offenders compared to controls, while there were no general group differences in overall attention to the eye region (i.e., relative dwell time). Thus, we conclude that violent offenders might be able to compensate for deficits in spontaneous attention orienting during later stages of information processing.
Affiliation(s)
- Nina A Gehrer: University of Tübingen, Department of Clinical Psychology and Psychotherapy, Tübingen, Germany
- Anna Zajenkowska: Maria Grzegorzewska University, Department of Psychology, Warsaw, Poland
- Marta Bodecka: Maria Grzegorzewska University, Department of Psychology, Warsaw, Poland
- Michael Schönenberg: University of Tübingen, Department of Clinical Psychology and Psychotherapy, Tübingen, Germany; University Hospital Tübingen, Department of Psychiatry and Psychotherapy, Tübingen, Germany
10. Kinchella J, Guo K. Facial Expression Ambiguity and Face Image Quality Affect Differently on Expression Interpretation Bias. Perception 2021; 50:328-342. PMID: 33709837. DOI: 10.1177/03010066211000270.
Abstract
We often show an invariant or comparable recognition performance for perceiving prototypical facial expressions, such as happiness and anger, under different viewing settings. However, it is unclear to what extent the categorisation of ambiguous expressions and associated interpretation bias are invariant in degraded viewing conditions. In this exploratory eye-tracking study, we systematically manipulated both facial expression ambiguity (via morphing happy and angry expressions in different proportions) and face image clarity/quality (via manipulating image resolution) to measure participants' expression categorisation performance, perceived expression intensity, and associated face-viewing gaze distribution. Our analysis revealed that increasing facial expression ambiguity and decreasing face image quality induced the opposite direction of expression interpretation bias (negativity vs. positivity bias, or increased anger vs. increased happiness categorisation), the same direction of deterioration impact on rating expression intensity, and qualitatively different influence on face-viewing gaze allocation (decreased gaze at eyes but increased gaze at mouth vs. stronger central fixation bias). These novel findings suggest that in comparison with prototypical facial expressions, our visual system has less perceptual tolerance in processing ambiguous expressions which are subject to viewing condition-dependent interpretation bias.
11. Ruba AL, Pollak SD. Children's emotion inferences from masked faces: Implications for social interactions during COVID-19. PLoS One 2020; 15:e0243708. PMID: 33362251. PMCID: PMC7757816. DOI: 10.1371/journal.pone.0243708.
Abstract
To slow the progression of COVID-19, the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO) have recommended wearing face coverings. However, very little is known about how occluding parts of the face might affect the emotion inferences that children make during social interactions. The current study recruited a racially diverse sample of school-aged (7- to 13-year-old) children from publicly funded after-school programs. Children made inferences from facial configurations that were not covered, that wore sunglasses occluding the eyes, or that wore surgical masks occluding the mouth. Children were still able to make accurate inferences about emotions, even when parts of the faces were covered. These data suggest that although others' masks may pose some challenges for children, in combination with other contextual cues, masks are unlikely to dramatically impair children's social interactions in their everyday lives.
Affiliation(s)
- Ashley L. Ruba: Department of Psychology and Waisman Center, University of Wisconsin-Madison, Madison, Wisconsin, United States of America
- Seth D. Pollak: Department of Psychology and Waisman Center, University of Wisconsin-Madison, Madison, Wisconsin, United States of America
12. Pavic K, Oker A, Chetouani M, Chaby L. Age-related changes in gaze behaviour during social interaction: An eye-tracking study with an embodied conversational agent. Q J Exp Psychol (Hove) 2020; 74:1128-1139. PMID: 33283649. DOI: 10.1177/1747021820982165.
Abstract
Previous research has highlighted age-related differences in social perception, in particular emotional expression processing. To date, such studies have largely focused on approaches that use static emotional stimuli that the participant has to identify passively without the possibility of any interaction. In this study, we propose an interactive virtual environment to better address age-related variations in social and emotional perception. A group of 22 young (18-30 years) and 20 older (60-80 years) adults were engaged in a face-to-face conversation with an embodied conversational agent. Participants were invited to interact naturally with the agent and to identify his facial expression. Their gaze behaviour was captured by an eye-tracking device throughout the interaction. We also explored whether the Big Five personality traits (particularly extraversion) and anxiety modulated gaze during the social interaction. Findings suggested that age-related differences in gaze behaviour were only apparent when decoding social signals (i.e., listening to a partner's question, identifying facial expressions) and not when communicating social information (i.e., when speaking). Furthermore, higher extraversion levels consistently led to a shorter amount of time gazing towards the eyes, whereas higher anxiety levels led to slight modulations of gaze only when participants were listening to questions. Face-to-face conversation with virtual agents can provide a more naturalistic framework for the assessment of online socio-emotional interaction in older adults, which is not easily observable in classical offline paradigms. This study provides novel and important insights into the specific circumstances in which older adults may experience difficulties in social interactions.
Affiliation(s)
- Katarina Pavic: Institut de psychologie, Université de Paris, Boulogne-Billancourt, France; Université de Paris, VAC, Boulogne-Billancourt, France
- Ali Oker: Laboratoire Cognition Santé Société (EA 6291), Université de Reims Champagne-Ardenne, Reims, France
- Mohamed Chetouani: Institut des systèmes intelligents et de robotique (ISIR), Sorbonne Université, CNRS UMR7222, Paris, France
- Laurence Chaby: Institut de psychologie, Université de Paris, Boulogne-Billancourt, France; Institut des systèmes intelligents et de robotique (ISIR), Sorbonne Université, CNRS UMR7222, Paris, France
13. Lebert A, Chaby L, Garnot C, Vergilino-Perez D. The impact of emotional videos and emotional static faces on postural control through a personality trait approach. Exp Brain Res 2020; 238:2877-2886. DOI: 10.1007/s00221-020-05941-5.
14. Stevenson N, Guo K. Image Valence Modulates the Processing of Low-Resolution Affective Natural Scenes. Perception 2020; 49:1057-1068. PMID: 32924858. DOI: 10.1177/0301006620957213.
Abstract
In natural vision, noisy and distorted visual inputs often change our perceptual strategy in scene perception. However, it is unclear to what extent the affective meaning embedded in degraded natural scenes modulates our scene understanding and associated eye movements. In this eye-tracking experiment, by presenting natural scene images with different categories and levels of emotional valence (high-positive, medium-positive, neutral/low-positive, medium-negative, and high-negative), we systematically investigated human participants' perceptual sensitivity (image valence categorization and arousal rating) and image-viewing gaze behaviour in response to changes in image resolution. Our analysis revealed that reducing image resolution led to decreased valence recognition and arousal rating, a decreased number of fixations in image viewing but increased individual fixation duration, and a stronger central fixation bias. Furthermore, these distortion effects were modulated by scene valence, with less deterioration impact on the valence categorization of negatively valenced scenes and on gaze behaviour when viewing highly emotionally charged (high-positive and high-negative) scenes. It seems that our visual system shows a valence-modulated susceptibility to image distortions in scene perception.
15. Guo K, Calver L, Soornack Y, Bourke P. Valence-dependent Disruption in Processing of Facial Expressions of Emotion in Early Visual Cortex: A Transcranial Magnetic Stimulation Study. J Cogn Neurosci 2020; 32:906-916. DOI: 10.1162/jocn_a_01520.
Abstract
Our visual inputs are often entangled with affective meanings in natural vision, implying extensive interaction between visual and emotional processing. However, little is known about the neural mechanism underlying this interaction. This exploratory transcranial magnetic stimulation (TMS) study examined the possible involvement of the early visual cortex (EVC, Areas V1/V2/V3) in perceiving facial expressions of different emotional valences. Across three experiments, single-pulse TMS was delivered at different time windows (50–150 msec) after a brief 10-msec presentation of face images, and participants reported the visibility and perceived emotional valence of the faces. Interestingly, earlier TMS at ∼90 msec reduced only face visibility, irrespective of the displayed expression, whereas later TMS at ∼120 msec selectively disrupted the recognition of negative facial expressions, indicating involvement of the EVC in processing negative expressions at a later time window, possibly beyond the initial feedforward processing of facial structure information. The observed TMS effect was further modulated by individuals' anxiety level: TMS at ∼110–120 msec disrupted anger recognition significantly more in those scoring relatively low in trait anxiety than in high scorers, suggesting that cognitive bias influences the processing of facial expressions in the EVC. Taken together, the EVC appears to be involved in the structural encoding of (at least) negative facial emotional valence, such as fear and anger, possibly under modulation from higher cortical areas.
16
Maza A, Moliner B, Ferri J, Llorens R. Visual Behavior, Pupil Dilation, and Ability to Identify Emotions From Facial Expressions After Stroke. Front Neurol 2020; 10:1415. [PMID: 32116988] [PMCID: PMC7016192] [DOI: 10.3389/fneur.2019.01415] [Received: 07/12/2019] [Accepted: 12/27/2019] [Indexed: 11/16/2022] Open
Abstract
Social cognition is the innate human ability to interpret the emotional state of others from contextual verbal and non-verbal information, and to self-regulate accordingly. Facial expressions are one of the most relevant sources of non-verbal communication, and their interpretation has been extensively investigated using both behavioral and physiological measures, such as those derived from visual activity and visual responses. Decoding facial expressions of emotion involves conscious and unconscious cognitive processes supported by a complex brain network that can be damaged after cerebrovascular accidents. A diminished ability to identify facial expressions of emotion has been reported after stroke, and has traditionally been attributed to impaired emotional processing. While this may be true, altered visual behavior after brain injury could also contribute negatively to this ability. This study investigated the accuracy, distribution of responses, visual behavior, and pupil dilation of individuals with stroke while they identified emotional facial expressions. Our results corroborated impaired performance after stroke and revealed decreased attention to the eyes, evidenced by less time and fewer fixations in this area compared with healthy subjects, alongside comparable pupil dilation. The differences in visual behavior reached statistical significance for some emotions when individuals with stroke and impaired performance were compared with healthy subjects, but not when individuals post-stroke with comparable performance were considered. This performance dependence of visual behavior, although not determinant, might indicate that altered visual behavior is a negatively contributing factor in emotion recognition from facial expressions.
Affiliation(s)
- Anny Maza: Neurorehabilitation and Brain Research Group, Instituto de Investigación e Innovación en Bioingeniería, Universitat Politècnica de València, Valencia, Spain
- Belén Moliner: NEURORHB, Servicio de Neurorrehabilitación de Hospitales Vithas, Valencia, Spain
- Joan Ferri: NEURORHB, Servicio de Neurorrehabilitación de Hospitales Vithas, Valencia, Spain
- Roberto Llorens: Neurorehabilitation and Brain Research Group, Instituto de Investigación e Innovación en Bioingeniería, Universitat Politècnica de València, Valencia, Spain; NEURORHB, Servicio de Neurorrehabilitación de Hospitales Vithas, Valencia, Spain
17
Guo K, Soornack Y, Settle R. Expression-dependent susceptibility to face distortions in processing of facial expressions of emotion. Vision Res 2018; 157:112-122. [PMID: 29496513] [DOI: 10.1016/j.visres.2018.02.001] [Received: 10/06/2017] [Revised: 02/02/2018] [Accepted: 02/04/2018] [Indexed: 11/29/2022]
Abstract
Our capability of recognizing facial expressions of emotion under different viewing conditions implies the existence of an invariant expression representation. As natural visual signals are often distorted and our perceptual strategy changes with external noise level, it is essential to understand how expression perception is susceptible to face distortion and whether the same facial cues are used to process high- and low-quality face images. We systematically manipulated face image resolution (experiment 1) and blur (experiment 2), and measured participants' expression categorization accuracy, perceived expression intensity, and associated gaze patterns. Our analysis revealed a reasonable tolerance to face distortion in expression perception. Reducing image resolution to as low as 48 × 64 pixels, or increasing image blur to as much as 15 cycles/image, had little impact on expression assessment and the associated gaze behaviour. Further distortion led to decreased expression categorization accuracy and intensity ratings, increased reaction time and fixation duration, and a stronger central fixation bias that was not driven by distortion-induced changes in local image saliency. Interestingly, the observed distortion effects were expression-dependent, with less deterioration for happy and surprise expressions, suggesting that this distortion-invariant facial expression perception might be achieved through a categorical model involving a non-linear configural combination of local facial features.
Affiliation(s)
- Kun Guo: School of Psychology, University of Lincoln, UK.