1
Urtado MB, Rodrigues RD, Fukusima SS. Visual Field Restriction in the Recognition of Basic Facial Expressions: A Combined Eye Tracking and Gaze Contingency Study. Behav Sci (Basel) 2024; 14:355. PMID: 38785846; PMCID: PMC11117586; DOI: 10.3390/bs14050355.
Abstract
Uncertainties and discrepant results in identifying the areas crucial for emotional facial expression recognition may stem from the eye tracking data analysis methods used. Many studies employ analysis parameters that predominantly prioritize the foveal visual angle, ignoring the potential influence of simultaneous parafoveal and peripheral information. To explore the possible underlying causes of these discrepancies, we investigated the role of the visual field aperture in emotional facial expression recognition with 163 volunteers randomly assigned to three groups: no visual restriction (NVR), parafoveal and foveal vision (PFFV), and foveal vision (FV). Employing eye tracking and gaze contingency, we collected visual inspection and judgment data over 30 frontal face images, equally distributed among five emotions. Raw eye tracking data were processed with Eye Movements Metrics and Visualizations (EyeMMV). Visual inspection time, number of fixations, and fixation duration all increased with visual field restriction. Accuracy, however, differed significantly between the NVR and FV groups and between the PFFV and FV groups, with no difference between NVR and PFFV. The findings underscore the impact of specific visual field areas on facial expression recognition, highlighting the importance of parafoveal vision. The results suggest that eye tracking data analysis methods should incorporate projection angles extending at least to the parafoveal level.
Affiliation(s)
- Melina Boratto Urtado
- Faculty of Philosophy, Sciences and Letters at Ribeirão Preto, University of São Paulo, Ribeirão Preto 14040-901, Brazil;
- Sergio Sheiji Fukusima
- Faculty of Philosophy, Sciences and Letters at Ribeirão Preto, University of São Paulo, Ribeirão Preto 14040-901, Brazil;
2
Hamlin N, Myers K, Taylor BK, Doucet GE. Role of Emotion Reactivity to Predict Facial Emotion Recognition Changes with Aging. Exp Aging Res 2023:1-18. PMID: 37660356; PMCID: PMC10908871; DOI: 10.1080/0361073x.2023.2254658.
Abstract
Emotional intelligence includes an assortment of factors related to emotion function, among them emotion recognition (in this case via facial expression), emotion trait, reactivity, and regulation. We aimed to investigate how subjective appraisals of emotional intelligence (i.e., trait, reactivity, and regulation) are associated with objective emotion recognition accuracy, and how these associations differ between young and older adults. Data were extracted from the CamCAN dataset (189 adults: 57 young/118 older) from assessments measuring these emotion constructs. Using linear regression models, we found that greater negative reactivity was associated with better emotion recognition accuracy among older adults, whereas the pattern was opposite for young adults, with the greatest difference in disgust and surprise recognition. Positive reactivity and depression level predicted surprise recognition, with the associations differing significantly between the age groups. The present findings suggest that the degree to which older and young adults react to emotional stimuli differentially predicts their ability to correctly identify facial emotion expressions. Older adults with higher negative reactivity may be able to integrate their negative emotions effectively in order to recognize others' negative emotions more accurately. Alternatively, young adults may experience interference from negative reactivity, lowering their ability to recognize others' negative emotions.
Affiliation(s)
- Noah Hamlin
- Institute for Human Neuroscience, Boys Town National Research Hospital, Omaha, NE
- Katrina Myers
- Institute for Human Neuroscience, Boys Town National Research Hospital, Omaha, NE
- Brittany K. Taylor
- Institute for Human Neuroscience, Boys Town National Research Hospital, Omaha, NE
- Department of Pharmacology and Neuroscience, Creighton University, Omaha, NE
- Gaelle E. Doucet
- Institute for Human Neuroscience, Boys Town National Research Hospital, Omaha, NE
- Department of Pharmacology and Neuroscience, Creighton University, Omaha, NE
3
Chaudhary S, Zhang S, Zhornitsky S, Chen Y, Chao HH, Li CSR. Age-related reduction in trait anxiety: Behavioral and neural evidence of automaticity in negative facial emotion processing. Neuroimage 2023; 276:120207. PMID: 37263454; PMCID: PMC10330646; DOI: 10.1016/j.neuroimage.2023.120207.
Abstract
Trait anxiety diminishes with age, which may result from an age-related decline in registering salient emotional stimuli and/or an enhancement in emotion regulation. We tested these hypotheses in 88 adults, 21 to 85 years of age, studied with fMRI of the Hariri task. Age-related decline in stimulus registration would manifest in delayed reaction time (RT) and diminished saliency circuit activity in response to emotional vs. neutral stimuli. Enhanced control of negative emotions would manifest in diminished limbic/emotional circuit responses and higher prefrontal cortical (PFC) responses to negative emotion. The results showed that anxiety was negatively correlated with age. Age was associated with faster RT and diminished activation of the medial PFC, in the area of the dorsal and rostral anterior cingulate cortex (dACC/rACC) - a hub of the saliency circuit - during matching of negative, but not positive, vs. neutral emotional faces. A slope test confirmed the differences between the regressions. Further, age was not associated with activation of the PFC in whole-brain regression or in region-of-interest analysis of the dorsolateral PFC, an area identified from meta-analyses of the emotion regulation literature. Together, the findings fail to support either hypothesis; rather, they suggest age-related automaticity in processing negative emotions as a potential mechanism of diminished anxiety. Automaticity results in faster RT and diminished anterior cingulate activity in response to negative but not positive emotional stimuli. In support, psychophysiological interaction analyses demonstrated higher dACC/rACC connectivity with the default mode network, which has been implicated in automaticity of information processing. As age increased, individuals demonstrated faster RT with higher connectivity during matching of negative vs. neutral images. Automaticity in negative emotion processing needs to be investigated as a mechanism of age-related reduction in anxiety.
Affiliation(s)
- Shefali Chaudhary
- Department of Psychiatry, Yale University School of Medicine, New Haven, CT 06519, United States
- Sheng Zhang
- Department of Psychiatry, Yale University School of Medicine, New Haven, CT 06519, United States
- Simon Zhornitsky
- Department of Psychiatry, Yale University School of Medicine, New Haven, CT 06519, United States
- Yu Chen
- Department of Psychiatry, Yale University School of Medicine, New Haven, CT 06519, United States
- Herta H Chao
- VA Connecticut Healthcare System, West Haven, CT 06516, United States; Department of Medicine, Yale University School of Medicine, New Haven, CT 06519, United States
- Chiang-Shan R Li
- Department of Psychiatry, Yale University School of Medicine, New Haven, CT 06519, United States; Department of Neuroscience, Yale University School of Medicine, New Haven, CT 06520, United States; Wu Tsai Institute, Yale University, New Haven, CT 06520, United States
4
Kim M, Cho Y, Kim SY. Effects of diagnostic regions on facial emotion recognition: The moving window technique. Front Psychol 2022; 13:966623. PMID: 36186300; PMCID: PMC9518794; DOI: 10.3389/fpsyg.2022.966623.
Abstract
With regard to facial emotion recognition, previous studies have found that specific facial regions are attended to more in order to identify certain emotions. We investigated whether a preferential search for emotion-specific diagnostic regions contributes to the accurate recognition of facial emotions. Twenty-three neurotypical adults performed an emotion recognition task using six basic emotions: anger, disgust, fear, happiness, sadness, and surprise. The participants' exploration patterns for the faces were measured using the Moving Window Technique (MWT). This technique presents a small window on a blurred face, and the participants explored the face stimuli through a mouse-controlled window in order to recognize the emotions on the face. Our results revealed that when participants explored the diagnostic regions for each emotion more frequently, correct recognition of the emotions occurred at a faster rate. To the best of our knowledge, the current study is the first to present evidence that exploration of emotion-specific diagnostic regions can predict the reaction time of accurate emotion recognition among neurotypical adults. Such findings can be further applied in the evaluation and/or training (regarding emotion recognition functions) of both typically and atypically developing children with emotion recognition difficulties.
Affiliation(s)
- Minhee Kim
- Department of Psychology, Duksung Women’s University, Seoul, South Korea
- Youngwug Cho
- Department of Computer Science, Hanyang University, Seoul, South Korea
- So-Yeon Kim
- Department of Psychology, Duksung Women’s University, Seoul, South Korea
- Correspondence: So-Yeon Kim
5
Slessor G, Insch P, Donaldson I, Sciaponaite V, Adamowicz M, Phillips LH. Adult Age Differences in Using Information From the Eyes and Mouth to Make Decisions About Others' Emotions. J Gerontol B Psychol Sci Soc Sci 2022; 77:2241-2251. PMID: 35948271; PMCID: PMC9799183; DOI: 10.1093/geronb/gbac097.
Abstract
OBJECTIVES: Older adults are often less accurate than younger counterparts at identifying emotions such as anger, sadness, and fear from faces. They also look less at the eyes and more at the mouth during emotion perception. The current studies advance understanding of the nature of these age effects on emotional processing.
METHODS: Younger and older participants identified emotions from pictures of eyes or mouths (Experiment 1) and incongruent mouth-eyes emotion combinations (Experiment 2). In Experiment 3, participants categorized emotions from pictures in which face masks covered the mouth region.
RESULTS: Older adults were worse than young at identifying anger and sadness from the eyes, but better at identifying the same emotions from the mouth region (Experiment 1), and they were more likely than young to use information from the mouth to classify anger, fear, and disgust (Experiment 2). In Experiment 3, face masks impaired perception of anger, sadness, and fear more for older than for younger adults.
DISCUSSION: These studies indicate that older people are more able than young to interpret emotional information from the mouth, are more biased to use information from the mouth, and have more difficulty in emotion perception when the mouth is covered with a face mask. This has implications for social communication in different age groups.
Affiliation(s)
- Pauline Insch
- School of Psychology, University of Aberdeen, Aberdeen, UK
- Isla Donaldson
- School of Psychology, University of Aberdeen, Aberdeen, UK
- Louise H Phillips
- Address correspondence to: Louise Phillips, PhD, School of Psychology, University of Aberdeen, Aberdeen AB24 3FX, UK.
6
Msika EF, Ehrlé N, Gaston-Bellegarde A, Orriols E, Piolino P, Narme P. Using a Computer-Based Virtual Environment to Assess Social Cognition in Aging: An Exploratory Study of the REALSoCog Task. Front Psychol 2022; 13:882165. PMID: 35664139; PMCID: PMC9157049; DOI: 10.3389/fpsyg.2022.882165.
Abstract
Although previous studies have suggested that some component processes of social cognition decline in normal aging, several methodological limitations can be pointed out: traditional sociocognitive tasks assess processes separately and lack ecological validity. In the present study, the main aim was to propose an integrative social cognition assessment in normal aging using an original computer-based task developed in non-immersive virtual reality. Forty-five young adults (YA) and 50 older adults (OA) were asked to navigate in a simulated city environment and to judge several situations that they encountered. These situations investigated social norms by displaying control or (conventional/moral) transgressions. Following each situation, the participants were asked several questions in order to assess their ability to make moral judgments, affective and cognitive theory of mind (ToM), emotional reactivity and empathy, and the propensity to act in a socially appropriate or inappropriate way. The main results showed (i) a preserved ability to detect moral and conventional transgressions with advancing age; (ii) preserved cognitive ToM abilities; (iii) an age-related decline in affective ToM that disappeared when the victim was a senior; (iv) preserved emotional reactivity and emotional empathy in normal aging; and (v) an increase in inappropriate behavioral intentions in normal aging. Offering more naturalistic conditions, this new task is an interesting integrative measure of sociocognitive functioning to better reflect social behavior in daily living.
Affiliation(s)
- Eva-Flore Msika
- MC2Lab (UR 7536), Institut de Psychologie, Université Paris Cité, Paris, France
- Nathalie Ehrlé
- MC2Lab (UR 7536), Institut de Psychologie, Université Paris Cité, Paris, France; Service de Neurologie, CHRU Maison-Blanche, Reims, France
- Eric Orriols
- MC2Lab (UR 7536), Institut de Psychologie, Université Paris Cité, Paris, France
- Pascale Piolino
- MC2Lab (UR 7536), Institut de Psychologie, Université Paris Cité, Paris, France
- Pauline Narme
- MC2Lab (UR 7536), Institut de Psychologie, Université Paris Cité, Paris, France
7
Low ACY, Oh VYS, Tong EMW, Scarf D, Ruffman T. Older adults have difficulty decoding emotions from the eyes, whereas easterners have difficulty decoding emotion from the mouth. Sci Rep 2022; 12:7408. PMID: 35524152; PMCID: PMC9076610; DOI: 10.1038/s41598-022-11381-8.
Abstract
Older adults and Easterners have worse emotion recognition (than young adults and Westerners, respectively), but the question of why remains unanswered. Older adults look less at the eyes, whereas Easterners look less at the mouth, raising the possibility that compelling older adults to look at the eyes, and Easterners to look at the mouth, might improve recognition. We tested this by comparing emotion recognition in 108 young adults and 109 older adults from New Zealand and Singapore using (a) the eyes on their own, (b) the mouth on its own, or (c) the full face. Older adults were worse than young adults on 4/6 emotions with the Eyes Only stimuli, but only 1/6 emotions with the Mouth Only stimuli. In contrast, Easterners were worse than Westerners on 6/6 emotions for the Mouth Only and Full Face stimuli, but were equal on all six emotions for the Eyes Only stimuli. These results provide a substantial leap forward because they point to the precise difficulty for older adults and Easterners. Older adults have more consistent difficulty identifying individual emotions in the eyes compared to the mouth, likely due to declining brain functioning, whereas Easterners have more consistent difficulty identifying emotions from the mouth than the eyes, likely due to inexperience inferring mouth information.
Affiliation(s)
- Anna C Y Low
- Department of Psychology, University of Otago, P.O. Box 56, Dunedin, 9054, New Zealand
- Vincent Y S Oh
- Department of Psychology, National University of Singapore, Block AS4, Level 2, 9 Arts Link, Singapore, 117570, Singapore
- Eddie M W Tong
- Department of Psychology, National University of Singapore, Block AS4, Level 2, 9 Arts Link, Singapore, 117570, Singapore
- Damian Scarf
- Department of Psychology, University of Otago, P.O. Box 56, Dunedin, 9054, New Zealand
- Ted Ruffman
- Department of Psychology, University of Otago, P.O. Box 56, Dunedin, 9054, New Zealand
8
Pavic K, Oker A, Chetouani M, Chaby L. Age-related changes in gaze behaviour during social interaction: An eye-tracking study with an embodied conversational agent. Q J Exp Psychol (Hove) 2020; 74:1128-1139. PMID: 33283649; DOI: 10.1177/1747021820982165.
Abstract
Previous research has highlighted age-related differences in social perception, in particular emotional expression processing. To date, such studies have largely focused on approaches that use static emotional stimuli that the participant has to identify passively without the possibility of any interaction. In this study, we propose an interactive virtual environment to better address age-related variations in social and emotional perception. A group of 22 young (18-30 years) and 20 older (60-80 years) adults were engaged in a face-to-face conversation with an embodied conversational agent. Participants were invited to interact naturally with the agent and to identify his facial expression. Their gaze behaviour was captured by an eye-tracking device throughout the interaction. We also explored whether the Big Five personality traits (particularly extraversion) and anxiety modulated gaze during the social interaction. Findings suggested that age-related differences in gaze behaviour were only apparent when decoding social signals (i.e., listening to a partner's question, identifying facial expressions) and not when communicating social information (i.e., when speaking). Furthermore, higher extraversion levels consistently led to a shorter amount of time gazing towards the eyes, whereas higher anxiety levels led to slight modulations of gaze only when participants were listening to questions. Face-to-face conversation with virtual agents can provide a more naturalistic framework for the assessment of online socio-emotional interaction in older adults, which is not easily observable in classical offline paradigms. This study provides novel and important insights into the specific circumstances in which older adults may experience difficulties in social interactions.
Affiliation(s)
- Katarina Pavic
- Institut de psychologie, Université de Paris, Boulogne-Billancourt, France; Université de Paris, VAC, Boulogne-Billancourt, France
- Ali Oker
- Laboratoire Cognition Santé Société (EA 6291), Université de Reims Champagne-Ardenne, Reims, France
- Mohamed Chetouani
- Institut des systèmes intelligents et de robotique (ISIR), Sorbonne Université, CNRS UMR7222, Paris, France
- Laurence Chaby
- Institut de psychologie, Université de Paris, Boulogne-Billancourt, France; Institut des systèmes intelligents et de robotique (ISIR), Sorbonne Université, CNRS UMR7222, Paris, France
9
Modelling User Preference for Embodied Artificial Intelligence and Appearance in Realistic Humanoid Robots. Informatics 2020. DOI: 10.3390/informatics7030028.
Abstract
Realistic humanoid robots (RHRs) with embodied artificial intelligence (EAI) have numerous applications in society as the human face is the most natural interface for communication and the human body the most effective form for traversing the manmade areas of the planet. Thus, developing RHRs with high degrees of human-likeness provides a life-like vessel for humans to physically and naturally interact with technology in a manner insurmountable to any other form of non-biological human emulation. This study outlines a human–robot interaction (HRI) experiment employing two automated RHRs with a contrasting appearance and personality. The selective sample group employed in this study is composed of 20 individuals, categorised by age and gender for a diverse statistical analysis. Galvanic skin response, facial expression analysis, and AI analytics permitted cross-analysis of biometric and AI data with participant testimonies to reify the results. This study concludes that younger test subjects preferred HRI with a younger-looking RHR and the more senior age group with an older looking RHR. Moreover, the female test group preferred HRI with an RHR with a younger appearance and male subjects with an older looking RHR. This research is useful for modelling the appearance and personality of RHRs with EAI for specific jobs such as care for the elderly and social companions for the young, isolated, and vulnerable.
10
Abbruzzese L, Magnani N, Robertson IH, Mancuso M. Age and Gender Differences in Emotion Recognition. Front Psychol 2019; 10:2371. PMID: 31708832; PMCID: PMC6819430; DOI: 10.3389/fpsyg.2019.02371.
Abstract
Background: Existing literature suggests that age affects recognition of affective facial expressions. Eye-tracking studies have highlighted that age-related differences in recognition of emotions could be explained by different face exploration patterns due to attentional impairment. Gender also seems to play a role in recognition of emotions. Unfortunately, little is known about differences in emotion perception abilities across the lifespan for men and women, even though females show greater ability from infancy.
Objective: The present study aimed to examine the role of age and gender in facial emotion recognition in relation to neuropsychological functions and face exploration strategies. We also aimed to explore the associations between emotion recognition and quality of life.
Methods: 60 healthy people were consecutively enrolled in the study and divided into two groups: Younger Adults and Older Adults. Participants were assessed for emotion recognition, attention abilities, frontal functioning, memory functioning, and quality of life satisfaction. During execution of the emotion recognition test, using the Pictures of Facial Affects (PoFA) and a modified version of the PoFA (M-PoFA), subjects' eye movements were recorded with an eye tracker.
Results: Significant differences between younger and older adults were detected for fear recognition when adjusted for cognitive functioning and eye-gaze fixation characteristics. Adjusted means of fear recognition were significantly higher in the younger group than in the older group. With regard to gender effects, older females recognized identical pairs of emotions better than older males. Considering the Satisfaction Profile (SAT-P), we detected negative correlations between some dimensions (physical functioning, sleep/feeding/free time) and emotion recognition (i.e., sadness and disgust).
Conclusion: The current study provided novel insights into the specific mechanisms that may explain differences in emotion recognition, examining how age and gender differences can be outlined by cognitive functioning and face exploration strategies.
Affiliation(s)
- Nadia Magnani
- Adult Mental Health Service, NHS-USL Tuscany South-Est, Grosseto, Italy
- Ian H Robertson
- Global Brain Health Institute, Trinity College Institute of Neuroscience, Trinity College Dublin, The University of Dublin, Dublin, Ireland
- Mauro Mancuso
- Tuscany Rehabilitation Clinic, Montevarchi, Italy; Physical and Rehabilitative Medicine Unit, NHS-USL Tuscany South-Est, Grosseto, Italy
11
Correction: Exploring emotional expression recognition in aging adults using the Moving Window Technique. PLoS One 2018; 13:e0208767. PMID: 30513125; PMCID: PMC6279025; DOI: 10.1371/journal.pone.0208767.
Abstract
[This corrects the article DOI: 10.1371/journal.pone.0205341.].