1. Xu HZ, Peng XR, Huan SY, Xu JJ, Yu J, Ma QG. Are older adults less generous? Age differences in emotion-related social decision making. Neuroimage 2024; 297:120756. [PMID: 39074759] [DOI: 10.1016/j.neuroimage.2024.120756]
Abstract
In social interaction, age-related differences in emotional processing may lead to varied social decision making between young and older adults. However, previous studies of social decision making have paid little attention to interactants' emotions, leaving age differences and their underlying neural mechanisms unexplored. To address this gap, the present study combined functional and structural magnetic resonance imaging with a modified dictator game task in which recipients displayed either neutral or sad facial expressions. Behavioral results indicated that although older adults' overall allocations did not differ significantly from those of young adults, older adults' allocations showed a decrease in emotion-related generosity compared to young adults. Using representational similarity analysis, we found that older adults showed reduced neural representations of recipients' emotions and reduced gray matter volume in the right anterior cingulate gyrus (ACC), right insula, and left dorsomedial prefrontal cortex (DMPFC) compared to young adults. More importantly, mediation analyses indicated that age influenced allocations not only through serial mediation of the neural representations in the right insula and left DMPFC, but also through serial mediation of the mean gray matter volume of the right ACC and left DMPFC. This study identifies potential neural pathways through which age affects emotion-related social decision making, advancing our understanding of older adults' social interaction behavior: they may not be less generous except when confronted with individuals displaying specific emotions.
Affiliation(s)
- Hong-Zhou Xu
- Faculty of Psychology, Southwest University, Chongqing 400715, China
- Xue-Rui Peng
- Faculty of Psychology, Technische Universität Dresden, Dresden 01062, Germany; Centre for Tactile Internet with Human-in-the-Loop, Technische Universität Dresden, Dresden 01062, Germany; Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig 04103, Germany
- Shen-Yin Huan
- Faculty of Psychology, Southwest University, Chongqing 400715, China
- Jia-Jie Xu
- Faculty of Psychology, Southwest University, Chongqing 400715, China
- Jing Yu
- Faculty of Psychology, Southwest University, Chongqing 400715, China
- Qing-Guo Ma
- Neuromanagement Laboratory, School of Management, Zhejiang University, Hangzhou 310058, China; Institute of Neural Management Sciences, Zhejiang University of Technology, Hangzhou 310014, China
2. Brener SA, Frankenhuis WE, Young ES, Ellis BJ. Social Class, Sex, and the Ability to Recognize Emotions: The Main Effect is in the Interaction. Pers Soc Psychol Bull 2024; 50:1197-1210. [PMID: 37013847] [DOI: 10.1177/01461672231159775]
Abstract
Previous research has demonstrated an inverse relation between subjective social class (SSC) and performance on emotion recognition tasks. Study 1 (N = 418) involved a preregistered replication of this effect using the Reading the Mind in the Eyes Task and the Cambridge Mindreading Face-Voice Battery. The inverse relation replicated; however, exploratory analyses revealed a significant interaction between sex and SSC in predicting emotion recognition, indicating that the effect was driven by males. In Study 2 (N = 745), we preregistered and tested the interaction on a separate archival dataset. The interaction replicated; the association between SSC and emotion recognition again occurred only in males. Exploratory analyses (Study 3; N = 381) examined the generalizability of the interaction to incidental face memory. Our results underscore the need to reevaluate previous research establishing the main effects of social class and sex on emotion recognition abilities, as these effects apparently moderate each other.
Affiliation(s)
- Willem E Frankenhuis
- Utrecht University, The Netherlands
- Max Planck Institute for the Study of Crime, Security and Law, Freiburg, Germany
3. Shoenfelt A, Pehlivanoglu D, Lin T, Ziaei M, Feifel D, Ebner NC. Effects of chronic intranasal oxytocin on visual attention to faces vs. natural scenes in older adults. Psychoneuroendocrinology 2024; 164:107018. [PMID: 38461634] [DOI: 10.1016/j.psyneuen.2024.107018]
Abstract
Aging is associated with changes in face processing, including desensitization to face cues like gaze direction and an attentional preference for faces with positive over negative emotional valence. A parallel line of research has shown that acute administration of oxytocin (OT) increases visual attention to social stimuli such as human faces. The current study examined effects of chronic OT administration among older adults on fixation duration to faces that varied in emotional expression, gaze direction, age, and sex. One hundred and twelve generally healthy older adults (aged 55-95 years) underwent a randomized, placebo-controlled, double-blind, between-subject clinical trial in which they self-administered either OT or placebo (P) intranasally twice a day for 4 weeks. The behavioral task, conducted before and after the intervention, involved rating the trustworthiness of faces (i.e., social stimuli) and natural scenes (i.e., non-social control stimuli) during eye tracking. Fixation duration to both the faces and the natural scenes declined from pre- to post-intervention; however, for faces but not scenes, this decline was less pronounced among older adults in the OT group than in the P group. Further, face cues (emotional expression, gaze direction, age, sex) did not moderate the treatment effect. This study provides first evidence that chronic intranasal OT maintains the salience of social cues over time in older adults, perhaps buffering effects of habituation. These findings enhance understanding of OT effects on social cognition among older adults; follow-up work with a young adult comparison group would help establish whether the observed effects are specific to older adults and reflect the aging process.
Affiliation(s)
- Alayna Shoenfelt
- Department of Psychology, University of Florida, P.O. Box 112250, Gainesville, FL 32611-2250, USA
- Didem Pehlivanoglu
- Department of Psychology, University of Florida, P.O. Box 112250, Gainesville, FL 32611-2250, USA
- Tian Lin
- Department of Psychology, University of Florida, P.O. Box 112250, Gainesville, FL 32611-2250, USA
- Maryam Ziaei
- Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology, Trondheim 7030, Norway; K.G. Jebsen Centre for Alzheimer's disease, Norwegian University of Science and Technology, Trondheim 7030, Norway
- David Feifel
- Department of Psychiatry, University of California San Diego, 9500 Gilman Dr, La Jolla, CA 92093, USA
- Natalie C Ebner
- Department of Psychology, University of Florida, P.O. Box 112250, Gainesville, FL 32611-2250, USA; Cognitive Aging and Memory Program, Clinical Translational Research Program, University of Florida, 2004 Mowry Road, Gainesville, FL 32611, USA; McKnight Brain Institute, University of Florida, 1149 Newell Drive, Gainesville, FL 32610, USA
4. Ebner NC, Horta M, El-Shafie D. New directions for studying the aging social-cognitive brain. Curr Opin Psychol 2024; 56:101768. [PMID: 38104362] [PMCID: PMC10939782] [DOI: 10.1016/j.copsyc.2023.101768]
Abstract
The study of social cognition has extended across the lifespan, with a recent special focus on the impacts of aging on the social-cognitive brain. This review summarizes current knowledge on social perception, theory of mind, empathy, and social behavior from a social-cognitive neuroscience of aging perspective and identifies new directions for studying the aging social-cognitive brain. These new directions highlight the need for (i) standardized operationalization and analysis of social-cognitive constructs; (ii) use of naturalistic paradigms to enhance the ecological validity of social-cognitive measures; (iii) application of repeated assessments via single-N designs for robust delineation of social-cognitive processes in the aging brain; and (iv) increased representation of vulnerable aging populations in social-cognitive brain research to enhance diversity, promote generalizability, and allow for cross-population comparisons.
Affiliation(s)
- Natalie C Ebner
- Department of Psychology, University of Florida, Gainesville, FL, USA; Institute on Aging, University of Florida, Gainesville, FL, USA; Center for Cognitive Aging and Memory, University of Florida, Gainesville, FL, USA
- Marilyn Horta
- Department of Psychology, University of Florida, Gainesville, FL, USA; Pain Research and Intervention Center of Excellence, University of Florida, Gainesville, FL, USA
- Dalia El-Shafie
- Department of Psychology, University of Florida, Gainesville, FL, USA
5. Burgio F, Menardi A, Benavides-Varela S, Danesin L, Giustiniani A, Van den Stock J, De Mitri R, Biundo R, Meneghello F, Antonini A, Vallesi A, de Gelder B, Semenza C. Facial emotion recognition in individuals with mild cognitive impairment: An exploratory study. Cogn Affect Behav Neurosci 2024. [PMID: 38316707] [DOI: 10.3758/s13415-024-01160-5]
Abstract
Understanding facial emotions is fundamental to interacting in social environments and modifying behavior accordingly. Neurodegenerative processes can progressively transform affective responses and affect social competence. This exploratory study examined the neurocognitive correlates of face recognition in individuals with two mild cognitive impairment (MCI) etiologies: MCI prodromal to dementia, or MCI consequent to Parkinson's disease (PD-MCI). Performance on the identification and memorization of neutral and emotional facial expressions was assessed in 31 individuals with MCI, 26 with PD-MCI, and 30 healthy controls (HC). Individuals with MCI exhibited selective impairment in recognizing faces expressing fear, along with difficulties in remembering both neutral and emotional faces. Conversely, individuals with PD-MCI showed no differences compared with the HC in either emotion recognition or memory. In MCI, no significant association emerged between memory for facial expressions and cognitive difficulties. In PD-MCI, regression analyses showed significant associations with higher-level cognitive functions in the emotional memory task, suggesting the presence of compensatory mechanisms. In a subset of participants, voxel-based morphometry revealed that performance on the emotional tasks correlated with regional changes in gray matter volume. Performance in matching negative expressions was predicted by volumetric changes in brain areas engaged in face and emotional processing, in particular increased volume in thalamic nuclei and atrophy in the right parietal cortex. Future studies should leverage neuroimaging data to determine whether differences in emotion recognition are mediated by pathology-specific atrophic patterns.
Affiliation(s)
- Arianna Menardi
- Department of Neuroscience, University of Padova, 35128, Padova, Italy
- Padova Neuroscience Center, University of Padova, 35129, Padova, Italy
- Silvia Benavides-Varela
- Padova Neuroscience Center, University of Padova, 35129, Padova, Italy
- Department of Developmental and Social Psychology, University of Padova, Padova, Italy
- Jan Van den Stock
- Department of Neuroscience, Leuven Brain Institute, KU Leuven, 3000, Leuven, Belgium
- Geriatric Psychiatry, University Psychiatric Center KU Leuven, 3000, Leuven, Belgium
- Roberta Biundo
- Department of General Psychology (DPG), University of Padua, 35131, Padua, Italy
- Study Center for Neurodegeneration (CESNE), University of Padua, 35131, Padua, Italy
- Francesca Meneghello
- Unità Operativa Complessa Cure Primarie Distretto 3 Mirano-Dolo, Aulss 3 Serenissima, Italy
- Angelo Antonini
- Parkinson's Disease and Movement Disorders Unit, Department of Neuroscience, Centre for Rare Neurological Diseases (ERN-RND), University of Padova, Padova, Italy
- Antonino Vallesi
- Department of Neuroscience, University of Padova, 35128, Padova, Italy
- Padova Neuroscience Center, University of Padova, 35129, Padova, Italy
- Beatrice de Gelder
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, 6200 MD, Maastricht, the Netherlands
- Carlo Semenza
- Department of Neuroscience, University of Padova, 35128, Padova, Italy
- Padova Neuroscience Center, University of Padova, 35129, Padova, Italy
6. Chen KW, Lee SC, Chou FHC, Chiang HY, Hsueh IP, Chen PH, Wang SP, Ju YJ, Hsieh CL. Development of a Rasch-calibrated emotion recognition video test for patients with schizophrenia. Arch Clin Neuropsychol 2023:acad098. [PMID: 38163920] [DOI: 10.1093/arclin/acad098]
Abstract
Patients with schizophrenia tend to have deficits in emotion recognition (ER) that affect their social function. However, commonly used ER measures appear to lack comprehensiveness, reliability, and validity, making it difficult to evaluate ER comprehensively. The purpose of this study was to develop the Computerized Emotion Recognition Video Test (CERVT) for evaluating ER ability in patients with schizophrenia. The study was divided into two phases. First, we selected candidate CERVT items (videos) covering 8 basic emotion domains from a published database. Second, we validated the selected items using Rasch analysis; 269 patients and 177 healthy adults were recruited to ensure that the participants had diverse abilities. After removal of 21 misfit items (infit or outfit mean square > 1.4) and adjustment of the item difficulties of the 26 items with severe differential item functioning, the remaining 217 items were finalized as the CERVT items. All CERVT items showed good model fit, with small eigenvalues (≤ 2) in the residual-based principal components analysis for each domain, supporting the unidimensionality of the items. The 8 domains of the CERVT had good to excellent reliability (average Rasch reliabilities = 0.84-0.93). The CERVT contains items for the 8 basic emotions with individualized scores. Moreover, it showed acceptable reliability and validity, and its scores were not affected by examinees' gender. Thus, the CERVT has the potential to provide a comprehensive, reliable, valid, and gender-unbiased assessment of ER for patients with schizophrenia.
Affiliation(s)
- Kuan-Wei Chen
- Department of Occupational Therapy, Kaohsiung Municipal Kai-Syuan Psychiatric Hospital, Kaohsiung, Taiwan
- Department of Occupational Therapy, Shu-Zen Junior College of Medicine and Management, Kaohsiung, Taiwan
- Shih-Chieh Lee
- School of Occupational Therapy, College of Medicine, National Taiwan University, Taipei, Taiwan
- Department of Psychiatry, National Taiwan University Hospital, Taipei, Taiwan
- Frank Huang-Chih Chou
- Superintendent Office, Kaohsiung Municipal Kai-Syuan Psychiatric Hospital, Kaohsiung, Taiwan
- Hsin-Yu Chiang
- Department of Occupational Therapy, College of Medicine, Fu Jen Catholic University, New Taipei City, Taiwan
- I-Ping Hsueh
- School of Occupational Therapy, College of Medicine, National Taiwan University, Taipei, Taiwan
- Department of Physical Medicine and Rehabilitation, National Taiwan University Hospital, Taipei, Taiwan
- Po-Hsi Chen
- Department of Educational Psychology and Counseling, Institute for Research Excellence in Learning Sciences, National Taiwan Normal University, Taipei, Taiwan
- San-Ping Wang
- Department of Occupational Therapy, Taoyuan Psychiatric Center, Ministry of Health and Welfare, Taoyuan, Taiwan
- Yu-Jeng Ju
- School of Occupational Therapy, College of Medicine, National Taiwan University, Taipei, Taiwan
- Ching-Lin Hsieh
- School of Occupational Therapy, College of Medicine, National Taiwan University, Taipei, Taiwan
- Department of Physical Medicine and Rehabilitation, National Taiwan University Hospital, Taipei, Taiwan
- Department of Occupational Therapy, College of Medical and Health Sciences, Asia University, Taichung, Taiwan
7. Orlando I, Ricci C, Griffanti L, Filippini N. Neural correlates of successful emotion recognition in healthy elderly: a multimodal imaging study. Soc Cogn Affect Neurosci 2023; 18:nsad058. [PMID: 37837299] [PMCID: PMC10612567] [DOI: 10.1093/scan/nsad058]
Abstract
The ageing process is associated with reduced emotion recognition (ER) performance. ER ability is an essential part of non-verbal communication, and it is crucial for proper social functioning. Here, using the Cambridge Centre for Ageing and Neuroscience cohort sample, we investigated when ER, measured using a facial emotion recognition test, begins to decrease consistently across the lifespan. Moreover, using structural and functional MRI data, we identified the neural correlates associated with ER maintenance in the age groups showing early signs of ER decline (N = 283; age range: 58-89 years). ER performance was positively correlated with greater volume in the superior parietal lobule, higher white matter integrity in the corpus callosum, and greater functional connectivity in the mid-cingulate area. Our results suggest that higher ER accuracy in older people is associated with preserved gray and white matter volumes in cognitive and interconnecting areas that subserve brain regions directly involved in emotional processing.
Affiliation(s)
- Isabella Orlando
- Department of Psychology, Salesian Pontifical University of Rome, Rome 00139, Italy
- Carlo Ricci
- Department of Psychology, Salesian Pontifical University of Rome, Rome 00139, Italy
- Department of Psychology, Walden Institute of Rome, Rome 00186, Italy
- Ludovica Griffanti
- Wellcome Centre for Integrative Neuroimaging, Oxford Centre for Human Brain Activity, Department of Psychiatry, University of Oxford, Oxford OX3 7JX, UK
- Wellcome Centre for Integrative Neuroimaging, Oxford Centre for Functional MRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford OX3 9DU, UK
8. Baglione H, Coulombe V, Martel-Sauvageau V, Monetta L. The impacts of aging on the comprehension of affective prosody: A systematic review. Appl Neuropsychol Adult 2023:1-16. [PMID: 37603689] [DOI: 10.1080/23279095.2023.2245940]
Abstract
Recent clinical reports have suggested a possible decline with aging in the ability to understand emotions in speech (affective prosody comprehension). The present study aimed to further examine the differences in affective prosody comprehension performance between older and younger adults. Following a recent cognitive model dividing affective prosody comprehension into perceptual and lexico-semantic components, a cognitive approach targeting these components was adopted. The influence of emotion valence and category on aging performance was also investigated. A systematic review of the literature was carried out using six databases. Twenty-one articles, presenting 25 experiments, were included. All experiments analyzed the affective prosody comprehension performance of older versus younger adults. The results confirmed that older adults' performance in identifying emotions in speech was reduced compared to that of younger adults. The results also indicated that affective prosody comprehension abilities may be modulated by emotion category but not by emotional valence. Various theories account for this difference in performance, namely auditory perception, brain aging, and socioemotional selectivity theory, which suggests that older people tend to neglect negative emotions. However, explanations of the deficits underlying the decline in affective prosody comprehension remain limited.
Affiliation(s)
- Héloïse Baglione
- Département de réadaptation, Université Laval, Québec City, Quebec, Canada
- Département de réadaptation, Centre interdisciplinaire de recherche en réadaptation et intégration sociale (CIRRIS), Québec City, Quebec, Canada
- Valérie Coulombe
- Département de réadaptation, Université Laval, Québec City, Quebec, Canada
- Département de réadaptation, Centre interdisciplinaire de recherche en réadaptation et intégration sociale (CIRRIS), Québec City, Quebec, Canada
- Vincent Martel-Sauvageau
- Département de réadaptation, Université Laval, Québec City, Quebec, Canada
- Département de réadaptation, Centre interdisciplinaire de recherche en réadaptation et intégration sociale (CIRRIS), Québec City, Quebec, Canada
- Laura Monetta
- Département de réadaptation, Université Laval, Québec City, Quebec, Canada
- Département de réadaptation, Centre interdisciplinaire de recherche en réadaptation et intégration sociale (CIRRIS), Québec City, Quebec, Canada
9. Long H, Peluso N, Baker CI, Japee S, Taubert J. A database of heterogeneous faces for studying naturalistic expressions. Sci Rep 2023; 13:5383. [PMID: 37012369] [PMCID: PMC10070342] [DOI: 10.1038/s41598-023-32659-5]
Abstract
Facial expressions are thought to be complex visual signals, critical for communication between social agents. Most prior work aimed at understanding how facial expressions are recognized has relied on stimulus databases featuring posed facial expressions, designed to represent putative emotional categories (such as 'happy' and 'angry'). Here we use an alternative selection strategy to develop the Wild Faces Database (WFD): a set of one thousand images capturing a diverse range of ambient facial behaviors from outside of the laboratory. We characterized the perceived emotional content in these images using a standard categorization task in which participants were asked to classify the apparent facial expression in each image. In addition, participants were asked to indicate the intensity and genuineness of each expression. While modal scores indicate that the WFD captures a range of different emotional expressions, in comparing the WFD to images taken from other, more conventional databases, we found that participants responded more variably and less specifically to the wild-type faces, perhaps indicating that natural expressions are more multiplexed than a categorical model would predict. We argue that this variability can be employed to explore latent dimensions in our mental representation of facial expressions. Further, images in the WFD were rated as less intense and more genuine than images taken from other databases, suggesting a greater degree of authenticity among WFD images. The strong positive correlation between intensity and genuineness scores demonstrates that even the high-arousal states captured in the WFD were perceived as authentic. Collectively, these findings highlight the potential utility of the WFD as a new resource for bridging the gap between the laboratory and the real world in studies of expression recognition.
Affiliation(s)
- Houqiu Long
- The School of Psychology, The University of Queensland, St Lucia, QLD, Australia
- Natalie Peluso
- The School of Psychology, The University of Queensland, St Lucia, QLD, Australia
- Chris I Baker
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
- Shruti Japee
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
- Jessica Taubert
- The School of Psychology, The University of Queensland, St Lucia, QLD, Australia; Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
10. Effects of aging on face processing: An ERP study of the own-age bias with neutral and emotional faces. Cortex 2023; 161:13-25. [DOI: 10.1016/j.cortex.2023.01.007]
Abstract
Older adults systematically show an enhanced N170 amplitude during the visualization of facial expressions of emotion. The present study aimed to replicate this finding and to further investigate whether this effect is specific to facial stimuli, present in other neural correlates of face processing, and modulated by own-age faces. To this purpose, younger (n = 25; Mage = 28.36), middle-aged (n = 23; Mage = 48.74), and older adults (n = 25; Mage = 67.36) performed two face/emotion identification tasks during an EEG recording. The results showed that the groups did not differ in P100 amplitude, but older adults had increased N170 amplitude for both facial and non-facial stimuli. The event-related potentials analysed were not modulated by an own-age bias, but older faces elicited a larger N170 in the Emotion Identification Task in all groups. This increased amplitude may reflect the higher ambiguity of older faces due to age-related changes in their physical features, which may require more neural resources to decode. Regarding the P250, older faces elicited smaller amplitudes than younger faces, which may reflect reduced processing of the emotional content of older faces. This interpretation is consistent with the lower accuracy obtained for this category of stimuli across groups. These results have important social implications and suggest that aging may hamper the neural processing of facial expressions of emotion, especially for own-age peers.
11. Gao Y, Chonpracha P, Li B, Prinyawiwatkul W. Effects of other people's facial emotional expression on consumers' perceptions of chocolate chip cookies containing cricket protein. J Food Sci 2023; 88:185-204. [PMID: 36658671] [DOI: 10.1111/1750-3841.16469]
Abstract
Edible insects are recognized as a potential alternative and sustainable source of high-quality protein for the human diet. However, entomophagy is strongly associated with negative emotions that may cause reluctance to adopt insects as food in Western countries. During human interaction, a person's facial emotional expression (FEE) may influence other people's emotional responses, and a person's emotional state may affect their food preferences and choices. Understanding how other people's FEEs affect consumers' emotional profiles, liking, and subsequent willingness to try (WTT) and purchase intent (PI) for insect-containing food products may help increase the acceptance of entomophagy. This study identified emotional responses toward chocolate chip cookies containing cricket protein using valence and arousal scales, explored the effects of other people's FEEs (positive, negative, and/or sensation seeking), and examined the correlation between consumers' emotional and overall liking (OL) responses for the cookies; PI for such cookies was also predicted. For consumers who perceived positive emotion from another person's FEE after watching a short video clip, emotional feeling was raised on both the valence and arousal dimensions, while a negative FEE stimulus had the opposite effect. OL scores and emotional intensities after watching the three FEE videos were strongly related to consumers' PI. Males rated the cricket-containing cookies higher than females on positive emotion intensity, OL, and PI. Among the three FEEs evaluated, the positive emotional stimulus would be most beneficial for increasing acceptance, WTT, and PI of insect-containing foods.
PRACTICAL APPLICATION: Edible insects are potentially alternative and sustainable sources of high-quality protein for the human diet. Entomophagy is strongly associated with negative emotions that cause reluctance to adopt insects as food in Western countries. Other people's facial emotional expressions (FEEs) may affect consumers' food-evoked emotional profiles, overall liking (OL), and purchase intent (PI). For consumers who perceived positive emotion from another person's FEE, emotional feeling was raised on both the valence and arousal dimensions, and OL scores and emotion intensities were strongly related to consumers' PI. Exploiting positive emotional stimuli, as demonstrated in this study, would be beneficial for increasing acceptance of insect-containing food.
Affiliation(s)
- Yupeng Gao
- School of Nutrition and Food Sciences, Louisiana State University Agricultural Center, Baton Rouge, Louisiana, USA
- Pitchayapat Chonpracha
- School of Nutrition and Food Sciences, Louisiana State University Agricultural Center, Baton Rouge, Louisiana, USA
- Bin Li
- Department of Experimental Statistics, Louisiana State University Agricultural Center, Baton Rouge, Louisiana, USA
- Witoon Prinyawiwatkul
- School of Nutrition and Food Sciences, Louisiana State University Agricultural Center, Baton Rouge, Louisiana, USA
12. Simonetti S, Davis C, Kim J. Older adults' emotion recognition: No auditory-visual benefit for less clear expressions. PLoS One 2022; 17:e0279822. [PMID: 36584136] [PMCID: PMC9803091] [DOI: 10.1371/journal.pone.0279822]
Abstract
The ability to recognise emotion from faces or voices appears to decline with advancing age. However, some studies have shown that emotion recognition of auditory-visual (AV) expressions is largely unaffected by age, i.e., older adults gain a larger benefit from AV presentation than younger adults, resulting in similar AV recognition levels. An issue with these studies is that they used well-recognised emotional expressions that are unlikely to generalise to real-life settings. To examine whether an AV emotion recognition benefit generalises across well and less well recognised stimuli, we conducted an emotion recognition study using expressions that had clear or unclear emotion information for both modalities, or clear visual but unclear auditory information. Older (n = 30) and younger (n = 30) participants were tested on stimuli of anger, happiness, sadness, surprise, and disgust (expressed in spoken sentences) in auditory-only (AO), visual-only (VO), or AV format. Participants responded by choosing one of five emotion options. Younger adults were more accurate in recognising emotions than older adults except for clear VO expressions. Younger adults showed an AV benefit even when unimodal recognition was poor. No such AV benefit was found for older adults; indeed, AV recognition was worse than VO recognition when AO recognition was poor. Analyses of confusion responses indicated that older adults generated more confusion responses that were common between AO and VO conditions than younger adults did. We propose that older adults' poorer AV performance may be due to a combination of weak auditory emotion recognition and response uncertainty, which resulted in a higher cognitive load.
Affiliation(s)
- Simone Simonetti
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Brain and Mind Centre, School of Psychology, University of Sydney, Sydney, Australia
- Chris Davis
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Jeesun Kim
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
13
Simonetti S, Davis C, Kim J. Older adults get masked emotion priming for happy but not angry faces: evidence for a positivity effect in early perceptual processing of emotional signals. Cogn Emot 2022; 36:1576-1593. [PMID: 36300438 DOI: 10.1080/02699931.2022.2138269] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 01/19/2023]
Abstract
In higher-level cognitive tasks, older adults show a bias towards positive emotion information and away from negative information compared to younger adults (a positivity effect). It is unclear whether this effect occurs in early perceptual processing. This issue is important for determining whether the positivity effect is due to automatic rather than controlled processing. We tested this with older and younger adults on a positive/negative face-emotion valence classification task using masked priming. Positive (happy) and negative (angry) face targets were preceded by masked repetition or valence primes with neutral face baselines. In Experiment 1, 30 younger and 30 older adults were tested with 50 ms primes. Younger adults showed repetition priming for both positive and negative targets. Older adults showed repetition priming for positive but not negative targets. Neither group showed valence priming. In Experiment 2, 30 older and 29 younger adults were tested with longer-duration primes. Younger adults showed repetition priming for both positive and negative emotions, and no valence priming. Older adults showed repetition and valence priming only for positive targets. We propose that older adults' lack of angry-face priming was due to an early attention-orienting strategy favouring happy expressions at the expense of angry ones.
Affiliation(s)
- Simone Simonetti
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Brain and Mind Centre, School of Psychology, University of Sydney, Sydney, Australia
- Chris Davis
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Jeesun Kim
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
14
Gourlay C, Collin P, D'Auteuil C, Jacques M, Caron PO, Scherzer PB. Age differences in social-cognitive abilities across the stages of adulthood and path model investigation of adult social cognition. Neuropsychology, Development, and Cognition. Section B, Aging, Neuropsychology and Cognition 2022; 29:1033-1067. [PMID: 34355998 DOI: 10.1080/13825585.2021.1962789] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Received: 10/03/2020] [Accepted: 07/27/2021] [Indexed: 06/13/2023]
Abstract
Accumulating evidence points toward an association between older age and performance decrements in social cognition (SC). We explored age-related variations in four components of SC: emotion recognition, theory of mind, social judgment, and blame attributions. A total of 120 adults divided into three age groups (18-34 years, 35-59 years, 60-85 years) completed a battery of SC tasks. Between- and within-group differences in SC were investigated, and path analyses were used to identify relationships among the components. Emotion recognition and theory of mind showed differences beginning either in midlife or after. Blame attributions and social judgment did not show significant between-group differences; however, social judgment varied significantly within groups. Path models revealed a relationship between emotion recognition and theory of mind. The findings highlight age-related differences in some components and a link between two components. Strategies promoting social functioning in aging might help to maintain or improve these abilities over time.
Affiliation(s)
- Catherine Gourlay
- Département De Psychologie, Université Du Québec À Montréal, Montréal, Québec, Canada
- Pascal Collin
- Département De Psychologie, Université Du Québec À Montréal, Montréal, Québec, Canada
- Camille D'Auteuil
- Département De Psychologie, Université Du Québec À Montréal, Montréal, Québec, Canada
- Marie Jacques
- Département De Psychologie, Université Du Québec À Montréal, Montréal, Québec, Canada
- Peter B Scherzer
- Département De Psychologie, Université Du Québec À Montréal, Montréal, Québec, Canada
15
Francisco HC, Bregola AG, Ottaviani AC, Luchesi BM, Orlandi FDS, Fraga FJ, Guarisco LPC, Pavarini SCI. The association between language and recognition of facial emotional expressions in elderly individuals. Codas 2022; 34:e20210052. [PMID: 35894306 PMCID: PMC9886300 DOI: 10.1590/2317-1782/20212021052pt] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 03/08/2021] [Accepted: 12/28/2021] [Indexed: 02/03/2023] Open
Abstract
PURPOSE To examine the association between good language performance and the recognition of facial emotional expressions in elderly individuals. METHODS Cross-sectional study of 118 elderly individuals from the primary health care services of a city in the state of São Paulo. Sociodemographic data were collected, and language performance was assessed with the language domain of the Addenbrooke's Cognitive Examination - Revised together with a Recognition of Facial Emotional Expressions task. The sample was divided into thirds according to language performance: T1 = best, T2 = average, T3 = worst. Groups T1 and T3 were compared on the recognition of facial expressions of anger, disgust, fear, happiness, sadness, and surprise at intensities of 40%, 60%, 80%, and 100%. The association of the independent variables with language performance was analyzed through logistic regression. The multivariate model was built from the results of the univariate analyses and included the continuous variables by emotion and by intensity. Age and schooling, which were associated with language performance in the univariate model, were included in the multivariate model to adjust the association analyses. RESULTS The sample was mainly female (84.7%), with a mean age of 70.5 years and 3.5 years of schooling. The variables associated with the best language performance in the comparison of T1 and T3 were surprise (OR = 1.485, 95% CI 1.194-1.846) and disgust (OR = 1.143, 95% CI 1.005-1.300). CONCLUSION Recognition of the facial emotional expressions of surprise and disgust emerged as important factors associated with good language performance.
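As a reminder of how odds ratios like those above relate to logistic-regression coefficients, here is a minimal sketch: an OR is the exponential of the fitted log-odds coefficient, and a Wald 95% CI exponentiates beta ± 1.96 SE. The beta and SE values below are hypothetical, chosen only so the output lands near the paper's reported OR for surprise; they are not taken from the study.

```python
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Odds ratio and Wald 95% CI from a logistic-regression
    coefficient (log-odds scale) and its standard error."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical illustration: beta = 0.396 corresponds to an
# odds ratio of about 1.485 per one-unit increase in the predictor.
or_, lo, hi = odds_ratio_ci(0.396, 0.111)
```

With these illustrative inputs the result is approximately OR = 1.49 (95% CI 1.20-1.85), close to the values the abstract reports.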
Affiliation(s)
- Helen Capeleto Francisco
- Programa de Pós-graduação em Enfermagem, Universidade Federal de São Carlos – UFSCar - São Carlos (SP), Brasil.
- Allan Gustavo Bregola
- School of Health Sciences, University of East Anglia – UEA - Norwich, Norfolk, United Kingdom.
- Ana Carolina Ottaviani
- Programa de Pós-graduação em Enfermagem, Universidade Federal de São Carlos – UFSCar - São Carlos (SP), Brasil.
- Bruna Moretti Luchesi
- Programa de Pós-graduação em Enfermagem, Universidade Federal de Mato Grosso do Sul – UFMS - Campus de Três Lagoas - Três Lagoas (MS), Brasil.
- Fabiana de Souza Orlandi
- Programa de Pós-graduação em Enfermagem, Universidade Federal de São Carlos – UFSCar - São Carlos (SP), Brasil.
- Programa de Pós-graduação em Gerontologia, Universidade Federal de São Carlos – UFSCar - São Carlos (SP), Brasil.
- Francisco José Fraga
- Centro de Engenharia, Modelagem e Ciências Sociais Aplicadas – CECS, Universidade Federal do ABC – UFABC - Santo André (SP), Brasil.
- Letícia Pimenta Costa Guarisco
- Programa de Pós-graduação em Enfermagem, Universidade Federal de São Carlos – UFSCar - São Carlos (SP), Brasil.
- Programa de Pós-graduação em Gerontologia, Universidade Federal de São Carlos – UFSCar - São Carlos (SP), Brasil.
- Sofia Cristina Iost Pavarini
- Programa de Pós-graduação em Enfermagem, Universidade Federal de São Carlos – UFSCar - São Carlos (SP), Brasil.
- Programa de Pós-graduação em Gerontologia, Universidade Federal de São Carlos – UFSCar - São Carlos (SP), Brasil.
16
Henke L, Guseva M, Wagemans K, Pischedda D, Haynes JD, Jahn G, Anders S. Surgical face masks do not impair the decoding of facial expressions of negative affect more severely in older than in younger adults. Cogn Res Princ Implic 2022; 7:63. [PMID: 35841438 PMCID: PMC9287709 DOI: 10.1186/s41235-022-00403-8] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Received: 10/27/2021] [Accepted: 05/30/2022] [Indexed: 12/11/2022] Open
Abstract
Surgical face masks reduce the spread of airborne pathogens but also disturb the flow of information between individuals. The risk of getting seriously ill after infection with SARS-COV-2 during the present COVID-19 pandemic amplifies with age, suggesting that face masks should be worn especially during face-to-face contact with and between older people. However, the ability to accurately perceive and understand communication signals decreases with age, and it is currently unknown whether face masks impair facial communication more severely in older people. We compared the impact of surgical face masks on dynamic facial emotion recognition in younger (18–30 years) and older (65–85 years) adults (N = 96) in an online study. Participants watched short video clips of young women who facially expressed anger, fear, contempt or sadness. Faces of half of the women were covered by a digitally added surgical face mask. As expected, emotion recognition accuracy declined with age, and face masks reduced emotion recognition accuracy in both younger and older participants. Unexpectedly, the effect of face masks did not differ between age groups. Further analyses showed that masks also reduced the participants’ overall confidence in their emotion judgements, but not their performance awareness (the difference between their confidence ratings for correct and incorrect responses). Again, there were no mask-by-age interactions. Finally, data obtained with a newly developed questionnaire (attitudes towards face masks, atom) suggest that younger and older people do not differ in how much they feel impaired in their understanding of other people’s emotions by face masks or how useful they find face masks in confining the COVID-19 pandemic. In sum, these findings do not provide evidence that the impact of face masks on the decoding of facial signals is disproportionally larger in older people.
Affiliation(s)
- Lea Henke
- Department of Psychology, Universität zu Lübeck, Lübeck, Germany
- Maja Guseva
- Bernstein Center for Computational Neuroscience, Charité - Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany
- Department of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
- Katja Wagemans
- Department of Neurology, Universität zu Lübeck, Ratzeburger Allee 160, Lübeck, Germany
- Doris Pischedda
- Bernstein Center for Computational Neuroscience, Charité - Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Science of Intelligence, Research Cluster of Excellence, Technische Universität Berlin, Berlin, Germany
- John-Dylan Haynes
- Bernstein Center for Computational Neuroscience, Charité - Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Science of Intelligence, Research Cluster of Excellence, Technische Universität Berlin, Berlin, Germany
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany
- Georg Jahn
- Department of Psychology, Chemnitz University of Technology, Chemnitz, Germany
- Silke Anders
- Department of Neurology, Universität zu Lübeck, Ratzeburger Allee 160, Lübeck, Germany
- Center of Brain, Behavior and Metabolism (CBBM), Universität zu Lübeck, Lübeck, Germany
- Department of Psychology, Universität zu Lübeck, Lübeck, Germany
17
Calić G, Glumbić N, Petrović-Lazić M, Đorđević M, Mentus T. Searching for Best Predictors of Paralinguistic Comprehension and Production of Emotions in Communication in Adults With Moderate Intellectual Disability. Front Psychol 2022; 13:884242. [PMID: 35880187 PMCID: PMC9308010 DOI: 10.3389/fpsyg.2022.884242] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 02/25/2022] [Accepted: 06/06/2022] [Indexed: 11/13/2022] Open
Abstract
Paralinguistic comprehension and production of emotions in communication comprise the skills of recognizing and interpreting emotional states with the help of facial expressions, prosody, and intonation. In the relevant scientific literature, these skills are related primarily to receptive language abilities, although some authors have also found correlations with intellectual abilities and acoustic features of the voice. The aim of this study was therefore to investigate which of these variables (receptive language ability, acoustic features of the voice, intellectual ability, socio-demographic variables) best predicts paralinguistic comprehension and paralinguistic production of emotions in communication in adults with moderate intellectual disability (MID). The sample included 41 adults with MID, 20–49 years of age (M = 34.34, SD = 7.809), 29 of whom had MID of unknown etiology, while 12 had Down syndrome. All participants were native speakers of Serbian. Two subscales from the Assessment Battery for Communication, Paralinguistic comprehension of emotions in communication and Paralinguistic production of emotions in communication, were used to assess paralinguistic comprehension and production skills. To characterize participants on the assumed predictor variables, the following instruments were used: the Peabody Picture Vocabulary Test to assess receptive language abilities, the Computerized Speech Lab ("Kay Elemetrics" Corp., model 4300) to assess acoustic features of the voice, and Raven's Progressive Matrices to assess intellectual ability. Hierarchical regression analysis was applied to investigate to what extent the proposed variables actually predict paralinguistic comprehension and production of emotions in communication as dependent variables.
The results of this analysis showed that only receptive language skills had statistically significant predictive value for paralinguistic comprehension of emotions (β = 0.468, t = 2.236, p < 0.05), while the factor related to voice frequency and interruptions, from the domain of acoustic voice characteristics, displayed predictive value for paralinguistic production of emotions (β = 0.280, t = 2.076, p < 0.05). Consequently, this study evidenced, in the adult population with MID, a greater importance of voice and language relative to intellectual abilities in understanding and producing emotions.
18
Hamilton LJ, Gourley AN, Krendl AC. They Cannot, They Will Not, or We Are Asking the Wrong Questions: Re-examining Age-Related Decline in Social Cognition. Front Psychol 2022; 13:894522. [PMID: 35645861 PMCID: PMC9131941 DOI: 10.3389/fpsyg.2022.894522] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5] [Received: 03/11/2022] [Accepted: 04/20/2022] [Indexed: 11/13/2022] Open
Abstract
Social cognition is critical for successfully navigating social relationships. Current evidence suggests that older adults exhibit poorer performance in several core social-cognitive domains compared to younger adults. Neurocognitive decline is commonly discussed as one of the key arbiters of age-related decline in social-cognitive abilities. While evidence supports this notion, age effects are likely attributable to multiple factors. This paper aims to recontextualize past evidence by focusing on issues of motivation, task design, and representative sampling. In light of these issues, we identify directions for future research to aid our understanding of social-cognitive aging.
Affiliation(s)
- Lucas J Hamilton
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, United States
- Amy N Gourley
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, United States
- Anne C Krendl
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, United States
19
Low ACY, Oh VYS, Tong EMW, Scarf D, Ruffman T. Older adults have difficulty decoding emotions from the eyes, whereas easterners have difficulty decoding emotion from the mouth. Sci Rep 2022; 12:7408. [PMID: 35524152 PMCID: PMC9076610 DOI: 10.1038/s41598-022-11381-8] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 12/31/2021] [Accepted: 04/19/2022] [Indexed: 12/05/2022] Open
Abstract
Older adults and Easterners have worse emotion recognition (than young adults and Westerners, respectively), but the question of why remains unanswered. Older adults look less at the eyes, whereas Easterners look less at the mouth, raising the possibility that compelling older adults to look at the eyes, and Easterners to look at the mouth, might improve recognition. We tested this by comparing emotion recognition in 108 young adults and 109 older adults from New Zealand and Singapore using (a) the eyes on their own, (b) the mouth on its own, or (c) the full face. Older adults were worse than young adults on 4/6 emotions with the Eyes Only stimuli, but only 1/6 emotions with the Mouth Only stimuli. In contrast, Easterners were worse than Westerners on 6/6 emotions for Mouth Only and Full Face stimuli, but were equal on all six emotions for Eyes Only stimuli. These results provide a substantial leap forward because they point to the precise difficulty for older adults and Easterners: older adults have more consistent difficulty identifying individual emotions from the eyes than from the mouth, likely due to declining brain functioning, whereas Easterners have more consistent difficulty identifying emotions from the mouth than from the eyes, likely due to inexperience inferring mouth information.
Affiliation(s)
- Anna C Y Low
- Department of Psychology, University of Otago, P.O. Box 56, Dunedin, 9054, New Zealand
- Vincent Y S Oh
- Department of Psychology, National University of Singapore, Block AS4, Level 2, 9 Arts Link, Singapore, 117570, Singapore
- Eddie M W Tong
- Department of Psychology, National University of Singapore, Block AS4, Level 2, 9 Arts Link, Singapore, 117570, Singapore
- Damian Scarf
- Department of Psychology, University of Otago, P.O. Box 56, Dunedin, 9054, New Zealand
- Ted Ruffman
- Department of Psychology, University of Otago, P.O. Box 56, Dunedin, 9054, New Zealand
20
Lee SC, Lin GH, Shih CL, Chen KW, Liu CC, Kuo CJ, Hsieh CL. Error patterns of facial emotion recognition in patients with schizophrenia. J Affect Disord 2022; 300:441-448. [PMID: 34979185 DOI: 10.1016/j.jad.2021.12.130] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5] [Received: 07/06/2021] [Revised: 10/28/2021] [Accepted: 12/30/2021] [Indexed: 11/30/2022]
Abstract
Error patterns of facial emotion recognition (FER) indicate how individuals misinterpret others' facial expressions, which helps clinicians to manage related deficits. However, previous investigations are limited and may have been biased due to methodological issues (e.g., no consideration of response bias). This study aimed to propose a detectability index (d') for adjusting response bias and to examine the error patterns of FER in patients with schizophrenia. Responses to 168 photos showing seven basic emotions, obtained from 351 patients with schizophrenia and 101 healthy adults, were extracted from a previous study. The differences in the d's between the two groups (Δd') were calculated to examine the error patterns of FER among the seven emotions. The findings generally overlapped with those identified by the traditional confusion matrix. Four error patterns were found. First, the patients were insensitive to some negative emotions (i.e., sadness [Δd' = 0.83] and fear [Δd' = 0.72]). Second, they misrecognized happy faces as showing negative emotions (e.g., disgust [Δd' = 0.43] and sadness [Δd' = 0.37]). Third, they misinterpreted surprised faces as all the other emotions (Δd' = 0.41-0.87), except neutral. Fourth, they confused some negative emotions (e.g., misrecognizing fear as anger [Δd' = 0.87]). Our findings suggest that patients with schizophrenia show four error patterns of FER compared to healthy adults. Accordingly, interventions could be selected to improve their sensitivity to faces with negative emotions, differentiation of faces among positive and negative emotions, understanding of surprised faces, and discrimination of faces with negative emotions.
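The detectability index d' described above is the standard signal-detection sensitivity measure: the z-transformed hit rate minus the z-transformed false-alarm rate, which separates sensitivity from response bias. A minimal sketch of how such an index and a group difference Δd' could be computed; the recognition rates below are hypothetical, not the study's data, and the clipping convention is one common choice rather than necessarily the paper's.

```python
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Sensitivity index d' = z(H) - z(F). Rates are clipped away
    from 0 and 1 so the inverse normal CDF stays finite."""
    clip = lambda p: min(max(p, 0.005), 0.995)
    z = NormalDist().inv_cdf  # standard-normal quantile function
    return z(clip(hit_rate)) - z(clip(false_alarm_rate))

# Hypothetical recognition rates for one emotion category:
d_controls = d_prime(0.90, 0.10)  # healthy comparison group
d_patients = d_prime(0.70, 0.15)  # patient group
delta_d = d_controls - d_patients  # a group difference of the paper's delta-d' kind
```

A d' of 0 means hits and false alarms are equally likely (no sensitivity); larger values mean better discrimination regardless of how liberally or conservatively a participant responds.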
Affiliation(s)
- Shih-Chieh Lee
- Department of Occupational Therapy, College of Medicine, National Cheng Kung University, Tainan, Taiwan; School of Occupational Therapy, College of Medicine, National Taiwan University, Taipei, Taiwan; Institute of Long-Term Care, MacKay Medical College, New Taipei City, Taiwan
- Gong-Hong Lin
- Master Program in Long-term Care, College of Nursing, Taipei Medical University, Taipei, Taiwan
- Ching-Lin Shih
- Institute of Education & Center for Teacher Education, National Sun Yat-Sen University, Kaohsiung, Taiwan
- Kuan-Wei Chen
- Department of Occupational Therapy, Kaohsiung Municipal Kai-Syuan Psychiatric Hospital, Kaohsiung, Taiwan
- Chen-Chung Liu
- Department of Psychiatry, National Taiwan University Hospital, Taipei, Taiwan; Department of Psychiatry, College of Medicine, National Taiwan University, Taipei, Taiwan
- Chian-Jue Kuo
- Songde Branch (Taipei City Psychiatric Center), Taipei City Hospital, Taipei, Taiwan; Department of Psychiatry, School of Medicine, College of Medicine, Taipei Medical University, Taipei, Taiwan; Psychiatric Research Center, Taipei Medical University Hospital, Taipei, Taiwan
- Ching-Lin Hsieh
- School of Occupational Therapy, College of Medicine, National Taiwan University, Taipei, Taiwan; Department of Physical Medicine and Rehabilitation, National Taiwan University Hospital, Taipei, Taiwan; Department of Occupational Therapy, College of Medical and Health Science, Asia University, Taichung, Taiwan
21
Wang F, Zhou A, Wei C, Zuo X, Ma X, Zhao L, Jin H, Li Y, Guo D, Jia J. Good Performance of the Chinese Version of Mini Social Cognition and Emotional Assessment in the Early Diagnosis of Behavioral Variant Frontotemporal Dementia. Front Neurol 2022; 13:827945. [PMID: 35250831 PMCID: PMC8891700 DOI: 10.3389/fneur.2022.827945] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 12/02/2021] [Accepted: 01/27/2022] [Indexed: 11/13/2022] Open
Abstract
Social cognition impairment has been recognized as an early and characteristic change in behavioral variant frontotemporal dementia (bvFTD). The Mini Social Cognition and Emotional Assessment (mini-SEA) is a clinical tool to rapidly evaluate social cognition. In this study, we explored the diagnostic value of social cognition by assessing the Chinese version of the mini-SEA and other standard neuropsychological tests in 22 patients with mild bvFTD, 26 patients with mild Alzheimer's disease (AD), including mild cognitive impairment (MCI) and mild dementia, and 30 control subjects. The discriminatory powers of these tests were evaluated and compared using the receiver operating characteristic curve (ROC). The mini-SEA scores of the bvFTD patients were significantly lower than those of the controls (Z = –6.850, adjusted P < 0.001) and AD patients (Z = –3.737, adjusted P = 0.001). ROC analysis showed that the mini-SEA had a high discriminatory power for differentiating bvFTD from the controls, with an area under the curve (AUC) value of 0.989 (95% CI = 0.905-1.000, P < 0.001). The AUC value of the mini-SEA for differentiating bvFTD from AD was 0.899 (95% CI = 0.777-0.967, P < 0.001), higher than that of the Auditory Verbal Learning Test Delayed Recall (AUC = 0.793), Boston Naming Test (AUC = 0.685) or Frontal Assessment Battery (AUC = 0.691). The Chinese version of mini-SEA is a good clinical tool for the early diagnosis of bvFTD, and has a high sensitivity and specificity to discriminate bvFTD from AD.
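The AUC values reported above have a simple rank interpretation: the probability that a randomly drawn member of one group scores higher on the test than a randomly drawn member of the other (equivalently, the Mann-Whitney U statistic divided by the number of cross-group pairs). A small sketch of that empirical computation, using made-up score lists rather than the study's data:

```python
def auc(higher_group, lower_group):
    """Empirical AUC: fraction of cross-group pairs in which a member
    of higher_group outscores a member of lower_group (ties count
    half). Equal to the Mann-Whitney U statistic divided by m * n."""
    pairs = [(a, b) for a in higher_group for b in lower_group]
    wins = sum(1.0 if a > b else 0.5 if a == b else 0.0 for a, b in pairs)
    return wins / len(pairs)

# Hypothetical mini-SEA-like scores: controls tend to score higher.
controls = [26, 27, 28, 24, 29]
patients = [15, 18, 22, 25, 17]
print(auc(controls, patients))  # prints 0.96
```

An AUC of 0.5 means the test separates the groups no better than chance, while 1.0 means every control outscores every patient; the 0.989 reported above is close to that perfect-separation ceiling.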
Affiliation(s)
- Fen Wang
- Department of Neurology, Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, Beijing, China
- Correspondence: Fen Wang
- Aihong Zhou
- Department of Neurology, Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, Beijing, China
- Cuibai Wei
- Department of Neurology, Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, Beijing, China
- Xiumei Zuo
- Department of Neurology, Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, Beijing, China
- Xiaowei Ma
- Department of Neurology, The First Hospital of Hebei Medical University, Shijiazhuang, China
- Lina Zhao
- Department of Neurology, Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, Beijing, China
- Hongmei Jin
- Department of Neurology, Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, Beijing, China
- Yan Li
- Department of Neurology, Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, Beijing, China
- Dongmei Guo
- Department of Neurology, Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, Beijing, China
- Jianping Jia
- Department of Neurology, Innovation Center for Neurological Disorders, Xuanwu Hospital, Capital Medical University, Beijing, China
22
Age and gender effects on the human’s ability to decode posed and naturalistic emotional faces. Pattern Anal Appl 2022. [DOI: 10.1007/s10044-021-01049-w] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 10/19/2022]
23
Sreevidya P, Veni S, Ramana Murthy OV. Elder emotion classification through multimodal fusion of intermediate layers and cross-modal transfer learning. Signal, Image and Video Processing 2022; 16:1281-1288. [PMID: 35069919 PMCID: PMC8763433 DOI: 10.1007/s11760-021-02079-x] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Received: 03/13/2021] [Revised: 10/16/2021] [Accepted: 11/01/2021] [Indexed: 06/14/2023]
Abstract
The objective of this work is to develop an automated emotion recognition system specifically targeted at elderly people. A multimodal system is developed that integrates information from the audio and video modalities. The database selected for the experiments is ElderReact, which contains 1323 video clips, 3 to 8 s in duration, of people above the age of 50. All six available emotions (disgust, anger, fear, happiness, sadness, and surprise) are considered. Several modeling techniques are attempted: features are extracted and neural network models are trained with backpropagation, and for the raw video model, transfer learning from pretrained networks is applied. Convolutional neural network and long short-term memory (LSTM)-based models are used so that temporal continuity between frames is maintained while capturing the emotions. For the audio model, cross-modal transfer learning is applied. The two models are combined by fusing intermediate layers, with the layers selected through a grid-based search algorithm. The accuracy and F1-score show that the proposed approach outperforms state-of-the-art results; classification across all emotions improves over the baseline by a minimum relative gain of 6.5% for happiness up to a maximum of 46% for sadness.
Collapse
Affiliation(s)
- P. Sreevidya, Department of Electronics and Communication Engineering, Amrita School of Engineering, Amrita Vishwa Vidyapeetham, Coimbatore, India
- S. Veni, Department of Electronics and Communication Engineering, Amrita School of Engineering, Amrita Vishwa Vidyapeetham, Coimbatore, India
- O. V. Ramana Murthy, Department of Electrical and Electronics Engineering, Amrita School of Engineering, Amrita Vishwa Vidyapeetham, Coimbatore, India
24
The age-related positivity effect in cognition: A review of key findings across different cognitive domains. PSYCHOLOGY OF LEARNING AND MOTIVATION 2022. [DOI: 10.1016/bs.plm.2022.08.004] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/19/2022]
25
Francisco HC, Bregola AG, Ottaviani AC, Luchesi BM, Orlandi FDS, Fraga FJ, Costa-Guarisco LP, Pavarini SCI. The association between language and recognition of facial emotional expressions in elderly individuals. Codas 2022. [DOI: 10.1590/2317-1782/20212021052en] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Purpose: To examine the association between good language performance and the recognition of facial emotional expressions in elderly individuals. Methods: Cross-sectional study of 118 elderly users of primary health care services in a city in the state of São Paulo. Sociodemographic data were collected, and language performance was assessed with the language domain of the Addenbrooke's Cognitive Examination – Revised along with a facial emotion recognition task. The sample was divided into tertiles by language performance (T1 = best, T2 = intermediate, T3 = worst). Groups T1 and T3 were compared on recognition of facial expressions of anger, disgust, fear, happiness, sadness, and surprise at intensities of 40%, 60%, 80%, and 100%. The association of the independent variables with language performance was analyzed by logistic regression; the multivariate model was built from the univariate results and included the continuous variables by emotion and by intensity, with age and schooling (associated with language performance in the univariate model) included to adjust the analyses. Results: The sample was mostly female (84.7%), with a mean age of 70.5 years and 3.5 years of schooling. In the comparison of T1 and T3, the variables associated with the best language performance were surprise (OR = 1.485, 95% CI 1.194–1.846) and disgust (OR = 1.143, 95% CI 1.005–1.300). Conclusion: Recognition of the facial expressions of surprise and disgust emerged as important factors associated with good language performance.
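The odds ratios this abstract reports are the standard exponentiated logistic-regression coefficients. As a generic illustration (beta is back-derived from the reported OR for surprise, and the standard error is a hypothetical value chosen to roughly reproduce the reported interval, not taken from the paper):

```python
import math

# An odds ratio is exp(beta), where beta is the logistic-regression
# coefficient. Back-derive beta from the reported OR for surprise.
or_surprise = 1.485
beta = math.log(or_surprise)

# 95% CI on the OR scale: exp(beta +/- 1.96 * SE). The SE below is a
# hypothetical value chosen so that the interval approximately matches
# the reported (1.194, 1.846); it is not the paper's actual output.
se = 0.111
ci_low = math.exp(beta - 1.96 * se)
ci_high = math.exp(beta + 1.96 * se)
```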
26
OUP accepted manuscript. Arch Clin Neuropsychol 2022; 37:1653-1661. [DOI: 10.1093/arclin/acac027] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 04/14/2022] [Indexed: 11/14/2022] Open
27
de Boer MJ, Jürgens T, Başkent D, Cornelissen FW. Auditory and Visual Integration for Emotion Recognition and Compensation for Degraded Signals are Preserved With Age. Trends Hear 2021; 25:23312165211045306. [PMID: 34617829 PMCID: PMC8642111 DOI: 10.1177/23312165211045306] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022] Open
Abstract
Since emotion recognition involves integration of the visual and auditory signals, it is likely that sensory impairments worsen emotion recognition. In emotion recognition, young adults can compensate for unimodal sensory degradations if the other modality is intact. However, most sensory impairments occur in the elderly population and it is unknown whether older adults are similarly capable of compensating for signal degradations. As a step towards studying potential effects of real sensory impairments, this study examined how degraded signals affect emotion recognition in older adults with normal hearing and vision. The degradations were designed to approximate some aspects of sensory impairments. Besides emotion recognition accuracy, we recorded eye movements to capture perceptual strategies for emotion recognition. Overall, older adults were as good as younger adults at integrating auditory and visual information and at compensating for degraded signals. However, accuracy was lower overall for older adults, indicating that aging leads to a general decrease in emotion recognition. In addition to decreased accuracy, older adults showed smaller adaptations of perceptual strategies in response to video degradations. Concluding, this study showed that emotion recognition declines with age, but that integration and compensation abilities are retained. In addition, we speculate that the reduced ability of older adults to adapt their perceptual strategies may be related to the increased time it takes them to direct their attention to scene aspects that are relatively far away from fixation.
Affiliation(s)
- Minke J de Boer, Research School of Behavioural and Cognitive Neuroscience, University of Groningen; Department of Otorhinolaryngology, University Medical Center Groningen; Laboratory of Experimental Ophthalmology, University Medical Center Groningen, Groningen, the Netherlands
- Tim Jürgens, Institute of Acoustics, Technische Hochschule Lübeck, Lübeck, Germany
- Deniz Başkent, Research School of Behavioural and Cognitive Neuroscience, University of Groningen; Department of Otorhinolaryngology, University Medical Center Groningen, Groningen, the Netherlands
- Frans W Cornelissen, Research School of Behavioural and Cognitive Neuroscience, University of Groningen; Laboratory of Experimental Ophthalmology, University Medical Center Groningen, Groningen, the Netherlands
28
Sivasathiaseelan H, Marshall CR, Benhamou E, van Leeuwen JEP, Bond RL, Russell LL, Greaves C, Moore KM, Hardy CJD, Frost C, Rohrer JD, Scott SK, Warren JD. Laughter as a paradigm of socio-emotional signal processing in dementia. Cortex 2021; 142:186-203. [PMID: 34273798 PMCID: PMC8438290 DOI: 10.1016/j.cortex.2021.05.020] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/26/2020] [Revised: 04/01/2021] [Accepted: 05/21/2021] [Indexed: 11/03/2022]
Abstract
Laughter is a fundamental communicative signal in our relations with other people and is used to convey a diverse repertoire of social and emotional information. It is therefore potentially a useful probe of impaired socio-emotional signal processing in neurodegenerative diseases. Here we investigated the cognitive and affective processing of laughter in forty-seven patients representing all major syndromes of frontotemporal dementia, a disease spectrum characterised by severe socio-emotional dysfunction (twenty-two with behavioural variant frontotemporal dementia, twelve with semantic variant primary progressive aphasia, thirteen with nonfluent-agrammatic variant primary progressive aphasia), in relation to fifteen patients with typical amnestic Alzheimer's disease and twenty healthy age-matched individuals. We assessed cognitive labelling (identification) and valence rating (affective evaluation) of samples of spontaneous (mirthful and hostile) and volitional (posed) laughter versus two auditory control conditions (a synthetic laughter-like stimulus and spoken numbers). Neuroanatomical associations of laughter processing were assessed using voxel-based morphometry of patients' brain MR images. While all dementia syndromes were associated with impaired identification of laughter subtypes relative to healthy controls, this was significantly more severe overall in frontotemporal dementia than in Alzheimer's disease and particularly in the behavioural and semantic variants, which also showed abnormal affective evaluation of laughter. Over the patient cohort, laughter identification accuracy was correlated with measures of daily-life socio-emotional functioning. 
Certain striking syndromic signatures emerged, including enhanced liking for hostile laughter in behavioural variant frontotemporal dementia, impaired processing of synthetic laughter in the nonfluent-agrammatic variant (consistent with a generic complex auditory perceptual deficit) and enhanced liking for numbers ('numerophilia') in the semantic variant. Across the patient cohort, overall laughter identification accuracy correlated with regional grey matter in a core network encompassing inferior frontal and cingulo-insular cortices; and more specific correlates of laughter identification accuracy were delineated in cortical regions mediating affective disambiguation (identification of hostile and posed laughter in orbitofrontal cortex) and authenticity (social intent) decoding (identification of mirthful and posed laughter in anteromedial prefrontal cortex) (all p < .05 after correction for multiple voxel-wise comparisons over the whole brain). These findings reveal a rich diversity of cognitive and affective laughter phenotypes in canonical dementia syndromes and suggest that laughter is an informative probe of neural mechanisms underpinning socio-emotional dysfunction in neurodegenerative disease.
Affiliation(s)
- Harri Sivasathiaseelan, Dementia Research Centre, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom
- Charles R Marshall, Dementia Research Centre, UCL Queen Square Institute of Neurology, University College London; Preventive Neurology Unit, Wolfson Institute of Preventive Medicine, Queen Mary University of London, London, United Kingdom
- Elia Benhamou, Dementia Research Centre, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom
- Janneke E P van Leeuwen, Dementia Research Centre, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom
- Rebecca L Bond, Dementia Research Centre, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom
- Lucy L Russell, Dementia Research Centre, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom
- Caroline Greaves, Dementia Research Centre, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom
- Katrina M Moore, Dementia Research Centre, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom
- Chris J D Hardy, Dementia Research Centre, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom
- Chris Frost, Dementia Research Centre, UCL Queen Square Institute of Neurology, University College London; Department of Medical Statistics, Faculty of Epidemiology and Population Health, London School of Hygiene and Tropical Medicine, London, United Kingdom
- Jonathan D Rohrer, Dementia Research Centre, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom
- Sophie K Scott, Institute of Cognitive Neuroscience, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom
- Jason D Warren, Dementia Research Centre, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom
29
Hoemann K, Vicaria IM, Gendron M, Stanley JT. Introducing a Face Sort Paradigm to Evaluate Age Differences in Emotion Perception. J Gerontol B Psychol Sci Soc Sci 2021; 76:1272-1281. [PMID: 32211791 DOI: 10.1093/geronb/gbaa038] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/21/2019] [Indexed: 11/14/2022] Open
Abstract
OBJECTIVES Previous research has uncovered age-related differences in emotion perception. To date, studies have relied heavily on forced-choice methods that stipulate possible responses. These constrained methods limit discovery of variation in emotion perception, which may be due to subtle differences in underlying concepts for emotion. METHOD We employed a face sort paradigm in which young (N = 42) and older adult (N = 43) participants were given 120 photographs portraying six target emotions (anger, disgust, fear, happiness, sadness, and neutral) and were instructed to create and label piles, such that individuals in each pile were feeling the same way. RESULTS There were no age differences in number of piles created, nor in how well labels mapped onto the target emotion categories. However, older adults demonstrated lower consistency in sorting, such that fewer photographs in a given pile belonged to the same target emotion category. At the same time, older adults labeled piles using emotion words that were acquired later in development, and thus are considered more semantically complex. DISCUSSION These findings partially support the hypothesis that older adults' concepts for emotions and emotional expressions are more complex than those of young adults, demonstrate the utility of incorporating less constrained experimental methods into the investigation of age-related differences in emotion perception, and are consistent with existing evidence of increased cognitive and emotional complexity in adulthood.
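The "consistency in sorting" outcome described above can be operationalized as pile purity: the share of photographs in each pile that carry the pile's most common target-emotion label. A hypothetical sketch (the example sorts are invented, not the study's data):

```python
from collections import Counter

def pile_consistency(piles):
    """Mean pile purity: for each pile, the fraction of photographs that
    share the pile's most common target-emotion label, averaged over piles."""
    purities = []
    for pile in piles:
        counts = Counter(pile)
        purities.append(counts.most_common(1)[0][1] / len(pile))
    return sum(purities) / len(purities)

# Invented example sorts: each inner list holds the target-emotion
# labels of the photographs one participant grouped into a pile.
young_sort = [["anger"] * 5, ["fear"] * 4 + ["sadness"]]
older_sort = [["anger"] * 3 + ["disgust"] * 2, ["fear"] * 2 + ["sadness"] * 3]
```

On these invented sorts, the second participant mixes more categories per pile and so scores lower, mirroring the age pattern the abstract reports.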
Affiliation(s)
- Katie Hoemann, Department of Psychology, Northeastern University, Boston, Massachusetts
- Ishabel M Vicaria, Department of Psychology, Northeastern University, Boston, Massachusetts
- Maria Gendron, Department of Psychology, Yale University, New Haven, Connecticut
30
Changes in Computer-Analyzed Facial Expressions with Age. SENSORS 2021; 21:s21144858. [PMID: 34300600 PMCID: PMC8309819 DOI: 10.3390/s21144858] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 04/17/2021] [Revised: 07/13/2021] [Accepted: 07/15/2021] [Indexed: 11/17/2022]
Abstract
Facial expressions are well known to change with age, but the quantitative properties of facial aging remain unclear. In the present study, we investigated the differences in the intensity of facial expressions between older (n = 56) and younger adults (n = 113). In laboratory experiments, the posed facial expressions of the participants were obtained based on six basic emotions and neutral facial expression stimuli, and the intensities of their faces were analyzed using a computer vision tool, OpenFace software. Our results showed that the older adults expressed strong expressions for some negative emotions and neutral faces. Furthermore, when making facial expressions, older adults used more face muscles than younger adults across the emotions. These results may help to understand the characteristics of facial expressions in aging and can provide empirical evidence for other fields regarding facial recognition.
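Group comparisons of computer-analyzed expression intensity of the kind this entry reports typically reduce to comparing per-group mean action-unit (AU) intensities. A toy sketch with invented numbers (OpenFace itself outputs AU intensities per frame, which would first be averaged per participant):

```python
import numpy as np

# Hypothetical per-participant mean action-unit (AU) intensities of the
# kind OpenFace reports (rows = participants, columns = AUs, 0-5 scale).
older = np.array([[2.1, 1.8, 0.9],
                  [2.4, 1.6, 1.1]])
young = np.array([[1.5, 1.7, 0.4],
                  [1.3, 1.9, 0.6]])

# Group-level intensity per AU, plus a crude count of AUs that are more
# active in the older group (a "more muscles engaged" style summary).
older_mean = older.mean(axis=0)
young_mean = young.mean(axis=0)
n_aus_stronger_in_older = int((older_mean > young_mean).sum())
```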
31
Gingras C, Coll MP, Tessier MH, Tremblay P, Jackson PL. Pain evaluation and prosocial behaviour are affected by age and sex. Eur J Pain 2021; 25:1925-1937. [PMID: 34057795 DOI: 10.1002/ejp.1809] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022]
Abstract
BACKGROUND Pain assessment and pain care are influenced by the characteristics of both the patient and the caregiver. Some studies suggest that the pain of older persons and of females may be underestimated to a greater extent than the pain of younger and male individuals. AIMS This study investigated the effect of age and sex on prosocial behaviour and pain evaluation. METHODS Forty young (18-30 years; 20 women) and 40 older adults (55-82 years; 20 women) acted as healthcare professionals, rating the pain of and offering help to patients of both age groups. Trait empathy and social desirability were measured with questionnaires. RESULTS Linear mixed models showed that older and male patients were offered more help and were perceived as being in more intense pain than younger and female patients. CONCLUSION The characteristics of the patients seem to have a greater impact on prosocial behaviour and pain assessment than those of the observers, which bears significant implications for the treatment of pain in clinical contexts.
Affiliation(s)
- Chloé Gingras, École de Psychologie, Université Laval, Quebec City, QC, Canada; Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (Cirris), Quebec City, QC, Canada; CERVO Research Center, Quebec City, QC, Canada
- Marie-Hélène Tessier, École de Psychologie, Université Laval, Quebec City, QC, Canada; Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (Cirris), Quebec City, QC, Canada; CERVO Research Center, Quebec City, QC, Canada
- Pascale Tremblay, CERVO Research Center, Quebec City, QC, Canada; Département de Réadaptation, Université Laval, Quebec City, QC, Canada
- Philip L Jackson, École de Psychologie, Université Laval, Quebec City, QC, Canada; Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (Cirris), Quebec City, QC, Canada; CERVO Research Center, Quebec City, QC, Canada
32
Abo Foul Y, Eitan R, Mortillaro M, Aviezer H. Perceiving dynamic emotions expressed simultaneously in the face and body minimizes perceptual differences between young and older adults. J Gerontol B Psychol Sci Soc Sci 2021; 77:84-93. [PMID: 33842959 DOI: 10.1093/geronb/gbab064] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/25/2020] [Indexed: 11/13/2022] Open
Abstract
OBJECTIVES It is commonly argued that older adults show difficulties in standardized tasks of emotional expression perception, yet most previous works relied on classic sets of static, decontextualized, and stereotypical facial expressions. In real-life, facial expressions are dynamic and embedded in a rich context, two key factors that may aid emotion perception. Specifically, body language provides important affective cues that may disambiguate facial movements. METHOD We compared emotion perception of dynamic faces, bodies, and their combination, in a sample of older (age 60-83, n=126) and young (age 18-30, n=124) adults. We used the Geneva Multimodal Emotion Portrayals (GEMEP) set, which includes a full view of expressers' faces and bodies, displaying a diverse range of positive and negative emotions, portrayed dynamically and holistically in a non-stereotypical, unconstrained manner. Critically, we digitally manipulated the dynamic cue such that perceivers viewed isolated faces (without bodies), isolated bodies (without faces), or faces with bodies. RESULTS Older adults showed better perception of positive and negative dynamic facial expressions, while young adults showed better perception of positive isolated dynamic bodily expressions. Importantly, emotion perception of faces with bodies was comparable across ages. DISCUSSION Dynamic emotion perception in young and older adults may be more similar than previously assumed, especially when the task is more realistic and ecological. Our results emphasize the importance of contextualized and ecological tasks in emotion perception across ages.
Affiliation(s)
- Yasmin Abo Foul, Department of Psychology, The Hebrew University of Jerusalem; Department of Psychiatry, Hadassah-Hebrew University Medical Center, Jerusalem
- Renana Eitan, Department of Psychiatry, Hadassah-Hebrew University Medical Center, Jerusalem; Neuropsychiatry Unit, Jerusalem Mental Health Center, The Hebrew University of Jerusalem; Department of Psychiatry, Brigham and Women's Hospital, Harvard Medical School, Boston
- Hillel Aviezer, Department of Psychology, The Hebrew University of Jerusalem
33
Lalitharatne TD, Tan Y, He L, Leong F, Van Zalk N, de Lusignan S, Iida F, Nanayakkara T. MorphFace: A Hybrid Morphable Face for a Robopatient. IEEE Robot Autom Lett 2021. [DOI: 10.1109/lra.2020.3048670] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/14/2023]
34
Cortes DS, Tornberg C, Bänziger T, Elfenbein HA, Fischer H, Laukka P. Effects of aging on emotion recognition from dynamic multimodal expressions and vocalizations. Sci Rep 2021; 11:2647. [PMID: 33514829 PMCID: PMC7846600 DOI: 10.1038/s41598-021-82135-1] [Citation(s) in RCA: 13] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/27/2020] [Accepted: 01/15/2021] [Indexed: 12/20/2022] Open
Abstract
Age-related differences in emotion recognition have predominantly been investigated using static pictures of facial expressions, and positive emotions beyond happiness have rarely been included. The current study instead used dynamic facial and vocal stimuli, and included a wider than usual range of positive emotions. In Task 1, younger and older adults were tested for their abilities to recognize 12 emotions from brief video recordings presented in visual, auditory, and multimodal blocks. Task 2 assessed recognition of 18 emotions conveyed by non-linguistic vocalizations (e.g., laughter, sobs, and sighs). Results from both tasks showed that younger adults had significantly higher overall recognition rates than older adults. In Task 1, significant group differences (younger > older) were only observed for the auditory block (across all emotions), and for expressions of anger, irritation, and relief (across all presentation blocks). In Task 2, significant group differences were observed for 6 out of 9 positive, and 8 out of 9 negative emotions. Overall, results indicate that recognition of both positive and negative emotions show age-related differences. This suggests that the age-related positivity effect in emotion recognition may become less evident when dynamic emotional stimuli are used and happiness is not the only positive emotion under study.
Affiliation(s)
- Diana S Cortes, Department of Psychology, Stockholm University, Stockholm, Sweden
- Tanja Bänziger, Department of Psychology, Mid Sweden University, Östersund, Sweden
- Håkan Fischer, Department of Psychology, Stockholm University, Stockholm, Sweden
- Petri Laukka, Department of Psychology, Stockholm University, Stockholm, Sweden
35
Atkinson L, Murray JE, Halberstadt J. Older Adults' Emotion Recognition Ability Is Unaffected by Stereotype Threat. Front Psychol 2021; 11:605724. [PMID: 33488464 PMCID: PMC7817847 DOI: 10.3389/fpsyg.2020.605724] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/13/2020] [Accepted: 12/07/2020] [Indexed: 11/22/2022] Open
Abstract
Eliciting negative stereotypes about ageing commonly results in worse performance on many physical, memory, and cognitive tasks in adults aged over 65. The current studies explored the potential effect of this “stereotype threat” phenomenon on older adults’ emotion recognition, a cognitive ability that has been demonstrated to decline with age. In Study 1, stereotypes about emotion recognition ability across the lifespan were established. In Study 2, these stereotypes were utilised in a stereotype threat manipulation that framed an emotion recognition task as assessing either cognitive ability (stereotypically believed to worsen with age), social ability (believed to be stable across lifespan), or general abilities (control). Participants then completed an emotion recognition task in which they labelled dynamic expressions of negative and positive emotions. Self-reported threat concerns were also measured. Framing an emotion recognition task as assessing cognitive ability significantly heightened older adults’ (but not younger adults’) reports of stereotype threat concerns. Despite this, older adults’ emotion recognition performance was unaffected. Unlike other cognitive abilities, recognising facially expressed emotions may be unaffected by stereotype threat, possibly because emotion recognition is automatic, making it less susceptible to the cognitive load that stereotype threat produces.
Affiliation(s)
- Lianne Atkinson, Department of Psychology, University of Otago, Dunedin, New Zealand
- Janice E Murray, Department of Psychology, University of Otago, Dunedin, New Zealand
36
Zloteanu M, Krumhuber EG. Expression Authenticity: The Role of Genuine and Deliberate Displays in Emotion Perception. Front Psychol 2021; 11:611248. [PMID: 33519624 PMCID: PMC7840656 DOI: 10.3389/fpsyg.2020.611248] [Citation(s) in RCA: 11] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/28/2020] [Accepted: 12/21/2020] [Indexed: 11/13/2022] Open
Abstract
People dedicate significant attention to others' facial expressions and to deciphering their meaning. Hence, knowing whether such expressions are genuine or deliberate is important. Early research proposed that authenticity could be discerned based on reliable facial muscle activations unique to genuine emotional experiences that are impossible to produce voluntarily. With an increasing body of research, such claims may no longer hold up to empirical scrutiny. In this article, expression authenticity is considered within the context of senders' ability to produce convincing facial displays that resemble genuine affect and human decoders' judgments of expression authenticity. This includes a discussion of spontaneous vs. posed expressions, as well as appearance- vs. elicitation-based approaches for defining emotion recognition accuracy. We further expand on the functional role of facial displays as neurophysiological states and communicative signals, thereby drawing upon the encoding-decoding and affect-induction perspectives of emotion expressions. Theoretical and methodological issues are addressed with the aim to instigate greater conceptual and operational clarity in future investigations of expression authenticity.
Affiliation(s)
- Mircea Zloteanu, Department of Criminology and Sociology, Kingston University London, Kingston, United Kingdom; Department of Psychology, Kingston University London, Kingston, United Kingdom
- Eva G Krumhuber, Department of Experimental Psychology, University College London, London, United Kingdom
37
Rothermich K, Giorio C, Falkins S, Leonard L, Roberts A. Nonliteral language processing across the lifespan. Acta Psychol (Amst) 2021; 212:103213. [PMID: 33220614 DOI: 10.1016/j.actpsy.2020.103213] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/07/2020] [Revised: 10/21/2020] [Accepted: 10/26/2020] [Indexed: 12/13/2022] Open
Abstract
Studies investigating the effects of aging on nonliteral language processing have mainly focused on one sensory modality, for example written vignettes. In the current study, we used a video-based task to examine the effect of healthy aging on social communication perception using a novel database called RISC (Relation Inference in Social Communication). By means of an online recruitment platform, we asked young, middle-aged, and older adults between the ages of 18 and 76 (N = 100) to evaluate videos of actors using different forms of literal and nonliteral language, such as sarcasm or teasing. The participants' task was to infer the speakers' belief and the speakers' intention. Older participants demonstrated lower accuracy in discriminating nonliteral from literal interactions compared to younger and middle-aged groups. When evaluating speaker intentions, older adults judged sarcasm as friendlier compared to literal negative utterances. We also found that the older the participant, the more difficulty they have identifying teasing as insincere. Our results expand on age-related similarities and differences in evaluating speaker intentions and demonstrate the practicality of the RISC database for studying nonliteral language across the lifespan.
38
Pavic K, Oker A, Chetouani M, Chaby L. Age-related changes in gaze behaviour during social interaction: An eye-tracking study with an embodied conversational agent. Q J Exp Psychol (Hove) 2020; 74:1128-1139. [PMID: 33283649 DOI: 10.1177/1747021820982165] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/30/2023]
Abstract
Previous research has highlighted age-related differences in social perception, in particular emotional expression processing. To date, such studies have largely focused on approaches that use static emotional stimuli that the participant has to identify passively without the possibility of any interaction. In this study, we propose an interactive virtual environment to better address age-related variations in social and emotional perception. A group of 22 young (18-30 years) and 20 older (60-80 years) adults were engaged in a face-to-face conversation with an embodied conversational agent. Participants were invited to interact naturally with the agent and to identify his facial expression. Their gaze behaviour was captured by an eye-tracking device throughout the interaction. We also explored whether the Big Five personality traits (particularly extraversion) and anxiety modulated gaze during the social interaction. Findings suggested that age-related differences in gaze behaviour were only apparent when decoding social signals (i.e., listening to a partner's question, identifying facial expressions) and not when communicating social information (i.e., when speaking). Furthermore, higher extraversion levels consistently led to a shorter amount of time gazing towards the eyes, whereas higher anxiety levels led to slight modulations of gaze only when participants were listening to questions. Face-to-face conversation with virtual agents can provide a more naturalistic framework for the assessment of online socio-emotional interaction in older adults, which is not easily observable in classical offline paradigms. This study provides novel and important insights into the specific circumstances in which older adults may experience difficulties in social interactions.
Affiliation(s)
- Katarina Pavic, Institut de psychologie, Université de Paris, Boulogne-Billancourt, France; Université de Paris, VAC, Boulogne-Billancourt, France
- Ali Oker, Laboratoire Cognition Santé Société (EA 6291), Université de Reims Champagne-Ardenne, Reims, France
- Mohamed Chetouani, Institut des systèmes intelligents et de robotique (ISIR), Sorbonne Université, CNRS UMR7222, Paris, France
- Laurence Chaby, Institut de psychologie, Université de Paris, Boulogne-Billancourt, France; Institut des systèmes intelligents et de robotique (ISIR), Sorbonne Université, CNRS UMR7222, Paris, France
Collapse
39
Lee SC, Lin GH, Liu CC, Chiu EC, Hsieh CL. Development of the CAT–FER: A Computerized Adaptive Test of Facial Emotion Recognition for Adults With Schizophrenia. Am J Occup Ther 2020; 75:7501205140p1-7501205140p11. [DOI: 10.5014/ajot.2020.043463] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022] Open
Abstract
Importance: The most frequently used measures of facial emotion recognition (FER) are insufficiently comprehensive, reliable, valid, and efficient; moreover, the impact of gender on scoring has not been controlled.
Objective: To develop a computerized adaptive test of FER for adults with schizophrenia.
Design: First, we selected photographs from a published database. Second, items that fitted well to a Rasch model were used to form the item bank. Third and last, we determined the best administration mode for prospective users to achieve both high reliability and efficiency.
Setting: Psychiatric hospitals and the community.
Participants: Adults living with schizophrenia (n = 351) and adults without diagnosed mental illness (n = 101).
Results: After removal of misfit items (infit or outfit ≥1.4), the remaining 165 items were selected to form an item bank. Among them, 39 showed severe gender bias, so the item difficulties were adjusted accordingly. On the basis of the item bank, two administration modes were recommended for prospective users. The reliable mode required approximately 128 items (nearly 20 min) to achieve reliability (.72–.81) similar to that of the entire item bank. The efficient mode required approximately 73 items (approximately 11 min) to provide acceptable reliability (.69–.73) for the seven domain scores.
Conclusions and Relevance: Our newly developed measure provides comprehensive, valid, and unbiased (to examinees’ gender) assessments of FER in adults living with schizophrenia. In addition, the administration modes can be flexibly changed to optimize the reliability or efficiency for prospective users.
What This Article Adds: This newly developed FER measure can help occupational therapists identify deficits in recognizing specific basic emotions and plan corresponding interventions to manage the impact on their clients’ social functions.
Affiliation(s)
- Shih-Chieh Lee
- Shih-Chieh Lee, PhD, is Postdoctoral Researcher, School of Occupational Therapy, College of Medicine, National Taiwan University, Taipei, Taiwan
- Gong-Hong Lin
- Gong-Hong Lin, PhD, is Assistant Professor, Master Program in Long-Term Care, College of Nursing, Taipei Medical University, Taipei, Taiwan. At the time this article was submitted, Lin was Postdoctoral Researcher, School of Occupational Therapy, College of Medicine, National Taiwan University, Taipei, Taiwan
- Chen-Chung Liu
- Chen-Chung Liu, MD, PhD, is Psychiatrist, Department of Psychiatry, National Taiwan University Hospital, Taipei, Taiwan, and Associate Professor, Department of Psychiatry, College of Medicine, National Taiwan University, Taipei, Taiwan
- En-Chi Chiu
- En-Chi Chiu, PhD, is Associate Professor, Department of Long-Term Care, National Taipei University of Nursing and Health Sciences, Taipei, Taiwan
- Ching-Lin Hsieh
- Ching-Lin Hsieh, PhD, is Professor, School of Occupational Therapy, College of Medicine, National Taiwan University, Taipei, Taiwan; Occupational Therapist, Department of Physical Medicine and Rehabilitation, National Taiwan University Hospital, Taipei, Taiwan; and Adjunct Professor, Department of Occupational Therapy, College of Medical and Health Science, Asia University, Taichung, Taiwan
40
Horta M, Pehlivanoglu D, Ebner NC. The Role of Intranasal Oxytocin on Social Cognition: An Integrative Human Lifespan Approach. Curr Behav Neurosci Rep 2020; 7:175-192. [PMID: 33717829 PMCID: PMC7951958 DOI: 10.1007/s40473-020-00214-5] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Accepted: 08/31/2020] [Indexed: 12/26/2022]
Abstract
PURPOSE OF REVIEW This narrative review synthesizes research from the last two decades on the modulatory role of intranasal OT administration (IN-OT) on social cognition in early life, young/middle adulthood, and older adulthood. Advances and knowledge gaps are identified, and future research directions are discussed within an integrative human lifespan framework to guide novel research on IN-OT and social cognition. RECENT FINDINGS Current evidence regarding IN-OT modulation of social-cognitive processes, behavior, and related neurocircuitry is mixed, with some studies suggesting benefits (e.g., improved social perception/interactions, emotion processing) depending on contextual (e.g., social stimuli) and interindividual factors (e.g., age, sex, clinical status). Current research, however, is limited by a focus on isolated life phases, males, and select clinical populations as well as a lack of standardized protocols. SUMMARY This literature-based reflection proposes that greater generalizability of findings and scientific advancement on social-cognitive modulation via IN-OT require standardized, multi-method, longitudinal, and cross-sequential assessments in well-powered, well-controlled, and representative samples in line with an integrative lifespan approach, which considers development as a lifelong dynamic process involving both change and stability characterized by the interplay between genetic, neurobiological, and socio-behavioral factors.
Affiliation(s)
- Marilyn Horta
- Department of Psychology, University of Florida, Gainesville, FL, USA
- Department of Epidemiology, University of Florida, Gainesville, FL, USA
- Natalie C. Ebner
- Department of Psychology, University of Florida, Gainesville, FL, USA
- Institute on Aging, Department of Aging & Geriatric Research, University of Florida, Gainesville, FL, USA
41
Zloteanu M, Bull P, Krumhuber EG, Richardson DC. Veracity judgement, not accuracy: Reconsidering the role of facial expressions, empathy, and emotion recognition training on deception detection. Q J Exp Psychol (Hove) 2020; 74:910-927. [PMID: 33234008 PMCID: PMC8056713 DOI: 10.1177/1747021820978851] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Abstract
People hold strong beliefs about the role of emotional cues in detecting deception. While research on the diagnostic value of such cues has been mixed, their influence on human veracity judgements is yet to be fully explored. Here, we address the relationship between emotional information and veracity judgements. In Study 1, the role of emotion recognition in the process of detecting naturalistic lies was investigated. Decoders’ veracity judgements were compared based on differences in trait empathy and their ability to recognise microexpressions and subtle expressions. Accuracy was found to be unrelated to facial cue recognition and negatively related to empathy. In Study 2, we manipulated decoders’ emotion recognition ability and the type of lies they saw: experiential or affective (emotional and unemotional). Decoders received either emotion recognition training, bogus training, or no training. In all scenarios, training did not affect veracity judgements. Experiential lies were easier to detect than affective lies; however, affective unemotional lies were overall the hardest to judge. The findings illustrate the complex relationship between emotion recognition and veracity judgements, with abilities for facial cue detection being high yet unrelated to deception accuracy.
Affiliation(s)
- Mircea Zloteanu
- Department of Psychology, Teesside University, Middlesbrough, UK; Department of Criminology and Sociology, Kingston University, London, UK
- Peter Bull
- Department of Psychology, University of York, York, UK; Department of Psychology, University of Salford, Salford, UK
- Eva G Krumhuber
- Department of Experimental Psychology, University College London, London, UK
- Daniel C Richardson
- Department of Experimental Psychology, University College London, London, UK
42
Durbin KA, Rastegar S, Knight BG. Effects of age and mood on emotional face processing differ depending on the intensity of the facial expression. Neuropsychol Dev Cogn B Aging Neuropsychol Cogn 2020; 27:902-917. [PMID: 31809671 PMCID: PMC7274884 DOI: 10.1080/13825585.2019.1700900] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/04/2019] [Accepted: 11/27/2019] [Indexed: 10/25/2022]
Abstract
Research suggests that mood can moderate age differences in recognizing facial emotion. In this study, we examined how an anxious versus calm mood state affected younger and older adults' processing of emotional faces. Older adults had greater difficulty identifying negative emotions, particularly when emotions were displayed at a low intensity level. However, an anxious mood did not affect age differences in emotional face recognition. In contrast, age, emotional intensity, and current mood state all affected the perceived intensity of emotion. The effects of age and mood on perceived emotional intensity were only observed for low intensity facial expressions. When induced into an anxious mood, younger adults perceived threatening emotions (i.e., fear, anger) as more emotionally intense, whereas older adults perceived anger and happiness to be more intense. These findings emphasize the need to consider both internal and external factors when investigating the effects of age on emotional face processing.
Affiliation(s)
| | - Sarah Rastegar
- Department of Psychology, University of Southern California
| | - Bob G. Knight
- Department of Psychology, University of Southern California
- School of Psychology and Counseling, University of Southern Queensland
| |
Collapse
43
Khosdelazad S, Jorna LS, McDonald S, Rakers SE, Huitema RB, Buunk AM, Spikman JM. Comparing static and dynamic emotion recognition tests: Performance of healthy participants. PLoS One 2020; 15:e0241297. [PMID: 33112932 PMCID: PMC7592751 DOI: 10.1371/journal.pone.0241297] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/30/2020] [Accepted: 10/12/2020] [Indexed: 11/19/2022] Open
Abstract
Facial expressions have a communicatory function, and the ability to read them is a prerequisite for understanding the feelings and thoughts of other individuals. Impairments in recognition of facial emotional expressions are frequently found in patients with neurological conditions (e.g. stroke, traumatic brain injury, frontotemporal dementia). Hence, a standard neuropsychological assessment should include measurement of emotion recognition. However, there is debate regarding which tests are most suitable. The current study evaluates and compares three different emotion recognition tests. Eighty-four healthy participants were included and assessed with three tests, in varying order: (a) the Ekman 60 Faces Test (FEEST), (b) the Emotion Recognition Task (ERT), and (c) the Emotion Evaluation Test (EET). The tests differ in the type of stimuli used, from static photographs (FEEST) to more dynamic morphed photographs (ERT) to videos (EET). Comparing performances on the three tests, the lowest total scores (67.3% correct answers) were found for the ERT. Significant but moderate correlations were found between the total scores of the three tests, but nearly all correlations between the same emotions across different tests were not significant. Furthermore, we found cross-over effects of the FEEST and EET to the ERT; participants attained higher total scores on the ERT when another emotion recognition test had been administered beforehand. Moreover, the ERT proved to be sensitive to the effects of age and education. The present findings indicate that despite some overlap, each emotion recognition test measures a unique part of the construct. The ERT seemed to be the most difficult test: performances were lowest and influenced by differences in age and education, and it was the only test that showed a learning effect after practice with other tests. This highlights the importance of appropriate norms.
Affiliation(s)
- Sara Khosdelazad
- Department of Neuropsychology, University of Groningen, University Medical Centre Groningen, Groningen, The Netherlands
- Lieke S. Jorna
- Department of Neuropsychology, University of Groningen, University Medical Centre Groningen, Groningen, The Netherlands
- Skye McDonald
- School of Psychology, University of New South Wales, Sydney, Australia
- Sandra E. Rakers
- Department of Neuropsychology, University of Groningen, University Medical Centre Groningen, Groningen, The Netherlands
- Rients B. Huitema
- Department of Neuropsychology, University of Groningen, University Medical Centre Groningen, Groningen, The Netherlands
- Anne M. Buunk
- Department of Neuropsychology, University of Groningen, University Medical Centre Groningen, Groningen, The Netherlands
- Jacoba M. Spikman
- Department of Neuropsychology, University of Groningen, University Medical Centre Groningen, Groningen, The Netherlands
44
Sleep, inflammation, and perception of sad facial emotion: A laboratory-based study in older adults. Brain Behav Immun 2020; 89:159-167. [PMID: 32531429 DOI: 10.1016/j.bbi.2020.06.011] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/20/2020] [Revised: 06/06/2020] [Accepted: 06/06/2020] [Indexed: 12/11/2022] Open
Abstract
BACKGROUND Facial emotion perception (FEP) is pivotal for discriminating salient emotional information. Accumulating data indicate that FEP responses, particularly to sad emotional stimuli, are impaired in depression. This study tests whether sleep disturbance and inflammation, two risk factors for depression, contribute to impaired FEP to sad emotional stimuli. METHODS In older adults (n = 40, 71.7 ± 6.8y, 56.4% female), disturbance of sleep maintenance (i.e., wake time after sleep onset [WASO]) was evaluated by polysomnography. In the morning, plasma concentrations of two markers of systemic inflammation were evaluated (i.e., interleukin [IL]-6, tumor necrosis factor [TNF]-α), followed by two FEP tasks, which assessed delays in emotion recognition (ER) and ratings of perceived emotion intensity (EI) in response to sad facial emotional stimuli, with exploration of FEP responses to happiness and anger. Linear regression models tested whether WASO, IL-6, and TNF-α would be associated with impaired FEP to sad emotional stimuli. In addition, moderation tests examined whether inflammation would moderate the link between sleep disturbance and impaired FEP to sad emotional stimuli. RESULTS Longer WASO predicted longer ER delays (p < 0.05) and lower EI ratings in response to sad faces (p < 0.01). Further, higher TNF-α (p < 0.05) but not IL-6 predicted longer ER delays for sad faces, whereas higher IL-6 (p < 0.01) but not TNF-α predicted lower EI ratings for sad faces. Finally, TNF-α moderated the relationship between longer WASO and longer ER delays to sad faces (p < 0.001), while IL-6 moderated the relationship between longer WASO and lower EI ratings to sad faces (p < 0.01). Neither sleep nor inflammatory measures were associated with FEP responses to happiness or anger. CONCLUSION In older adults, disturbance of sleep maintenance is associated with impaired FEP to sad emotion, a relationship that appears to be moderated by inflammation. These data indicate that sleep disturbance and inflammation converge and contribute to impaired FEP with implications for risk for late-life depression.
45
Gourlay C, Collin P, Caron PO, D'Auteuil C, Scherzer PB. Psychometric assessment of social cognitive tasks. Appl Neuropsychol Adult 2020; 29:731-749. [PMID: 32841055 DOI: 10.1080/23279095.2020.1807348] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 10/23/2022]
Abstract
Although there has been a marked increase in interest in social cognition (SC) in recent years, psychometric data relating to many tasks used to measure its components remain limited in healthy populations with only five articles published to date. It is accordingly premature to speak of a consensus concerning the specific components, or best tests of the components, and possible cultural differences. The present study sought to partially fill that gap, examining the psychometric properties of a battery of SC tasks in a sample of 100 healthy adults aged 18-85 years old. Initially, nine tasks assessing four SC components were selected: emotion recognition, theory of mind, attributional bias, and social judgment. Construct validity and criterion-related validity were assessed using factor and correlational analyses. Performance across age and sex groups was also investigated. Reliability was assessed through internal consistency, interrater and intercoder agreement. Results indicated satisfactory properties for the Ambiguous Intentions Hostility Questionnaire-blame score, the Social Judgment Task, the Facial Emotions Recognition Test, and a modified version of the Strange Stories Task. Statistically significant differences were found between the groups with regard to age and sex after accounting for demographic and cognitive factors. However, the correlations of these measures with relationship quality were mostly very low, raising questions about their concomitant validity. Other tasks showed sub-optimal properties, suggesting that some frequently used tests require further validation or modifications to ensure the quality of research findings. Based on the results, recommended measures for future studies and limitations are discussed.
Affiliation(s)
- Catherine Gourlay
- Department of Psychology, Université du Québec à Montréal, Montréal, Canada
- Pascal Collin
- Department of Psychology, Université du Québec à Montréal, Montréal, Canada
- Camille D'Auteuil
- Department of Psychology, Université du Québec à Montréal, Montréal, Canada
- Peter B Scherzer
- Department of Psychology, Université du Québec à Montréal, Montréal, Canada
46
Recognition of emotional facial expressions in adolescents with attention deficit/hyperactivity disorder. J Adolesc 2020; 82:1-10. [PMID: 32442797 DOI: 10.1016/j.adolescence.2020.04.010] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/22/2019] [Revised: 03/21/2020] [Accepted: 04/26/2020] [Indexed: 11/21/2022]
Abstract
INTRODUCTION Attention Deficit/Hyperactivity Disorder (ADHD) is associated with impaired social competencies, due in part to an inability to determine emotional states through facial expressions. Social interactions are a critical component of adolescence, which raises the question of how adolescents with ADHD cope with this impairment. Yet, previous reviews do not distinguish between children and adolescents. This review focuses on the ability of adolescents (defined by the World Health Organization as 10-19 years old) with ADHD to recognize emotional facial expressions, compared with their typically-developing peers. METHODS A comprehensive database search and analysis yielded nine relevant studies published between 2008 and 2018. RESULTS The studies reviewed here examined recognition of emotional facial expressions in adolescents with ADHD. Behavioral measures (reaction time, reaction time variance, and recognition accuracy) show no statistically significant differences between adolescents with ADHD and their typically-developing peers. However, neural responses as recorded using functional Magnetic Resonance Imaging (fMRI) or Event Related Potentials (ERP) reveal differences in brain activity and the temporal evolution of the reaction between the two groups. CONCLUSIONS Studies of children and of adults with ADHD find deficiencies in the recognition of emotional facial expressions. However, this review shows that adolescents with ADHD perform comparably to their peers on accuracy and rate, although their neural processing is different. This suggests that the methodologies employed by adolescents with ADHD and their typically-developing peers to assess facial expressions are different. Further study is needed to determine what these may be.
47
McKenzie K, Murray A, Murray K, O'Donnell M, Murray GC, Metcalfe D, McCarty K. An evaluation of the distribution properties, factor structure, and item response profile of an assessment of emotion recognition. Heliyon 2020; 6:e03572. [PMID: 32195395 PMCID: PMC7076041 DOI: 10.1016/j.heliyon.2020.e03572] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/20/2019] [Revised: 10/01/2019] [Accepted: 03/09/2020] [Indexed: 11/18/2022] Open
Abstract
Many people with developmental disabilities, such as autism spectrum disorder and intellectual disability have emotion recognition (ER) difficulties compared with typically developing (TD) peers. Accurate assessment of the extent and nature of differences in ER requires an understanding of the response profiles to ER assessment stimuli. We analysed data from 504 TD individuals in response to an ER assessment in respect of distribution properties, factor structure, and item response profile. Eighteen emotion items discriminated better at lower levels of ER ability in TD participants. Neutral expressions were the hardest to interpret; surprise, anger, happy, and bored were easiest. The amount of contextual information in combination with the emotion being depicted also appeared to influence level of difficulty. Similar psychometric research is needed with people with developmental disabilities.
Affiliation(s)
- Karen McKenzie (corresponding author)
- Northumbria University, United Kingdom
48
Murphy J, Millgate E, Geary H, Catmur C, Bird G. No effect of age on emotion recognition after accounting for cognitive factors and depression. Q J Exp Psychol (Hove) 2019; 72:2690-2704. [DOI: 10.1177/1747021819859514] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/27/2022]
Abstract
A decline in emotion recognition ability across the lifespan has been well documented. However, whether age predicts emotion recognition difficulties after accounting for potentially confounding factors which covary with age remains unclear. Although previous research suggested that age-related decline in emotion recognition ability may be partly a consequence of cognitive (fluid intelligence, processing speed) and affective (e.g., depression) factors, recent theories highlight a potential role for alexithymia (difficulty identifying and describing one’s emotions) and interoception (perception of the body’s internal state). This study therefore aimed to examine the recognition of anger and disgust across the adult lifespan in a group of 140 20–90-year-olds to see whether an effect of age would remain after controlling for a number of cognitive and affective factors potentially impacted by age. In addition, using an identity recognition control task, the study aimed to determine whether the factors accounting for the effects of age on emotion discrimination also contribute towards generalised face processing difficulties. Results revealed that discrimination of disgust and anger across the lifespan was predicted by processing speed and fluid intelligence, and negatively by depression. No effect of age was found after these factors were accounted for. Importantly, these effects were specific to emotion discrimination; only crystallised intelligence accounted for unique variance in identity discrimination. Contrary to expectations, although interoception and alexithymia were correlated with emotion discrimination abilities, these factors did not explain unique variance after accounting for other variables.
Affiliation(s)
- Jennifer Murphy
- Social, Genetic and Developmental Psychiatry Centre, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, London, UK
- Edward Millgate
- Department of Psychology, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, London, UK
- Hayley Geary
- Department of Psychology, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, London, UK
- Caroline Catmur
- Department of Psychology, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, London, UK
- Geoffrey Bird
- Social, Genetic and Developmental Psychiatry Centre, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, London, UK
- Department of Experimental Psychology, University of Oxford, Oxford, UK
49
Mitrovic M, Ristic M, Dimitrijevic B, Hadzi Pesic M. Facial Emotion Recognition and Persecutory Ideation in Paranoid Schizophrenia. Psychol Rep 2019; 123:1099-1116. [PMID: 31092137 DOI: 10.1177/0033294119849016] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
The recognition of facial signals has a crucial role in social interaction. It is well known that people suffering from paranoid schizophrenia have problems in the social domain, predominantly related to misinterpreting the intentions, emotions, and actions of others. The aim of this study was to examine whether there are differences in facial emotion recognition between people with paranoid schizophrenia and healthy controls. In addition, we examined the correlation between facial emotion recognition and the expression of persecutory ideation in people suffering from paranoid schizophrenia. The study involved 60 participants: 30 who suffered from paranoid schizophrenia and 30 healthy controls, matched for gender, age, and education. The following instruments were used: Japanese and Caucasian Facial Expressions of Emotion and Neutral Faces and the Persecutory Ideation Questionnaire. Compared with the controls, people suffering from paranoid schizophrenia were significantly less accurate in recognizing the following emotions: surprise, contempt, sadness, disgust, and emotionally neutral faces. Since the attribution of emotions to emotionally neutral faces is an important finding that could be linked with the social (dis)functionality of people suffering from paranoid schizophrenia, we analyzed and compared the wrong answers given by the two groups and found some differences between them. The results show that persecutory ideation has a statistically significant negative correlation with the successful recognition of emotionally neutral faces. All of the findings lead to the conclusion that paranoid schizophrenia, and within it the existence of persecutory ideation, leads to problems in recognizing the basic facial signals that form the foundation of everyday social interaction.
Affiliation(s)
- Milica Ristic
- Pedagogical Faculty in Vranje, University of Nis, Serbia
50
Age-related decline in emotional perspective-taking: Its effect on the late positive potential. Cogn Affect Behav Neurosci 2018; 19:109-122. [PMID: 30341622 DOI: 10.3758/s13415-018-00648-1] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/13/2022]
Abstract
Aging is associated with changes in cognitive and affective functioning, which likely shape older adults' social cognition. As the neural and psychological mechanisms underlying age differences in social abilities remain poorly understood, the present study aims to extend the research in this field. To this purpose, younger (n = 30; Mage = 26.6), middle-aged (n = 30; Mage = 48.4), and older adults (n = 29; Mage = 64.5) performed a task designed to assess affective perspective-taking, during an EEG recording. In this task, participants decided whether a target facial expression of emotion (FEE) was congruent or incongruent with that of a masked intervener of a previous scenario, which portrayed a neutral or an emotional scene. Older adults showed worse performance in comparison to the other groups. Regarding electrophysiological results, while younger and middle-aged adults showed higher late positive potentials (LPPs) after FEEs congruent with previous scenarios than after incongruent FEEs, older adults had similar amplitudes after both. This insensitivity of older adults' LPPs in differentiating congruent from incongruent emotional context-target FEE may be related to their difficulty in generating information about others' inner states and using that information in social interactions.