1. Yong MH, Waqas M, Ruffman T. Effects of age on behavioural and eye gaze on Theory of Mind using movie for social cognition. Q J Exp Psychol (Hove) 2024; 77:2476-2487. PMID: 38356176; PMCID: PMC11607846; DOI: 10.1177/17470218241235811.
Abstract
Evidence has shown that older adults are less accurate on Theory of Mind (ToM) tasks than young adults, but it remains unclear whether their difficulty in decoding mental states stems from not looking at the critical areas of the face and body, particularly in ageing Asian populations. Most ToM studies use static images or short vignettes, stimuli that are dissimilar to everyday social interactions. We investigated this question using a dynamic task that measured both accuracy and error types, and examined how accuracy and error types related to eye-gaze fixations on critical areas (e.g., eyes, mouth, body). A total of 82 participants (38 older adults, 44 young adults) completed the Movie for the Assessment of Social Cognition (MASC) task while their eye movements were recorded. Results showed that older adults had lower overall accuracy, with more errors in the ipo-ToM (under-mentalising) and no-ToM (lack of mentalisation) categories, compared with young adults. We analysed the eye-gaze data using principal components analysis and found that increasing age and looking less at the face were related to lower MASC accuracy. Our findings suggest that ageing-related deficits in ToM are linked to a visual attention deficit specific to the perception of socially relevant nonverbal cues.
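The pipeline this abstract describes (principal components of gaze fixations, then relating the components and age to task accuracy) can be sketched as follows. This is a minimal illustration with simulated data, not the study's code; the variable names (eyes, mouth, body, masc_acc) are hypothetical. R is used because it is the only analysis environment named anywhere in this listing (entry 7).

```r
# Sketch: PCA on area-of-interest (AOI) fixation proportions, then a regression
# of task accuracy on age and the leading gaze component. All data are simulated;
# 'eyes', 'mouth', 'body' and 'masc_acc' are hypothetical column names.
set.seed(1)
n <- 82
gaze <- data.frame(
  eyes  = runif(n, 0.2, 0.6),   # proportion of fixation time on the eyes
  mouth = runif(n, 0.1, 0.4),   # ... on the mouth
  body  = runif(n, 0.1, 0.3)    # ... on the body
)
age <- sample(18:80, n, replace = TRUE)

pca <- prcomp(gaze, center = TRUE, scale. = TRUE)  # components of gaze allocation
pc1 <- pca$x[, 1]                                  # leading gaze component

# Simulated outcome so the example runs end to end
masc_acc <- 0.7 - 0.002 * age + 0.05 * pc1 + rnorm(n, sd = 0.05)

summary(lm(masc_acc ~ age + pc1))  # do age and gaze allocation predict accuracy?
```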
2. Hamlin N, Myers K, Taylor BK, Doucet GE. Role of Emotion Reactivity to Predict Facial Emotion Recognition Changes with Aging. Exp Aging Res 2024; 50:550-567. PMID: 37660356; PMCID: PMC10908871; DOI: 10.1080/0361073x.2023.2254658.
Abstract
Emotional intelligence includes an assortment of factors related to emotion function, including emotion recognition (here, via facial expression), emotion trait, reactivity, and regulation. We aimed to investigate how subjective appraisals of emotional intelligence (i.e., trait, reactivity, and regulation) are associated with objective emotion recognition accuracy, and how these associations differ between young and older adults. Data were extracted from the Cam-CAN dataset (189 adults: 57 young, 118 older) from assessments measuring these emotion constructs. Using linear regression models, we found that greater negative reactivity was associated with better emotion recognition accuracy among older adults, whereas the pattern was the opposite for young adults, with the greatest difference in disgust and surprise recognition. Positive reactivity and depression level predicted surprise recognition, with the associations differing significantly between the age groups. The present findings suggest that the degree to which older and young adults react to emotional stimuli differentially predicts their ability to correctly identify facial emotion expressions. Older adults with higher negative reactivity may be able to integrate their negative emotions effectively in order to recognize others' negative emotions more accurately. Alternatively, young adults may experience interference from negative reactivity, lowering their ability to recognize others' negative emotions.
Affiliation(s)
- Noah Hamlin: Institute for Human Neuroscience, Boys Town National Research Hospital, Omaha, NE
- Katrina Myers: Institute for Human Neuroscience, Boys Town National Research Hospital, Omaha, NE
- Brittany K. Taylor: Institute for Human Neuroscience, Boys Town National Research Hospital, Omaha, NE; Department of Pharmacology and Neuroscience, Creighton University, Omaha, NE
- Gaelle E. Doucet: Institute for Human Neuroscience, Boys Town National Research Hospital, Omaha, NE; Department of Pharmacology and Neuroscience, Creighton University, Omaha, NE
3. Mastorogianni ME, Konstanti S, Dratsiou I, Bamidis PD. Masked emotions: does children's affective state influence emotion recognition? Front Psychol 2024; 15:1329070. PMID: 38962230; PMCID: PMC11220387; DOI: 10.3389/fpsyg.2024.1329070.
Abstract
Introduction Facial emotion recognition abilities of children have been the focus of attention across various fields, with implications for communication, social interaction, and human behavior. In response to the COVID-19 pandemic, wearing a face mask in public became mandatory in many countries, hindering the perception of social information and the recognition of emotion. Given the importance of visual communication for children's social-emotional development, concerns have been raised about whether face masks could impair children's ability to recognize emotions and thereby impact their social-emotional development. Methods To this end, a quasi-experimental study was designed with a two-fold objective: first, to identify children's accuracy in recognizing basic emotions (anger, happiness, fear, disgust, sadness) and emotional neutrality when presented with faces under two conditions, one with no masks and another with faces partially covered by various types of masks (medical, nonmedical, surgical, or cloth); second, to explore any correlation between children's emotion recognition accuracy and their affective state. Sixty-nine elementary school students aged 6-7 years from Greece were recruited. Following the specific requirements of the second phase of the experiment, students were assigned to one of three distinct affective condition groups: Group A (happiness), Group B (sadness), and Group C (emotional neutrality). Image stimuli were drawn from the FACES dataset, and students' affective state was registered using the self-report emotion-registration tool, the AffectLecture app. Results The findings indicate that children can accurately recognize emotions even in masked faces, although recognizing disgust is more challenging. Additionally, priming with both positive and negative affective states produced systematic inaccuracies in emotion recognition. Most significantly, results showed a negative bias for children in a negative affective state and a positive bias for those in a positive affective state. Discussion Children's affective state significantly influenced their emotion recognition abilities; sad affective states led to lower recognition overall and a bias toward recognizing sad expressions, while happy affective states resulted in a positive bias, improving recognition of happiness and affecting how emotional neutrality and sadness were perceived. In conclusion, this study sheds light on how face masks affect children's emotion recognition and underlines the profound influence of their affective state.
Affiliation(s)
- Maria Eirini Mastorogianni: MSc in Learning Technologies-Education Sciences, School of Early Childhood Education, School of Electrical and Computer Engineering, School of Medicine, Aristotle University of Thessaloniki (AUTH), Thessaloniki, Greece
- Styliani Konstanti: MSc in Learning Technologies-Education Sciences, School of Early Childhood Education, School of Electrical and Computer Engineering, School of Medicine, Aristotle University of Thessaloniki (AUTH), Thessaloniki, Greece
- Ioanna Dratsiou: Medical Physics and Digital Innovation Laboratory, School of Medicine, Faculty of Health Sciences, Aristotle University of Thessaloniki (AUTH), Thessaloniki, Greece
- Panagiotis D. Bamidis: Medical Physics and Digital Innovation Laboratory, School of Medicine, Faculty of Health Sciences, Aristotle University of Thessaloniki (AUTH), Thessaloniki, Greece
4. Yang Z, Wu Y, Liu S, Zhao L, Fan C, He W. Ensemble Coding of Crowd with Cross-Category Facial Expressions. Behav Sci (Basel) 2024; 14:508. PMID: 38920840; PMCID: PMC11201231; DOI: 10.3390/bs14060508.
Abstract
Ensemble coding allows observers to form an average to represent a set of elements. However, it is unclear whether observers can extract an average from a cross-category set. Previous investigations on this issue using low-level stimuli yielded contradictory results. The current study addressed this issue by presenting high-level stimuli (i.e., a crowd of facial expressions) simultaneously (Experiment 1) or sequentially (Experiment 2), and asked participants to complete a member judgment task. The results showed that participants could extract average information from a group of cross-category facial expressions with a short perceptual distance. These findings demonstrate cross-category ensemble coding of high-level stimuli, contributing to the understanding of ensemble coding and providing inspiration for future research.
Affiliation(s)
- Zhi Yang, Yifan Wu, Shuaicheng Liu, Lili Zhao, Cong Fan, Weiqi He: Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian 116029, China
5. González-Gualda LM, Vicente-Querol MA, García AS, Molina JP, Latorre JM, Fernández-Sotos P, Fernández-Caballero A. An exploratory study of the effect of age and gender on face scanning during affect recognition in immersive virtual reality. Sci Rep 2024; 14:5553. PMID: 38448515; PMCID: PMC10918108; DOI: 10.1038/s41598-024-55774-3.
Abstract
A person with impaired emotion recognition is not able to correctly identify the facial expressions of other individuals. The aim of the present study was to assess eye gaze and facial emotion recognition in a healthy population using dynamic avatars in immersive virtual reality (IVR). For the first time, viewing of each area of interest (AOI) of the face in IVR was studied by gender and age. This work in healthy people was conducted to assess the future usefulness of IVR in patients with deficits in the recognition of facial expressions. Seventy-four healthy volunteers participated in the study. The materials used were a laptop computer, a game controller, and a head-mounted display. Dynamic virtual faces randomly representing the six basic emotions plus a neutral expression were used as stimuli. After the virtual human represented an emotion, a response panel was displayed with the seven possible options. Besides storing the hits and misses, the software internally divided the faces into different AOIs and recorded how long participants looked at each AOI. Regarding overall response accuracy, hits decreased from the youngest to the middle-aged and older adults. All three groups spent the highest percentage of time looking at the eyes, with younger adults showing the highest percentage, and attention to the face compared with the background decreased with age. Moreover, hits for women and men were remarkably similar, with no statistically significant differences between them. In general, men paid more attention to the eyes than women, whereas women paid more attention to the forehead and mouth. In contrast to previous work, our study indicates that there are no differences between men and women in facial emotion recognition. In line with previous work, the percentage of face-viewing time was higher for younger than for older adults; however, contrary to earlier studies, older adults looked more at the eyes than at the mouth. Consistent with other studies, the eyes were the AOI with the highest percentage of viewing time. For men, the eyes were the most-viewed AOI for all emotions, in both hits and misses. Women looked more at the eyes for all emotions on hits, except for joy, fear, and anger; on misses, they looked more at the eyes for almost all emotions except surprise and fear.
Affiliation(s)
- Luz M González-Gualda: Servicio de Salud de Castilla-La Mancha, Complejo Hospitalario Universitario de Albacete, Servicio de Salud Mental, 02004, Albacete, Spain
- Miguel A Vicente-Querol: Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Arturo S García: Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain; Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- José P Molina: Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain; Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- José M Latorre: Departamento de Psicología, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- Patricia Fernández-Sotos: Servicio de Salud de Castilla-La Mancha, Complejo Hospitalario Universitario de Albacete, Servicio de Salud Mental, 02004, Albacete, Spain; CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III), 28016, Madrid, Spain
- Antonio Fernández-Caballero: Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain; Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain; CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III), 28016, Madrid, Spain
6. Chen Y, Yang X, Howman H, Filik R. Individual differences in emoji comprehension: Gender, age, and culture. PLoS One 2024; 19:e0297379. PMID: 38354159; PMCID: PMC10866486; DOI: 10.1371/journal.pone.0297379.
Abstract
Emoji are an important substitute for non-verbal cues (such as facial expressions) in online written communication. So far, however, little is known about individual differences in how they are perceived. In the current study, we examined the influence of gender, age, and culture on emoji comprehension. Specifically, a sample of 523 participants across the UK and China completed an emoji classification task, in which they were presented with a series of emoji, each representing one of six facial emotional expressions, across four commonly used platforms (Apple, Android, WeChat, and Windows). Their task was to choose which of six labels (happy, sad, angry, surprised, fearful, disgusted) represented the emotion conveyed by each emoji. Results showed that all three factors (age, gender, and culture) had a significant impact on how emoji were classified. This has important implications for emoji use, for example, in conversations with partners from different cultures.
Affiliation(s)
- Yihua Chen, Xingchen Yang, Hannah Howman, Ruth Filik: School of Psychology, University of Nottingham, University Park, Nottingham, United Kingdom
7. Özcivelek T, Basmacı F, Turgut B, Akbulut K, Kılıçarslan MA. Perception of color mismatch or conspicuous marginal adaptation in extraoral prostheses with eye-tracking. J Prosthet Dent 2024; 131:332-339. PMID: 38161076; DOI: 10.1016/j.prosdent.2023.11.021.
Abstract
STATEMENT OF PROBLEM Color matching and marginal integrity are major challenges when providing extraoral maxillofacial prostheses. Whether color harmony or marginal harmony is more important for making extraoral prostheses inconspicuous is unclear, and studies on the perception of these prostheses with objective evaluation criteria are lacking. PURPOSE The purpose of this observational study was to investigate the significance of color mismatch and conspicuous marginal adaptation in the perception of extraoral maxillofacial prostheses using eye-tracking technology. The secondary aim was to evaluate the perception of extraoral maxillofacial prostheses with regard to the observers' sex. MATERIAL AND METHODS Twenty-seven face images in 3 groups, representing well-fitting orbital prostheses with a color mismatch (IC), prostheses with a good color match but distinct marginal adaptation (IM), and symmetrical face images (SI), were viewed for 5 seconds by 52 laypeople. Time to first fixation (TFF), fixation duration (FD), and fixation count (FC) at defined areas of interest were recorded and analyzed with an eye-tracking device. Because of the nested structure of the data, a sex- and age-adjusted random intercept linear mixed effects model was used to assess the differences among IC, IM, and SI. Bonferroni-corrected P values were used for pairwise comparisons. The difference between observers' sexes was evaluated with a random intercept mixed model adjusted for age for each image. For repeated measures analysis, the lme4, lmerTest, and emmeans libraries in R version 4.3.1 (R Foundation for Statistical Computing) were used (α=.05 for all tests). RESULTS Significant differences were found between the symmetrical image group and the other study groups at the facial prosthesis region in all parameters (each P<.001). Observers first focused on the facial prostheses in IC (0.72 seconds) and in IM (0.789 seconds). Longer fixation durations (1.909 and 1.989 seconds for IC and IM; PIC<.001, PIM<.001) and higher fixation counts (5.28 for IC, 5.45 for IM; both P<.001) were recorded on facial prostheses compared with other areas of interest. Women focused on the prosthesis more than men in the IC and IM groups considering FD (PIC=.003, PIM<.001) and FC values (PIC=.016, PIM<.001, PSI<.001). Fixation duration for women and men was 2.097 and 1.739 seconds in the IC group, 2.219 and 1.78 seconds in the IM group, and 1.364 and 1.222 seconds in the SI group, respectively. CONCLUSIONS Since both the color mismatch and the distinct marginal adaptation of maxillofacial prostheses were recognized using eye-tracking technology, both features appear to be equally significant and should be considered in fabrication procedures.
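A minimal sketch of the model this abstract describes, using the R packages it names (lme4, lmerTest, emmeans). The data frame and its columns (fd, group, observer, sex, age) are simulated placeholders, not the study's data; the structure simply mirrors the described design of repeated observations nested within observers.

```r
# Sketch of a sex- and age-adjusted random-intercept model with Bonferroni-
# corrected group contrasts, as named in the abstract. All data are simulated.
library(lme4)      # mixed-effects models
library(lmerTest)  # p values for lmer fixed effects
library(emmeans)   # adjusted pairwise comparisons

set.seed(1)
n_obs <- 52; n_img <- 27
d <- data.frame(
  observer = factor(rep(1:n_obs, each = n_img)),
  group    = factor(rep(c("IC", "IM", "SI"), length.out = n_obs * n_img)),
  sex      = factor(rep(sample(c("F", "M"), n_obs, replace = TRUE), each = n_img)),
  age      = rep(sample(20:60, n_obs, replace = TRUE), each = n_img),
  fd       = rnorm(n_obs * n_img, mean = 1.8, sd = 0.5)  # fixation duration (s)
)

# Random intercept per observer handles the nested (repeated) structure
m <- lmer(fd ~ group + sex + age + (1 | observer), data = d)
summary(m)

# Bonferroni-corrected pairwise comparisons among IC, IM and SI
emmeans(m, pairwise ~ group, adjust = "bonferroni")
```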
Affiliation(s)
- Tuğgen Özcivelek: Assistant Professor, Department of Prosthodontics, Faculty of Dentistry, Health Sciences University Gulhane, Ankara, Turkey
- Fulya Basmacı: Assistant Professor, Department of Prosthodontics, Faculty of Dentistry, Ankara Yıldırım Beyazıt University, Ankara, Turkey
- Berna Turgut: Researcher, Department of Dentistry, Ankara Memorial Hospital, Ankara, Turkey
- Kuddusi Akbulut: Assistant Professor, Department of Prosthodontics, Faculty of Dentistry, Cappadocia University, Nevşehir, Turkey
- Mehmet Ali Kılıçarslan: Professor, Department of Prosthodontics, Faculty of Dentistry, Ankara University, Ankara, Turkey
8. Asalıoğlu EN, Göksun T. The role of hand gestures in emotion communication: Do type and size of gestures matter? Psychol Res 2023; 87:1880-1898. PMID: 36436110; DOI: 10.1007/s00426-022-01774-9.
Abstract
We communicate emotions in a multimodal way, yet non-verbal emotion communication is a relatively understudied area of research. In three experiments, we investigated the role of gesture characteristics (e.g., type, size in space) in individuals' processing of emotional content. In Experiment 1, participants rated the emotional intensity of emotional narratives from video clips containing either iconic or beat gestures. Participants in the iconic gesture condition rated the emotional intensity higher than participants in the beat gesture condition. In Experiment 2, the size of gestures and its interaction with gesture type were investigated in a within-subjects design, with participants again rating the emotional intensity of emotional narratives from video clips. Although individuals overall rated narrow gestures as more emotionally intense than wider gestures, no effects of gesture type, or of the interaction between gesture size and type, were found. Experiment 3 was conducted to check whether the findings of Experiment 2 were due to gestures appearing in all video clips. We compared gesture and no-gesture (i.e., speech-only) conditions and found no difference between them in emotional ratings; however, we could not replicate the gesture-size findings of Experiment 2. Overall, these findings underscore the importance of examining the role of gesture in emotional contexts and show that gesture characteristics such as size should be considered in nonverbal communication.
Affiliation(s)
- Esma Nur Asalıoğlu: Department of Psychology, Koç University, Rumelifeneri Yolu, Sariyer, 34450, Istanbul, Turkey
- Tilbe Göksun: Department of Psychology, Koç University, Rumelifeneri Yolu, Sariyer, 34450, Istanbul, Turkey
9. Calić G, Glumbić N, Petrović-Lazić M, Đorđević M, Mentus T. Searching for Best Predictors of Paralinguistic Comprehension and Production of Emotions in Communication in Adults With Moderate Intellectual Disability. Front Psychol 2022; 13:884242. PMID: 35880187; PMCID: PMC9308010; DOI: 10.3389/fpsyg.2022.884242.
Abstract
Paralinguistic comprehension and production of emotions in communication include the skills of recognizing and interpreting emotional states with the help of facial expressions, prosody, and intonation. In the relevant scientific literature, these skills are related primarily to receptive language abilities, although some authors have also found correlations with intellectual abilities and acoustic features of the voice. The aim of this study was therefore to investigate which of these variables (receptive language ability, acoustic features of the voice, intellectual ability, socio-demographic characteristics) is the most relevant predictor of paralinguistic comprehension and paralinguistic production of emotions in communication in adults with moderate intellectual disability (MID). The sample included 41 adults with MID, 20-49 years of age (M = 34.34, SD = 7.809), 29 of whom had MID of unknown etiology, while 12 had Down syndrome. All participants were native speakers of Serbian. Two subscales from the Assessment Battery for Communication, Paralinguistic comprehension of emotions in communication and Paralinguistic production of emotions in communication, were used to assess the participants' paralinguistic comprehension and production skills. To measure the assumed predictor variables, the following instruments were used: the Peabody Picture Vocabulary Test for receptive language abilities, the Computerized Speech Lab ("Kay Elemetrics" Corp., model 4300) for acoustic features of the voice, and Raven's Progressive Matrices for intellectual ability. Hierarchical regression analysis was applied to investigate to what extent the proposed variables actually predict paralinguistic comprehension and production of emotions in communication as dependent variables. The results showed that only receptive language skills had statistically significant predictive value for paralinguistic comprehension of emotions (β = 0.468, t = 2.236, p < 0.05), while the factor related to voice frequency and interruptions, from the domain of acoustic voice characteristics, had predictive value for paralinguistic production of emotions (β = 0.280, t = 2.076, p < 0.05). Consequently, this study provides evidence that, in the adult population with MID, voice and language are more important than intellectual abilities for understanding and producing emotions.
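A hedged sketch of a hierarchical regression of the kind described, again in R with simulated data; the predictor and outcome names below are illustrative stand-ins for the study's measures, not its actual variables.

```r
# Sketch: hierarchical regression entering predictor blocks in steps and
# comparing model fit across steps. All variable names are placeholders.
set.seed(1)
n <- 41
d <- data.frame(
  comprehension = rnorm(n),                        # paralinguistic comprehension score
  receptive     = rnorm(n),                        # receptive language (e.g., PPVT)
  voice_f0      = rnorm(n),                        # acoustic factor (frequency/interruptions)
  iq            = rnorm(n),                        # nonverbal ability (e.g., Raven's)
  age           = sample(20:49, n, replace = TRUE) # socio-demographic block
)

m1 <- lm(comprehension ~ age, data = d)         # step 1: demographics
m2 <- update(m1, . ~ . + iq)                    # step 2: + intellectual ability
m3 <- update(m2, . ~ . + voice_f0 + receptive)  # step 3: + voice and language

anova(m1, m2, m3)  # incremental F tests across steps
summary(m3)        # coefficients for the full model
```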
10. Proverbio AM, Cerri A. The Recognition of Facial Expressions Under Surgical Masks: The Primacy of Anger. Front Neurosci 2022; 16:864490. PMID: 35784837; PMCID: PMC9243392; DOI: 10.3389/fnins.2022.864490.
Abstract
Background The need to wear surgical masks in everyday life has drawn the attention of psychologists to the negative effects of face covering on social processing. A recent but not homogeneous literature has highlighted large costs in the ability to recognize emotions. Methods Here we investigated how mask covering impaired the recognition of facial expressions in a large group of 220 undergraduate students. Sex differences in emotion recognition were also analyzed in two subgroups of 94 age-matched participants. Subjects were presented with 112 pictures displaying the faces of eight actors (4 women and 4 men), wearing or not wearing real face masks, and expressing seven emotional states (neutrality, surprise, happiness, sadness, disgust, anger, and fear). The task consisted of categorizing facial expressions while indicating the recognizability of each emotion on a 3-point Likert scale. Scores were analyzed with repeated-measures ANOVAs. Results Overall, face masking reduced emotion recognition by 31%. All emotions were affected by mask covering except for anger. Face covering was most detrimental to sadness and disgust, both of which rely on mouth and nose expressiveness. Women showed better performance for subtle expressions such as surprise and sadness, in both masked and natural conditions, and men for fear recognition (in natural, but especially masked, conditions). Conclusion Anger displays were unaffected by masking, partly because the corrugated forehead and frowning eyebrows remained clearly exposed. Overall, facial masking seems to polarize non-verbal communication toward the happiness/anger dimension, while minimizing emotions that stimulate an empathic response in the observer.
11. Low ACY, Oh VYS, Tong EMW, Scarf D, Ruffman T. Older adults have difficulty decoding emotions from the eyes, whereas easterners have difficulty decoding emotion from the mouth. Sci Rep 2022; 12:7408. PMID: 35524152; PMCID: PMC9076610; DOI: 10.1038/s41598-022-11381-8.
Abstract
Older adults and Easterners have worse emotion recognition than young adults and Westerners, respectively, but the question of why remains unanswered. Older adults look less at the eyes, whereas Easterners look less at the mouth, raising the possibility that compelling older adults to look at the eyes, and Easterners to look at the mouth, might improve recognition. We did this by comparing emotion recognition in 108 young adults and 109 older adults from New Zealand and Singapore viewing (a) the eyes on their own, (b) the mouth on its own, or (c) the full face. Older adults were worse than young adults on 4/6 emotions with the Eyes Only stimuli, but only 1/6 emotions with the Mouth Only stimuli. In contrast, Easterners were worse than Westerners on 6/6 emotions for the Mouth Only and Full Face stimuli, but equal on all six emotions for the Eyes Only stimuli. These results provide a substantial leap forward because they point to the precise difficulty for older adults and Easterners: older adults have more consistent difficulty identifying individual emotions from the eyes than from the mouth, likely due to declining brain functioning, whereas Easterners have more consistent difficulty identifying emotions from the mouth than from the eyes, likely due to inexperience in inferring mouth information.
Affiliation(s)
- Anna C Y Low: Department of Psychology, University of Otago, P.O. Box 56, Dunedin, 9054, New Zealand
- Vincent Y S Oh: Department of Psychology, National University of Singapore, Block AS4, Level 2, 9 Arts Link, Singapore, 117570, Singapore
- Eddie M W Tong: Department of Psychology, National University of Singapore, Block AS4, Level 2, 9 Arts Link, Singapore, 117570, Singapore
- Damian Scarf: Department of Psychology, University of Otago, P.O. Box 56, Dunedin, 9054, New Zealand
- Ted Ruffman: Department of Psychology, University of Otago, P.O. Box 56, Dunedin, 9054, New Zealand
12. Sarauskyte L, Monciunskaite R, Griksiene R. The role of sex and emotion on emotion perception in artificial faces: An ERP study. Brain Cogn 2022; 159:105860. PMID: 35339916; DOI: 10.1016/j.bandc.2022.105860.
Abstract
Sex has a significant impact on the perception of emotional expressions. However, it remains unclear whether sex influences the perception of emotions in artificial faces, which are becoming popular in emotion research. We used an emotion recognition task with FaceGen faces portraying the six basic emotions to investigate the effects of sex and emotion on behavioural and electrophysiological parameters. Seventy-one participants performed the task while EEG was recorded. Recognition of sadness was the poorest overall; however, females recognized sadness better than males. ERP results indicated that fear, disgust, and anger evoked higher late positive potential amplitudes over the left parietal region compared with the neutral expression. Females demonstrated higher values of global field power than males. The interaction between sex and emotion on ERPs was not significant. The results of our study may be valuable for future therapies and research, as they highlight possibly distinct processing of emotions and potential sex differences in the recognition of emotional expressions in FaceGen faces.
Affiliation(s)
- Livija Sarauskyte, Rasa Monciunskaite, Ramune Griksiene: Vilnius University, Life Sciences Center, Institute of Biosciences, Vilnius, Lithuania
13. Ramachandra V, Longacre H. Unmasking the psychology of recognizing emotions of people wearing masks: The role of empathizing, systemizing, and autistic traits. Pers Individ Dif 2022; 185:111249. DOI: 10.1016/j.paid.2021.111249.
14. Age and gender effects on the human’s ability to decode posed and naturalistic emotional faces. Pattern Anal Appl 2022. DOI: 10.1007/s10044-021-01049-w.
15. Duran N, Atkinson AP. Foveal processing of emotion-informative facial features. PLoS One 2021; 16:e0260814. PMID: 34855898; PMCID: PMC8638924; DOI: 10.1371/journal.pone.0260814.
Abstract
Certain facial features provide useful information for recognition of facial expressions. In two experiments, we investigated whether foveating informative features of briefly presented expressions improves recognition accuracy and whether these features are targeted reflexively when not foveated. Angry, fearful, surprised, and sad or disgusted expressions were presented briefly at locations which would ensure foveation of specific features. Foveating the mouth of fearful, surprised and disgusted expressions improved emotion recognition compared to foveating an eye or cheek or the central brow. Foveating the brow led to equivocal results in anger recognition across the two experiments, which might be due to the different combination of emotions used. There was no consistent evidence suggesting that reflexive first saccades targeted emotion-relevant features; instead, they targeted the closest feature to initial fixation. In a third experiment, angry, fearful, surprised and disgusted expressions were presented for 5 seconds. Duration of task-related fixations in the eyes, brow, nose and mouth regions was modulated by the presented expression. Moreover, longer fixation at the mouth positively correlated with anger and disgust accuracy both when these expressions were freely viewed (Experiment 2b) and when briefly presented at the mouth (Experiment 2a). Finally, an overall preference to fixate the mouth across all expressions correlated positively with anger and disgust accuracy. These findings suggest that foveal processing of informative features is functional/contributory to emotion recognition, but they are not automatically sought out when not foveated, and that facial emotion recognition performance is related to idiosyncratic gaze behaviour.
Affiliation(s)
- Nazire Duran: Department of Psychology, Durham University, Durham, United Kingdom
- Anthony P. Atkinson: Department of Psychology, Durham University, Durham, United Kingdom
16. González-Alcaide G, Fernández-Ríos M, Redolat R, Serra E. Research on Emotion Recognition and Dementias: Foundations and Prospects. J Alzheimers Dis 2021; 82:939-950. PMID: 34120903; DOI: 10.3233/jad-210096.
Abstract
BACKGROUND The study of emotion recognition could be crucial for detecting alterations in certain cognitive areas or as an early sign of neurological disorders. OBJECTIVE The main objective of the study is to characterize the development of research on emotion recognition, identifying the intellectual structure that supports this area of knowledge and the main lines of research attracting investigators' interest. METHODS We identified publications on emotion recognition and dementia included in the Web of Science Core Collection, analyzing the scientific output and the main disciplines involved in generating knowledge in the area. A co-citation analysis and an analysis of the bibliographic coupling between the retrieved documents elucidated the thematic orientations of the research and the reference works that constitute the foundation for development in the field. RESULTS A total of 345 documents, with 24,282 bibliographic references between them, were included. This is an emerging research area, attracting the interest of investigators in Neurosciences, Psychology, Clinical Neurology, and Psychiatry, among other disciplines. Four prominent topic areas were identified, linked to frontotemporal dementia, autism spectrum disorders, Alzheimer's disease, and Parkinson's and Huntington's diseases. Many recent papers focus on the detection of mild cognitive impairment. CONCLUSION Impaired emotion recognition may be a key sign facilitating the diagnosis and early treatment of different neurodegenerative diseases, as well as a trigger for the necessary provision of social and family support, explaining the growing research interest in this area.
Affiliation(s)
- Mercedes Fernández-Ríos: Departamento de Psicología Evolutiva y de la Educación, Universitat de Valencia, Valencia, Spain; Asociación Familiares Alzheimer Valencia (AFAV), Valencia, Spain
- Rosa Redolat: Departamento de Psicobiología, Universitat de Valencia, Valencia, Spain
- Emilia Serra: Departamento de Psicología Evolutiva y de la Educación, Universitat de Valencia, Valencia, Spain
17. Giannouli V, Yordanova J, Kolev V. The Primacy of Beauty in Music, Visual Arts and Literature: Not Just a Replication Study in the Greek Language Exploring the Effects of Verbal Fluency, Age and Gender. Psychol Rep 2021; 125:2636-2663. PMID: 34148455; PMCID: PMC9483706; DOI: 10.1177/00332941211026836.
Abstract
Research on aesthetic descriptors of art in different languages is scarce. The aim of the present study was to elucidate the conceptual structure of aesthetic experiences of three forms of art (music, visual arts and literature) in the Greek language, which had not been explored so far. A further aim was to study whether biological and cognitive factors such as age and gender might produce differences in art appreciation. A total of 467 younger and older individuals from Greece were asked to generate verbal descriptors (adjectives) in free word-listing conditions in order to collect terms reflecting the aesthetics-related semantic field of art. The capacity of verbal memory was controlled by using a battery of neuropsychological tests. Analysis of the generated adjectives’ frequency and salience revealed that ‘beautiful’ was the most prominent descriptor, selected with a distinctive primacy for all three forms of art. The primacy of ‘beautiful’ was significantly more pronounced for visual arts relative to music and literature. Although the aging-related decline of verbal capacity was similar for males and females, the primacy of ‘beautiful’ depended on age and gender, being more emphasized for young females than males, and for old males than females. Analysis of secondary descriptors and pairs of adjectives revealed that affective and hedonic experiences are essentially fixed in the semantic field of art reflection. It is concluded that although the concept of aesthetics seems to be diversified and rich, a clear primacy of beauty is found in the Greek cultural environment and across different forms of art. The results also highlight the presence of complex influences of biological and cognitive factors on aesthetic art experiences.
Affiliation(s)
- Vaitsa Giannouli, Juliana Yordanova, Vasil Kolev: Bulgarian Academy of Sciences, Institute of Neurobiology, Sofia, Bulgaria
18. Gehrer NA, Zajenkowska A, Bodecka M, Schönenberg M. Attention orienting to the eyes in violent female and male offenders: An eye-tracking study. Biol Psychol 2021; 163:108136. PMID: 34129874; DOI: 10.1016/j.biopsycho.2021.108136.
Abstract
Attention to the eyes and eye contact form an important basis for the development of empathy and social competences including prosocial behavior. Thus, impairments in attention to the eyes of an interaction partner might play a role in the etiology of antisocial behavior and violence. For the first time, the present study extends investigations of eye gaze to a large sample (N = 173) including not only male but also female violent offenders and a control group. We assessed viewing patterns during the categorization of emotional faces via eye tracking. Our results indicate a reduced frequency of initial attention shifts to the eyes in female and male offenders compared to controls, while there were no general group differences in overall attention to the eye region (i.e., relative dwell time). Thus, we conclude that violent offenders might be able to compensate for deficits in spontaneous attention orienting during later stages of information processing.
Affiliation(s)
- Nina A Gehrer: University of Tübingen, Department of Clinical Psychology and Psychotherapy, Tübingen, Germany
- Anna Zajenkowska: Maria Grzegorzewska University, Department of Psychology, Warsaw, Poland
- Marta Bodecka: Maria Grzegorzewska University, Department of Psychology, Warsaw, Poland
- Michael Schönenberg: University of Tübingen, Department of Clinical Psychology and Psychotherapy, Tübingen, Germany; University Hospital Tübingen, Department of Psychiatry and Psychotherapy, Tübingen, Germany
19. Cortes DS, Tornberg C, Bänziger T, Elfenbein HA, Fischer H, Laukka P. Effects of aging on emotion recognition from dynamic multimodal expressions and vocalizations. Sci Rep 2021; 11:2647. PMID: 33514829; PMCID: PMC7846600; DOI: 10.1038/s41598-021-82135-1.
Abstract
Age-related differences in emotion recognition have predominantly been investigated using static pictures of facial expressions, and positive emotions beyond happiness have rarely been included. The current study instead used dynamic facial and vocal stimuli, and included a wider than usual range of positive emotions. In Task 1, younger and older adults were tested for their abilities to recognize 12 emotions from brief video recordings presented in visual, auditory, and multimodal blocks. Task 2 assessed recognition of 18 emotions conveyed by non-linguistic vocalizations (e.g., laughter, sobs, and sighs). Results from both tasks showed that younger adults had significantly higher overall recognition rates than older adults. In Task 1, significant group differences (younger > older) were only observed for the auditory block (across all emotions), and for expressions of anger, irritation, and relief (across all presentation blocks). In Task 2, significant group differences were observed for 6 out of 9 positive, and 8 out of 9 negative emotions. Overall, results indicate that recognition of both positive and negative emotions show age-related differences. This suggests that the age-related positivity effect in emotion recognition may become less evident when dynamic emotional stimuli are used and happiness is not the only positive emotion under study.
Affiliation(s)
- Diana S Cortes: Department of Psychology, Stockholm University, Stockholm, Sweden
- Tanja Bänziger: Department of Psychology, Mid Sweden University, Östersund, Sweden
- Håkan Fischer: Department of Psychology, Stockholm University, Stockholm, Sweden
- Petri Laukka: Department of Psychology, Stockholm University, Stockholm, Sweden
20. Correia-Caeiro C, Guo K, Mills D. Bodily emotional expressions are a primary source of information for dogs, but not for humans. Anim Cogn 2021; 24:267-279. PMID: 33507407; PMCID: PMC8035094; DOI: 10.1007/s10071-021-01471-x.
Abstract
Dogs have remarkable abilities to synergise their behaviour with that of people, but how dogs read facial and bodily emotional cues in comparison to humans remains unclear. Both species share the same ecological niche, are highly social and expressive, making them an ideal comparative model for intra- and inter-species emotion perception. We compared eye-tracking data from unrestrained humans and dogs when viewing dynamic and naturalistic emotional expressions in humans and dogs. Dogs attended more to the body than the head of human and dog figures, unlike humans who focused more on the head of both species. Dogs and humans also showed a clear age effect that reduced head gaze. Our results indicate a species-specific evolutionary adaptation for emotion perception, which is only partly modified for heterospecific cues. These results have important implications for managing the risk associated with human-dog interactions, where expressive and perceptual differences are crucial.
Affiliation(s)
- Catia Correia-Caeiro: School of Psychology, University of Lincoln, Lincoln, UK; School of Life Sciences, University of Lincoln, Lincoln, UK; Primate Research Institute, Kyoto University, Inuyama, Japan
- Kun Guo: School of Psychology, University of Lincoln, Lincoln, UK
- Daniel Mills: School of Life Sciences, University of Lincoln, Lincoln, UK
21. Validation of dynamic virtual faces for facial affect recognition. PLoS One 2021; 16:e0246001. PMID: 33493234; PMCID: PMC7833130; DOI: 10.1371/journal.pone.0246001.
Abstract
The ability to recognise facial emotions is essential for successful social interaction. The most common stimuli used when evaluating this ability are photographs. Although these stimuli have proved to be valid, they do not offer the level of realism that virtual humans have achieved. The objective of the present paper was the validation of a new set of dynamic virtual faces (DVFs) that mimic the six basic emotions plus the neutral expression. The faces were prepared to be observed with low and high dynamism, and from front and side views. For this purpose, 204 healthy participants, stratified by gender, age and education level, were recruited to assess their facial affect recognition with the set of DVFs. The accuracy of responses was compared with the already validated Penn Emotion Recognition Test (ER-40). The overall accuracy in the identification of emotions was higher for the DVFs (88.25%) than for the ER-40 faces (82.60%). The percentage of hits for each DVF emotion was high, especially for the neutral expression and the happiness emotion. No statistically significant differences were found regarding gender, nor between younger adults and adults over 60 years. Moreover, hits increased for avatar faces showing greater dynamism, and for front views of the DVFs compared with their profile presentations. It is concluded that DVFs are as valid as standardised natural faces for accurately recreating human-like facial expressions of emotions.
22. Ferreira BLC, Fabrício DDM, Chagas MHN. Are facial emotion recognition tasks adequate for assessing social cognition in older people? A review of the literature. Arch Gerontol Geriatr 2020; 92:104277. PMID: 33091714; DOI: 10.1016/j.archger.2020.104277.
Abstract
OBJECTIVE Facial emotion recognition (FER) is a component of social cognition and important for interpersonal relations, and tasks have been developed to assess this skill in different populations. Even healthy older adults, however, tend to perform below the correct-response rates commonly used to validate such tasks. The objective was to perform a systematic review of studies addressing the performance of healthy older adults on FER tasks, compared with the 70% correct-response rate commonly used for the creation of stimulus banks. MATERIAL AND METHODS Searches were conducted up to May 2019 in the PubMed, PsycInfo, Web of Science, and Scopus databases using the keywords ("faces" OR "facial") AND ("recognition" OR "expression" OR "emotional") AND ("elderly" OR "older adults"). RESULTS Twenty-seven articles were included in the present review. In 16 studies (59.2%), older people had correct-response rates on FER lower than 70% for at least one of the emotions evaluated. Among the studies that evaluated each emotion specifically, 62.5% found correct-response rates lower than 70% for fear, 50% for surprise, 50% for sadness, 37.5% for anger, 21.4% for disgust, and 5.9% for happiness. Moreover, studies that evaluated the intensity of the emotions demonstrated lower rates of correct responses when the intensity of the facial expression was low. CONCLUSION Studies employ methods and facial stimuli that may not be adequate for measuring FER in older people. It is therefore important to create adequate tasks for assessing this skill in this population.
Affiliation(s)
- Bianca Letícia C Ferreira: Department of Neurosciences and Behavioral Sciences, Universidade de São Paulo, Ribeirão Preto, SP, Brazil; Research Group on Mental Health, Cognition and Aging, Federal University of São Carlos, São Carlos, SP, Brazil
- Daiene de Morais Fabrício: Research Group on Mental Health, Cognition and Aging, Federal University of São Carlos, São Carlos, SP, Brazil
- Marcos Hortes N Chagas: Department of Neurosciences and Behavioral Sciences, Universidade de São Paulo, Ribeirão Preto, SP, Brazil; Research Group on Mental Health, Cognition and Aging, Federal University of São Carlos, São Carlos, SP, Brazil; Bairral Institute of Psychiatry, Itapira, SP, Brazil
23. Tamm G, Kreegipuu K, Harro J. Updating facial emotional expressions in working memory: Differentiating trait anxiety and depressiveness. Acta Psychol (Amst) 2020; 209:103117. PMID: 32603911; DOI: 10.1016/j.actpsy.2020.103117.
Abstract
Individual differences in updating emotional facial expressions in working memory are not fully understood. Here we focused on the effects of high trait anxiety and high depressiveness in men and women on updating schematic emotional facial expressions (sad, angry, scheming, happy, neutral). A population-representative sample of young adults was divided into four emotional disposition groups based on STAI-T and MADRS cut-offs: high anxiety (HA, n = 41), high depressiveness (HD, n = 31), high depressiveness and high anxiety (HAHD, n = 65), and control (CT, n = 155). Participants completed a 2-back task with schematic emotional faces, along with valence/arousal ratings and verbal recognition tasks. A novel approach was used to separate encoding from retrieval. We found an interaction between emotional dispositions and emotional faces in updating accuracy: the HD group made more errors than the HA group when encoding happy schematic faces. Other differences between emotional dispositions on updating measures were found, but they were not specific to any emotional facial expression. Our findings suggest a minor disadvantage for happy faces in HD, in contrast to HA, visible in lower accuracy for the visual encoding of happy faces, but not in retrieval accuracy, the speed of updating, or the perception of emotional content in happy faces. These findings help to explain differences and similarities between high trait anxiety and high depressiveness in working memory and the processing of facial expressions. The results are discussed in relation to prevalent theories of information processing in anxiety and depression.
24. Abbruzzese L, Magnani N, Robertson IH, Mancuso M. Age and Gender Differences in Emotion Recognition. Front Psychol 2019; 10:2371. PMID: 31708832; PMCID: PMC6819430; DOI: 10.3389/fpsyg.2019.02371.
Abstract
Background Existing literature suggests that age affects the recognition of affective facial expressions. Eye-tracking studies have highlighted that age-related differences in the recognition of emotions could be explained by different face exploration patterns due to attentional impairment. Gender also seems to play a role in the recognition of emotions. Unfortunately, little is known about differences in emotion perception abilities between men and women across the lifespan, even though females show greater ability from infancy. Objective The present study aimed to examine the role of age and gender in facial emotion recognition in relation to neuropsychological functions and face exploration strategies. We also aimed to explore the associations between emotion recognition and quality of life. Methods 60 healthy people were consecutively enrolled in the study and divided into two groups: younger adults and older adults. Participants were assessed for emotion recognition, attention abilities, frontal functioning, memory functioning, and quality of life satisfaction. During the execution of the emotion recognition test, using the Pictures of Facial Affect (PoFA) and a modified version of the PoFA (M-PoFA), participants' eye movements were recorded with an eye tracker. Results Significant differences between younger and older adults were detected for fear recognition when adjusted for cognitive functioning and eye-gaze fixation characteristics, with adjusted means of fear recognition significantly higher in the younger group than in the older group. With regard to gender effects, older females recognized identical pairs of emotions better than older males. Considering the Satisfaction Profile (SAT-P), we detected negative correlations between some dimensions (physical functioning; sleep/feeding/free time) and emotion recognition (i.e., sadness and disgust). Conclusion The current study provided novel insights into the specific mechanisms that may explain differences in emotion recognition, examining how age and gender differences can be outlined by cognitive functioning and face exploration strategies.
Collapse
Affiliation(s)
- Nadia Magnani
- Adult Mental Health Service, NHS-USL Tuscany South-Est, Grosseto, Italy
- Ian H Robertson
- Global Brain Health Institute, Trinity College Institute of Neuroscience, Trinity College Dublin, The University of Dublin, Dublin, Ireland
- Mauro Mancuso
- Tuscany Rehabilitation Clinic, Montevarchi, Italy
- Physical and Rehabilitative Medicine Unit, NHS-USL Tuscany South-Est, Grosseto, Italy
Collapse
|
25
|
Impaired attention toward the eyes in psychopathic offenders: Evidence from an eye tracking study. Behav Res Ther 2019; 118:121-129. [DOI: 10.1016/j.brat.2019.04.009] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/25/2018] [Revised: 03/20/2019] [Accepted: 04/22/2019] [Indexed: 11/19/2022]
|
26
|
Gurera JW, Isaacowitz DM. Emotion regulation and emotion perception in aging: A perspective on age-related differences and similarities. Prog Brain Res 2019; 247:329-351. [DOI: 10.1016/bs.pbr.2019.02.007] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/03/2022]
|
27
|
Tu YZ, Lin DW, Suzuki A, Goh JOS. East Asian Young and Older Adult Perceptions of Emotional Faces From an Age- and Sex-Fair East Asian Facial Expression Database. Front Psychol 2018; 9:2358. [PMID: 30555382 PMCID: PMC6281963 DOI: 10.3389/fpsyg.2018.02358] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/01/2018] [Accepted: 11/10/2018] [Indexed: 11/21/2022] Open
Abstract
There is increasing interest in clarifying how different face emotion expressions are perceived by people from different cultures, of different ages, and of different sexes. However, the scant availability of well-controlled emotional face stimuli from non-Western populations limits the evaluation of cultural differences in face emotion perception and of how these might be modulated by age and sex differences. We present a database of East Asian facial expression stimuli, enacted by young and older, male and female Taiwanese adults using the Facial Action Coding System (FACS). Combined with a prior database, the present database consists of 90 identities with happy, sad, angry, fearful, disgusted, surprised, and neutral expressions, amounting to 628 photographs. Twenty young and 24 older East Asian raters scored the photographs for intensities of multiple dimensions of emotion and induced affect. Multivariate analyses characterized the dimensionality of perceived emotions and quantified the effects of age and sex. We also applied commercial software to extract computer-based metrics of the emotions in the photographs. Taiwanese raters perceived happy faces as one category; sad, angry, and disgusted expressions as one category; and fearful and surprised expressions as one category. Younger females were more sensitive to face emotions than younger males. Whereas older males showed reduced face emotion sensitivity, older females' sensitivity was similar to, or accentuated relative to, that of young females. The commercial software dissociated the six emotions according to the FACS, demonstrating that defining visual features were present. Our findings show that East Asians perceive a different dimensionality of emotions than the Western-based definitions used in face recognition software, regardless of age and sex. Critically, stimuli with detailed cultural norms are indispensable for interpreting neural and behavioral responses involving human facial expression processing. To this end, we add to the tools, available upon request, for conducting such research.
Collapse
Affiliation(s)
- Yu-Zhen Tu
- Graduate Institute of Brain and Mind Sciences, College of Medicine, National Taiwan University, Taipei, Taiwan
- Dong-Wei Lin
- Graduate Institute of Brain and Mind Sciences, College of Medicine, National Taiwan University, Taipei, Taiwan
- Atsunobu Suzuki
- Department of Psychology, Graduate School of Humanities and Sociology, The University of Tokyo, Tokyo, Japan
- Joshua Oon Soo Goh
- Graduate Institute of Brain and Mind Sciences, College of Medicine, National Taiwan University, Taipei, Taiwan
- Department of Psychology, College of Science, National Taiwan University, Taipei, Taiwan
- Neurobiological and Cognitive Science Center, National Taiwan University, Taipei, Taiwan
- Center for Artificial Intelligence and Advanced Robotics, National Taiwan University, Taipei, Taiwan
Collapse
|
28
|
Grainger SA, Steinvik HR, Henry JD, Phillips LH. The role of social attention in older adults’ ability to interpret naturalistic social scenes. Q J Exp Psychol (Hove) 2018; 72:1328-1343. [DOI: 10.1177/1747021818791774] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Age-related differences on theory of mind (ToM) tasks are well established. However, the literature has been criticised for predominantly relying on tasks with poor ecological validity, and consequently it remains unclear whether these age differences extend to tasks with greater realism. In addition, we currently have a limited understanding of the factors that may contribute to age-related declines in ToM. To address these issues, we conducted two studies that assessed age differences in ToM using multimodal social scene stimuli. Study 1 also examined eye movements to assess whether biases in visual attention may be related to age-related difficulties in ToM, and Study 2 included assessments of social attention (as indexed by biological motion perception) and working memory to test whether these capacities may explain age-related difficulties in ToM. In both studies, the results showed that older adults performed worse than their younger counterparts on the ToM tasks, indicating that age-related difficulties in ToM extend to measures that more closely represent everyday social interactions. The eye-tracking data in Study 1 showed that older adults gazed less at the faces of protagonists in the social scenes compared with younger adults; however, these visual biases were not associated with ToM ability. Study 2 showed that older age was associated with a reduced ability to detect biological motion cues, and this mediated age-related variance in ToM ability. These findings are discussed in relation to competing theoretical frameworks of ageing that predict either improvements or declines in ToM with age.
Collapse
Affiliation(s)
- Sarah A Grainger
- School of Psychology, The University of Queensland, St Lucia, QLD, Australia
- Julie D Henry
- School of Psychology, The University of Queensland, St Lucia, QLD, Australia
Collapse
|
29
|
Gonçalves AR, Fernandes C, Pasion R, Ferreira-Santos F, Barbosa F, Marques-Teixeira J. Effects of age on the identification of emotions in facial expressions: a meta-analysis. PeerJ 2018; 6:e5278. [PMID: 30065878 PMCID: PMC6064197 DOI: 10.7717/peerj.5278] [Citation(s) in RCA: 53] [Impact Index Per Article: 7.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/05/2018] [Accepted: 06/27/2018] [Indexed: 11/20/2022] Open
Abstract
Background Emotion identification is a fundamental component of social cognition. Although it is well established that a general cognitive decline occurs with advancing age, the effects of age on emotion identification are still unclear. A meta-analysis by Ruffman and colleagues (2008) explored this issue, but much research has been published since then, reporting inconsistent findings. Methods To examine age differences in the identification of facial expressions of emotion, we conducted a meta-analysis of 24 empirical studies (N = 1,033 older adults, N = 1,135 younger adults) published after 2008. Additionally, a meta-regression analysis was conducted to identify potential moderators. Results Results show that older adults identify facial expressions of anger, sadness, fear, surprise, and happiness less accurately than younger adults, strengthening the results obtained by Ruffman et al. (2008). However, meta-regression analyses indicate that effect sizes are moderated by sample characteristics and stimulus features. Importantly, the estimated effect size for the identification of fear and disgust increased with larger differences in the number of years of formal education between the two groups. Discussion We discuss several factors that might explain the age-related differences in emotion identification and suggest how brain changes may account for the observed pattern. Furthermore, moderator effects are interpreted and discussed.
Collapse
Affiliation(s)
- Ana R Gonçalves
- Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, Universidade do Porto, Porto, Portugal
- Carina Fernandes
- Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, Universidade do Porto, Porto, Portugal
- Faculty of Medicine, Universidade do Porto, Porto, Portugal
- Language Research Laboratory, Institute of Molecular Medicine, Faculty of Medicine, Universidade de Lisboa, Lisboa, Portugal
- Rita Pasion
- Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, Universidade do Porto, Porto, Portugal
- Fernando Ferreira-Santos
- Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, Universidade do Porto, Porto, Portugal
- Fernando Barbosa
- Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, Universidade do Porto, Porto, Portugal
- João Marques-Teixeira
- Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, Universidade do Porto, Porto, Portugal
Collapse
|
30
|
Olderbak S, Wilhelm O, Hildebrandt A, Quoidbach J. Sex differences in facial emotion perception ability across the lifespan. Cogn Emot 2018; 33:579-588. [DOI: 10.1080/02699931.2018.1454403] [Citation(s) in RCA: 36] [Impact Index Per Article: 5.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/17/2022]
Affiliation(s)
- Sally Olderbak
- Institute for Psychology and Education, Ulm University, Ulm, Germany
- Oliver Wilhelm
- Institute for Psychology and Education, Ulm University, Ulm, Germany
- Andrea Hildebrandt
- Institute for Psychology, Ernst Moritz Arndt Universität Greifswald, Greifswald, Germany
- Jordi Quoidbach
- Department of People Management and Organisation, ESADE Business School, Barcelona, Spain
Collapse
|
31
|
Grainger SA, Henry JD, Steinvik HR, Vanman EJ, Rendell PG, Labuschagne I. Intranasal oxytocin does not reduce age-related difficulties in social cognition. Horm Behav 2018; 99:25-34. [PMID: 29408521 DOI: 10.1016/j.yhbeh.2018.01.009] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 10/11/2017] [Revised: 01/31/2018] [Accepted: 01/31/2018] [Indexed: 10/18/2022]
Abstract
Oxytocin is a neuropeptide that plays a key role in social processing, and several studies suggest that intranasally administered oxytocin may enhance social cognitive abilities and visual attention in healthy and clinical groups. However, very few studies to date have investigated the potential benefits of intranasal oxytocin (iOT) on older adults' social cognitive abilities. This is a surprising omission because, relative to their younger counterparts, older adults exhibit a range of social cognitive difficulties and also show differences in the way they visually attend to social information. We therefore tested the effect of iOT (24 IU) versus a placebo spray on 59 older and 61 younger adults' social cognitive abilities and visual attention, using a double-blind, placebo-controlled, within-groups design. While iOT provided no overall age-related benefit for social cognitive abilities, the key finding to emerge was that iOT improved theory-of-mind (ToM) ability in both age groups when the task had minimal contextual information, but not when the task had enriched contextual information. Interestingly, iOT had gender-specific effects during a ToM task with minimal context. For males in both age groups, iOT reduced gazing at the social aspects of the scenes (i.e., faces and bodies), and for females, iOT eliminated age differences in gaze patterns that were observed in the placebo condition. These effects on eye gaze were not observed in a very similar ToM task that included more enriched contextual information. Overall, these findings highlight the interactive nature of iOT with task-related factors (e.g., context) and are discussed in relation to the social salience hypothesis of oxytocin.
Collapse
|
32
|
Sen A, Isaacowitz D, Schirmer A. Age differences in vocal emotion perception: on the role of speaker age and listener sex. Cogn Emot 2017; 32:1189-1204. [DOI: 10.1080/02699931.2017.1393399] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
Affiliation(s)
- Antarika Sen
- Neurobiology and Aging Programme, National University of Singapore, Singapore, Singapore
- Annett Schirmer
- Department of Psychology, The Chinese University of Hong Kong, Hong Kong, Hong Kong
- The Mind and Brain Institute, The Chinese University of Hong Kong, Hong Kong, Hong Kong
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
Collapse
|
33
|
Effectiveness of a short audiovisual emotion recognition training program in adults. Motiv Emot 2017. [DOI: 10.1007/s11031-017-9631-9] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
|
34
|
Isaacowitz DM, Livingstone KM, Castro VL. Aging and emotions: experience, regulation, and perception. Curr Opin Psychol 2017; 17:79-83. [PMID: 28950978 DOI: 10.1016/j.copsyc.2017.06.013] [Citation(s) in RCA: 30] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/22/2016] [Accepted: 06/25/2017] [Indexed: 01/14/2023]
Abstract
Whereas some theories suggest that emotion-related processes become more positive with age, recent empirical findings on affective experience, emotion regulation, and emotion perception depict a more nuanced picture. Though there is some evidence for positive age trajectories in affective experience, results are mixed for emotion regulation and largely negative for emotion perception. Thus, current findings suggest that the effects of age on emotion vary across different affective domains; age patterns are also influenced by different moderators, including contextual factors and individual differences.
Collapse
|
35
|
Grainger SA, Henry JD, Phillips LH, Vanman EJ, Allen R. Age Deficits in Facial Affect Recognition: The Influence of Dynamic Cues. J Gerontol B Psychol Sci Soc Sci 2015; 72:622-632. [DOI: 10.1093/geronb/gbv100] [Citation(s) in RCA: 28] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/17/2015] [Accepted: 10/12/2015] [Indexed: 11/14/2022] Open
|