1
Namba S, Saito A, Sato W. Computational analysis of value learning and value-driven detection of neutral faces by young and older adults. Front Psychol 2024; 15:1281857. PMID: 38845772; PMCID: PMC11153859; DOI: 10.3389/fpsyg.2024.1281857.
Abstract
The rapid detection of neutral faces with emotional value plays an important role in social relationships for both young and older adults. Recent psychological studies have indicated that young adults show efficient value learning for neutral faces and the detection of "value-associated faces," while older adults show slightly different patterns of value learning and value-based detection of neutral faces. However, the mechanisms underlying these processes remain unknown. To investigate this, we applied hierarchical reinforcement learning and diffusion models to a value learning task and value-driven detection task that involved neutral faces; the tasks were completed by young and older adults. The results for the learning task suggested that the sensitivity of learning feedback might decrease with age. In the detection task, the younger adults accumulated information more efficiently than the older adults, and the perceptual time leading to motion onset was shorter in the younger adults. In younger adults only, the reward sensitivity during associative learning might enhance the accumulation of information during a visual search for neutral faces in a rewarded task. These results provide insight into the processing linked to efficient detection of faces associated with emotional values, and the age-related changes therein.
Affiliation(s)
- Shushi Namba
- Psychological Process Team, Guardian Robot Project, RIKEN, Kyoto, Japan
- Department of Psychology, Hiroshima University, Hiroshima, Japan
- Akie Saito
- Psychological Process Team, Guardian Robot Project, RIKEN, Kyoto, Japan
- Wataru Sato
- Psychological Process Team, Guardian Robot Project, RIKEN, Kyoto, Japan
2
Tomberg C, Petagna M, de Selliers de Moranville LA. Horses (Equus caballus) facial micro-expressions: insight into discreet social information. Sci Rep 2023; 13:8625. PMID: 37244937; DOI: 10.1038/s41598-023-35807-z.
Abstract
Facial micro-expressions are facial expressions produced briefly (for less than 500 ms) and involuntarily. They had previously been described only in humans, so we investigated whether micro-expressions could also be expressed by non-human animal species. Using the Equine Facial Action Coding System (EquiFACS), an objective tool based on facial muscle actions, we demonstrated that a non-human species, Equus caballus, expresses facial micro-expressions in a social context. The action units AU17, AD38, and AD1 were selectively modulated as micro-expressions, but not as standard facial expressions (all durations included), in the presence of a human experimenter. As standard facial expressions, these units have been associated with pain or stress, but our results did not support this association for micro-expressions, which may convey other information. As in humans, the neural mechanisms underlying the production of micro-expressions may differ from those underlying standard facial expressions. We found that some micro-expressions could be related to attention and involved in the multisensory processing of the 'fixed attention' observed in horses' high attentional states. Micro-expressions could be used by horses as social information in an interspecies relationship. We hypothesize that facial micro-expressions offer a window onto the transient internal states of the animal and may provide subtle and discreet social signals.
Affiliation(s)
- Claude Tomberg
- Faculty of Medicine, Université Libre de Bruxelles, 808 Route de Lennik, CP 630, 1070 Brussels, Belgium
- Maxime Petagna
- Faculty of Medicine, Université Libre de Bruxelles, 808 Route de Lennik, CP 630, 1070 Brussels, Belgium
3
Collin CA, Chamberland J, LeBlanc M, Ranger A, Boutet I. Effects of emotional expression on face recognition may be accounted for by image similarity. Social Cognition 2022. DOI: 10.1521/soco.2022.40.3.282.
Abstract
We examined the degree to which differences in face recognition rates across emotional expression conditions varied concomitantly with differences in mean objective image similarity. Effects of emotional expression on face recognition performance were measured via an old/new recognition paradigm in which stimuli at both learning and testing had happy, neutral, and angry expressions. Results showed an advantage for faces learned with neutral expressions, as well as for angry faces at testing. Performance data were compared to three quantitative image-similarity indices. Findings showed that mean human performance was strongly correlated with mean image similarity, suggesting that the former may be at least partly explained by the latter. Our findings sound a cautionary note regarding the necessity of considering low-level stimulus properties as explanations for findings that otherwise may be prematurely attributed to higher order phenomena such as attention or emotional arousal.
4
Kong Q, Currie N, Du K, Ruffman T. General cognitive decline does not account for older adults' worse emotion recognition and theory of mind. Sci Rep 2022; 12:6808. PMID: 35473952; PMCID: PMC9043191; DOI: 10.1038/s41598-022-10716-9.
Abstract
Older adults have both worse general cognition and worse social cognition. A frequent suggestion is that worse social cognition is due to worse general cognition. However, previous studies have often provided contradictory evidence. The current study examined this issue with a more extensive battery of tasks for both forms of cognition. We gave 47 young and 40 older adults three tasks to assess general cognition (processing speed, working memory, fluid intelligence) and three tasks to assess their social cognition (emotion and theory-of-mind). Older adults did worse on all tasks and there were correlations between general and social cognition. Although working memory and fluid intelligence were unique predictors of performance on the Emotion Photos task and the Eyes task, Age Group was a unique predictor on all three social cognition tasks. Thus, there were relations between the two forms of cognition but older adults continued to do worse than young adults even after accounting for general cognition. We argue that this pattern of results is due to some overlap in brain areas mediating general and social cognition, but also independence, and with a differential rate of decline in brain areas dedicated to general cognition versus social cognition.
Affiliation(s)
- Qiuyi Kong
- Department of Psychology, University of Otago, Dunedin, New Zealand
- Nicholas Currie
- Department of Psychology, University of Otago, Dunedin, New Zealand
- Kangning Du
- Department of Psychology, University of Otago, Dunedin, New Zealand
- Ted Ruffman
- Department of Psychology, University of Otago, Dunedin, New Zealand
5
Liu H, Liu Y, Dong X, Liu H, Han B. Effect of cognitive control on age-related positivity effects in attentional processing: evidence from an event-related brain potential study. Front Psychol 2021; 12:755635. PMID: 34925159; PMCID: PMC8671695; DOI: 10.3389/fpsyg.2021.755635.
Abstract
Studies investigating age-related positivity effects during facial emotion processing have yielded contradictory results. The present study aimed to elucidate the mechanisms of cognitive control during attentional processing of emotional faces among older adults. We used go/no-go detection tasks combined with event-related potentials and source localization to examine the effects of response inhibition on age-related positivity effects. Data were obtained from 23 older and 23 younger healthy participants. Behavioral results showed that the discriminability index (d') of older adults on fear trials was significantly greater than that of younger adults [t(44)=2.37, p=0.024, Cohen's d=0.70], whereas the opposite pattern was found on happy trials [t(44)=2.56, p=0.014, Cohen's d=0.75]. The electroencephalography results showed that, at left electrode positions, N170 amplitudes were larger for fear-neutral face pairs than for happy-neutral ones in the younger adults [t(22)=2.32, p=0.030, Cohen's d=0.48]; the older group showed a similar tendency over the right hemisphere, although the result was not statistically significant [t(22)=1.97, p=0.061, Cohen's d=0.41]. Further, brain activity in the two hemispheres of older adults showed an asymmetrical decrement. Our study demonstrated that the age-related "positivity effect" was not observed, owing to the depletion of available cognitive resources at the early attentional stage. Moreover, bilateral activation of the two hemispheres may be an important signal of normal aging.
Affiliation(s)
- Haining Liu
- Department of Psychology, Chengde Medical University, Chengde, China
- Hebei Key Laboratory of Nerve Injury and Repair, Chengde Medical University, Chengde, China
- Yanli Liu
- Department of Biomedical Engineering, Chengde Medical University, Chengde, China
- Xianling Dong
- Department of Biomedical Engineering, Chengde Medical University, Chengde, China
- Haihong Liu
- Department of Psychology, Chengde Medical University, Chengde, China
- Centre for Research in Psychology and Human Well Being, Faculty of Social Sciences and Humanities, The National University of Malaysia, Bangi, Malaysia
- Buxin Han
- Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
6
Rodger H, Lao J, Stoll C, Richoz AR, Pascalis O, Dye M, Caldara R. The recognition of facial expressions of emotion in deaf and hearing individuals. Heliyon 2021; 7:e07018. PMID: 34041389; PMCID: PMC8141778; DOI: 10.1016/j.heliyon.2021.e07018.
Abstract
During real-life interactions, facial expressions of emotion are perceived dynamically with multimodal sensory information. In the absence of auditory sensory channel inputs, it is unclear how facial expressions are recognised and internally represented by deaf individuals. Few studies have investigated facial expression recognition in deaf signers using dynamic stimuli, and none have included all six basic facial expressions of emotion (anger, disgust, fear, happiness, sadness, and surprise) with stimuli fully controlled for their low-level visual properties, leaving the question of whether or not a dynamic advantage for deaf observers exists unresolved. We hypothesised, in line with the enhancement hypothesis, that the absence of auditory sensory information might have forced the visual system to better process visual (unimodal) signals, and predicted that this greater sensitivity to visual stimuli would result in better recognition performance for dynamic compared to static stimuli, and for deaf-signers compared to hearing non-signers in the dynamic condition. To this end, we performed a series of psychophysical studies with deaf signers with early-onset severe-to-profound deafness (dB loss >70) and hearing controls to estimate their ability to recognize the six basic facial expressions of emotion. Using static, dynamic, and shuffled (randomly permuted video frames of an expression) stimuli, we found that deaf observers showed similar categorization profiles and confusions across expressions compared to hearing controls (e.g., confusing surprise with fear). In contrast to our hypothesis, we found no recognition advantage for dynamic compared to static facial expressions for deaf observers. This observation shows that the decoding of dynamic facial expression emotional signals is not superior even in the deaf expert visual system, suggesting the existence of optimal signals in static facial expressions of emotion at the apex. 
Deaf individuals match hearing individuals in the recognition of facial expressions of emotion.
Affiliation(s)
- Helen Rodger
- Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Junpeng Lao
- Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Chloé Stoll
- Laboratoire de Psychologie et de Neurocognition (CNRS-UMR5105), Université Grenoble-Alpes, France
- Olivier Pascalis
- Laboratoire de Psychologie et de Neurocognition (CNRS-UMR5105), Université Grenoble-Alpes, France
- Matthew Dye
- National Technical Institute for the Deaf/Rochester Institute of Technology, Rochester, New York, USA
- Roberto Caldara
- Department of Psychology, University of Fribourg, Fribourg, Switzerland
7
8
Saito A, Sato W, Yoshikawa S. Older adults detect happy facial expressions less rapidly. R Soc Open Sci 2020; 7:191715. PMID: 32269799; PMCID: PMC7137944; DOI: 10.1098/rsos.191715.
Abstract
Previous experimental psychology studies based on visual search paradigms have reported that young adults detect emotional facial expressions more rapidly than emotionally neutral expressions. However, it remains unclear whether this holds for older adults. We investigated this by comparing the abilities of young and older adults to detect emotional and neutral facial expressions in a visual search task, while controlling for the visual properties of the presented faces (using so-called anti-expressions). Both age groups detected normal angry faces more rapidly than anti-angry faces. However, whereas young adults detected normal happy faces more rapidly than anti-happy faces, older adults did not. This suggests that older adults may be less efficient at detecting, or at focusing attention towards, smiling faces appearing peripherally.
Affiliation(s)
- Akie Saito (author for correspondence)
- Wataru Sato (author for correspondence)
9
Murphy J, Millgate E, Geary H, Catmur C, Bird G. No effect of age on emotion recognition after accounting for cognitive factors and depression. Q J Exp Psychol (Hove) 2019; 72:2690-2704. DOI: 10.1177/1747021819859514.
Abstract
A decline in emotion recognition ability across the lifespan has been well documented. However, whether age predicts emotion recognition difficulties after accounting for potentially confounding factors which covary with age remains unclear. Although previous research suggested that age-related decline in emotion recognition ability may be partly a consequence of cognitive (fluid intelligence, processing speed) and affective (e.g., depression) factors, recent theories highlight a potential role for alexithymia (difficulty identifying and describing one’s emotions) and interoception (perception of the body’s internal state). This study therefore aimed to examine the recognition of anger and disgust across the adult lifespan in a group of 140 20–90-year-olds to see whether an effect of age would remain after controlling for a number of cognitive and affective factors potentially impacted by age. In addition, using an identity recognition control task, the study aimed to determine whether the factors accounting for the effects of age on emotion discrimination also contribute towards generalised face processing difficulties. Results revealed that discrimination of disgust and anger across the lifespan was predicted by processing speed and fluid intelligence, and negatively by depression. No effect of age was found after these factors were accounted for. Importantly, these effects were specific to emotion discrimination; only crystallised intelligence accounted for unique variance in identity discrimination. Contrary to expectations, although interoception and alexithymia were correlated with emotion discrimination abilities, these factors did not explain unique variance after accounting for other variables.
Affiliation(s)
- Jennifer Murphy
- Social, Genetic and Developmental Psychiatry Centre, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, London, UK
- Edward Millgate
- Department of Psychology, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, London, UK
- Hayley Geary
- Department of Psychology, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, London, UK
- Caroline Catmur
- Department of Psychology, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, London, UK
- Geoffrey Bird
- Social, Genetic and Developmental Psychiatry Centre, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, London, UK
- Department of Experimental Psychology, University of Oxford, Oxford, UK
10
Abstract
Background: The current research sought to characterize current mood state profiles in healthy young versus older adults using 100-point visual analogue mood scales (VAMS), provide within-sample and new-sample replication of age-group differences, assess sex differences, and compare with commonly used standardized symptom measures.
Methods: In two studies, six word-only VAMS (happy, sad, calm, tense, energetic, and sleepy) were administered in a laboratory setting. In Study 1, 22 young and 29 older males completed the VAMS six times (twice per day at weekly intervals). In Study 2, 60 young (30 males) and 60 older (30 males) adults completed on one occasion the VAMS, Beck Depression Inventory-II, State-Trait Anxiety Inventory, and Pittsburgh Sleep Quality Index.
Results: VAMS scores showed that older adults tended to report feeling happier, less sad, calmer, less tense, more energetic, and less sleepy than young adults. This pattern occurred across assessment points and irrespective of sex, except for the tense VAMS, which showed higher scores in females than males among young but not older adults. The standardized measures showed significant age-group differences for trait anxiety only (lower in older than young adults).
Conclusions: These findings establish current mood state differences in young versus older adults. The absence of age-group differences in past studies may relate to the limited precision of the scales used (only 7 points, in contrast to the 100-point scales used here).
11
Richoz AR, Lao J, Pascalis O, Caldara R. Tracking the recognition of static and dynamic facial expressions of emotion across the life span. J Vis 2018; 18:5. PMID: 30208425; DOI: 10.1167/18.9.5.
Abstract
The effective transmission and decoding of dynamic facial expressions of emotion is omnipresent and critical for adapted social interactions in everyday life. Thus, common intuition would suggest an advantage for dynamic facial expression recognition (FER) over the static snapshots routinely used in most experiments. However, although many studies have reported an advantage in the recognition of dynamic over static expressions in clinical populations, results obtained from healthy participants are mixed. To clarify this issue, we conducted a large cross-sectional study investigating FER across the life span, in order to determine whether age is a critical factor that accounts for such discrepancies. More than 400 observers (age range 5-96) performed recognition tasks of the six basic expressions in static, dynamic, and shuffled (temporally randomized frames) conditions, normalized for the amount of energy sampled over time. We applied a Bayesian hierarchical step-linear model to capture the nonlinear relationship between age and FER for the different viewing conditions. While replicating the typical accuracy profiles of FER, we determined the age at which peak efficiency was reached for each expression and found greater accuracy for most dynamic expressions across the life span. This advantage in the elderly population was driven by a significant decrease in performance for static images, which was twice as large as that for young adults. Our data suggest that dynamic stimuli are critical for assessing FER in the elderly population, inviting caution when drawing conclusions from the sole use of static face images.
Affiliation(s)
- Anne-Raphaëlle Richoz
- Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Fribourg, Switzerland
- LPNC, University of Grenoble Alpes, Grenoble, France
- Junpeng Lao
- Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Roberto Caldara
- Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Fribourg, Switzerland
12
Mienaltowski A, Lemerise EA, Greer K, Burke L. Age-related differences in emotion matching are limited to low intensity expressions. Aging Neuropsychol Cogn 2018; 26:348-366. PMID: 29471716; DOI: 10.1080/13825585.2018.1441363.
Abstract
Multi-label tasks confound age differences in perceptual and cognitive processes. We examined age differences in emotion perception with a technique that did not require verbal labels. Participants matched the emotion expressed by a target to two comparison stimuli, one neutral and one emotional. Angry, disgusted, fearful, happy, and sad facial expressions of varying intensity were used. Although older adults took longer to respond than younger adults, younger adults outperformed older adults only for the lowest intensity disgust and fear expressions. Some participants also completed an identity matching task in which target stimuli were matched on personal identity instead of emotion. Although irrelevant to the judgment, expressed emotion still created interference. All participants were less accurate when the apparent difference in expressive intensity of the matched stimuli was large, suggesting that salient emotion cues increased the difficulty of identity matching. Age differences in emotion perception were limited to very low intensity expressions.
Affiliation(s)
- Andrew Mienaltowski
- Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University, Bowling Green, KY, USA
- Elizabeth A Lemerise
- Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University, Bowling Green, KY, USA
- Kaitlyn Greer
- Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University, Bowling Green, KY, USA
- Lindsey Burke
- Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University, Bowling Green, KY, USA
13
Shen X, Wu Q, Zhao K, Fu X. Electrophysiological evidence reveals differences between the recognition of microexpressions and macroexpressions. Front Psychol 2016; 7:1346. PMID: 27630610; PMCID: PMC5005928; DOI: 10.3389/fpsyg.2016.01346.
Abstract
Microexpressions are fleeting facial expressions that are important for judging people's true emotions. Little is known about the neural mechanisms underlying the recognition of microexpressions (with durations of less than 200 ms) as compared with macroexpressions (with durations of greater than 200 ms). We used an affective priming paradigm, in which a picture of a facial expression served as the prime and an emotional word as the target, together with electroencephalography (EEG) and event-related potentials (ERPs), to examine the neural activities associated with recognizing microexpressions and macroexpressions. The results showed significant main effects of duration and valence for the N170/vertex positive potential, and a significant main effect of congruence for the N400. Further, sLORETA showed that the brain regions responsible for these significant differences included the inferior temporal gyrus and widespread regions of the frontal lobe. The results also suggested that the left hemisphere was more involved than the right hemisphere in processing microexpressions. The main effect of duration on the event-related spectral perturbation (ERSP) was significant, with theta oscillations (4 to 8 Hz) increasing when recognizing expressions presented for 40 ms compared with 300 ms. Thus, the EEG/ERP neural mechanisms for recognizing microexpressions differ from those for recognizing macroexpressions.
Affiliation(s)
- Xunbing Shen
- Department of Psychology, Jiangxi University of Traditional Chinese Medicine, Nanchang, China
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Qi Wu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, Hunan Normal University, Changsha, China
- Ke Zhao
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Xiaolan Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China