1
Mannion R, Harikumar A, Morales-Calva F, Leal SL. A novel face-name mnemonic discrimination task with naturalistic stimuli. Neuropsychologia 2023; 189:108678. [PMID: 37661039] [DOI: 10.1016/j.neuropsychologia.2023.108678]
Abstract
Remembering faces and names is a common struggle for many people, and it becomes more difficult as we age. Subtle changes in appearance from day to day, common facial characteristics across individuals, and overlap of names may contribute to the difficulty of learning face-name associations. Computational models suggest the hippocampus plays a key role in reducing interference across experiences with overlapping information by performing pattern separation, which enables us to encode similar experiences as distinct from one another. Thus, given the nature of overlapping features within face-name associative memory, hippocampal pattern separation may be an important underlying mechanism supporting this type of memory. Furthermore, cross-species approaches find that aging is associated with deficits in hippocampal pattern separation. Mnemonic discrimination tasks have been designed to tax hippocampal pattern separation and provide a more sensitive measure of age-related cognitive decline compared to traditional memory tasks. However, traditional face-name associative memory tasks do not parametrically vary overlapping features of faces and names to tax hippocampal pattern separation and often lack naturalistic facial features (e.g., hair, accessories, similarity of features, emotional expressions). Here, we developed a face-name mnemonic discrimination task where we varied face stimuli by similarity, race, sex, and emotional expression as well as the similarity of name stimuli. We tested a sample of healthy young and older adults on this task and found that both age groups showed worsening performance as face-name interference increased. Overall, older adults struggled to remember faces and face-name pairs more than young adults. However, while young adults remembered emotional faces better than neutral faces, older adults selectively remembered positive faces. Thus, the use of a face-name association memory task designed with varying levels of face-name interference, as well as the inclusion of naturalistic face stimuli across race, sex, and emotional expressions, provides a more nuanced approach relative to traditional face-name association tasks toward understanding age-related changes in memory.
Affiliation(s)
- Renae Mannion
- Psychological Sciences, Rice University, 6500 Main St, Houston, TX, 77030, USA.
- Amritha Harikumar
- Psychological Sciences, Rice University, 6500 Main St, Houston, TX, 77030, USA.
- Stephanie L Leal
- Psychological Sciences, Rice University, 6500 Main St, Houston, TX, 77030, USA.
2
Lin H, Liang J. The priming effects of emotional vocal expressions on face encoding and recognition: An ERP study. Int J Psychophysiol 2023; 183:32-40. [PMID: 36375630] [DOI: 10.1016/j.ijpsycho.2022.11.006]
Abstract
Previous studies have suggested that emotional primes, presented as visual stimuli, influence face memory (e.g., encoding and recognition). However, due to stimulus-associated issues, whether emotional primes affect face encoding when the priming stimuli are presented in an auditory modality remains controversial. Moreover, no studies have investigated whether the effects of emotional auditory primes are maintained in later stages of face memory, such as face recognition. To address these issues, participants in the present study were asked to memorize angry and neutral faces. The faces were presented after a simple nonlinguistic interjection expressed with angry or neutral prosodies. Subsequently, participants completed an old/new recognition task in which only faces were presented. Event-related potential (ERP) results showed that during the encoding phase, all faces preceded by an angry vocal expression elicited larger N170 responses than faces preceded by a neutral vocal expression. Angry vocal expression also enhanced the late positive potential (LPP) responses specifically to angry faces. In the subsequent recognition phase, preceding angry vocal primes reduced early LPP responses to both angry and neutral faces and late LPP responses specifically to neutral faces. These findings suggest that the negative emotion of auditory primes influenced face encoding and recognition.
Affiliation(s)
- Huiyan Lin
- Institute of Applied Psychology, School of Public Administration, Guangdong University of Finance, Guangzhou, China; Laboratory for Behavioral and Regional Finance, Guangdong University of Finance, Guangzhou, China.
- Jiafeng Liang
- School of Education, Guangdong University of Education, Guangzhou, China.
3
The role of perceptual difficulty in visual hindsight bias for emotional faces. Psychon Bull Rev 2022. [DOI: 10.3758/s13423-022-02219-5]
4
Collin CA, Chamberland J, LeBlanc M, Ranger A, Boutet I. Effects of emotional expression on face recognition may be accounted for by image similarity. Soc Cogn 2022. [DOI: 10.1521/soco.2022.40.3.282]
Abstract
We examined the degree to which differences in face recognition rates across emotional expression conditions varied concomitantly with differences in mean objective image similarity. Effects of emotional expression on face recognition performance were measured via an old/new recognition paradigm in which stimuli at both learning and testing had happy, neutral, and angry expressions. Results showed an advantage for faces learned with neutral expressions, as well as for angry faces at testing. Performance data were compared to three quantitative image-similarity indices. Findings showed that mean human performance was strongly correlated with mean image similarity, suggesting that the former may be at least partly explained by the latter. Our findings sound a cautionary note regarding the necessity of considering low-level stimulus properties as explanations for findings that otherwise may be prematurely attributed to higher order phenomena such as attention or emotional arousal.
5
Macinska S, Jellema T. Memory for facial expressions on the autism spectrum: The influence of gaze direction and type of expression. Autism Res 2022; 15:870-880. [PMID: 35150078] [DOI: 10.1002/aur.2682]
Abstract
Face memory research in autism has largely neglected memory for facial expressions, in favor of memory for identity. This study examined, in three experiments, the role of gaze direction and type of expression in memory for facial expressions in relation to the autism spectrum. In the learning phase, four combinations of facial expressions (joy/anger) and gaze direction (toward/away), displayed by 16 different identities, were presented. In a subsequent surprise test the same identities were presented displaying neutral expressions, and the expression of each identity had to be recalled. In Experiment 1, typically-developed (TD) individuals with low and high Autism Quotient (AQ) scores were tested with three repetitions of each emotion/gaze combination, which did not produce any modulations. In Experiment 2, another group of TD individuals with low and high AQ scores were tested with eight repetitions, resulting in a "happy advantage" and a "direct gaze advantage", but no interactions. In Experiment 3, individuals with high-functioning autism (HFA) and a matched TD group were tested using eight repetitions. The HFA group revealed no emotion or gaze effects, while the matched TD group showed both a happy and a direct gaze advantage, and again no interaction. The results suggest that in autistic individuals the memory for facial expressions is intact, but is not modulated by the person's expression type and gaze direction. We discuss whether anomalous implicit learning of facial cues could have contributed to these findings, its relevance for social intuition, and its possible contribution to social deficits in autism. LAY SUMMARY: It has often been found that memory for someone's face (facial identity) is less good in autism. However, it is not yet known whether memory for someone's facial expression is also less good in autism. In this study, the memory for expressions of joy and anger was investigated in typically-developed (TD) individuals who possessed either few or many autistic-like traits (Experiments 1 and 2), and in individuals with high-functioning autism (Experiment 3). The gaze direction was also varied (directed either toward, or away from, the observer). We found that TD individuals best remembered expressions of joy, and remembered expressions of both joy and anger better when the gaze was directed at them. These effects did not depend on the extent to which they possessed autistic-like traits. Autistic participants remembered the facial expression of a previously encountered person as well as TD participants did. However, in contrast to the TD participants, the memory of autistic participants was not influenced by the expression type and gaze direction of the previously encountered persons. We discuss whether this may lead to difficulties in the development of social intuition, which in turn could give rise to the difficulties in social interaction that are characteristic of autism.
6
Human face and gaze perception is highly context specific and involves bottom-up and top-down neural processing. Neurosci Biobehav Rev 2021; 132:304-323. [PMID: 34861296] [DOI: 10.1016/j.neubiorev.2021.11.042]
Abstract
This review summarizes human perception and processing of face and gaze signals. Face and gaze signals are important means of non-verbal social communication. The review highlights that: (1) some evidence is available suggesting that the perception and processing of facial information starts in the prenatal period; (2) the perception and processing of face identity, expression and gaze direction is highly context specific, the effect of race and culture being a case in point. Through experiential shaping and social categorization, culture affects the way in which information on face and gaze is collected and perceived; (3) face and gaze processing occurs in the so-called 'social brain'. Accumulating evidence suggests that the processing of facial identity, facial emotional expression and gaze involves two parallel and interacting pathways: a fast and crude subcortical route and a slower cortical pathway. The flow of information is bi-directional and includes bottom-up and top-down processing. The cortical networks particularly include the fusiform gyrus, superior temporal sulcus (STS), intraparietal sulcus, temporoparietal junction and medial prefrontal cortex.
7
Silveira MV, Camargo JC, Aggio NM, Ribeiro GW, Cortez MD, Young ME, de Rose JC. The influence of training procedure and stimulus valence on the long-term maintenance of equivalence relations. Behav Processes 2021; 185:104343. [PMID: 33549809] [DOI: 10.1016/j.beproc.2021.104343]
Abstract
In the current research, we aimed to extend the results of Silveira et al. (2016) by verifying whether the long-term maintenance of equivalence classes is influenced by stimulus valence and matching-to-sample (MTS) training procedures. Delayed and simultaneous MTS were used to train two groups of participants on a series of conditional relation trials involving pictures of human faces expressing familiar emotions (A) and abstract forms (B, C, and D). Participants who demonstrated the emergence of novel BD and DB relations and class-consistent derived transfer of functions returned to the laboratory thirty days later. In follow-up assessments, the probability of class-consistent responses was higher for the happy class only for participants exposed to DMTS training. This result shows that the maintenance of equivalence classes cannot be accounted for solely in terms of the affective valence of the familiar stimulus. The affective valence of the happy faces may instead have combined with the properties of DMTS, favoring the maintenance of the happy class. We therefore discuss mediating verbal behavior, evoked selectively by the pictures of happy faces appearing as samples and possibly persisting during the delay interval, as a mechanism underlying the performance of participants trained with the DMTS procedure.
Affiliation(s)
- Marcelo V Silveira
- Universidade Federal do ABC - Centro de Matemática, Computação e Cognição UFABC-CMCC, Campus São Bernardo do Campo, Brazil.
- Julio C Camargo
- Universidade Federal de São Carlos - Departamento de Psicologia, UFSCar-DPsi, Campus São Carlos, Brazil.
- Giovan W Ribeiro
- Universidade Federal de São Carlos - Departamento de Psicologia, UFSCar-DPsi, Campus São Carlos, Brazil.
- Mariéle Diniz Cortez
- Universidade Federal de São Carlos - Departamento de Psicologia, UFSCar-DPsi, Campus São Carlos, Brazil.
- Julio C de Rose
- Universidade Federal de São Carlos - Departamento de Psicologia, UFSCar-DPsi, Campus São Carlos, Brazil.
8
Salmela VR, Ölander K, Muukkonen I, Bays PM. Recall of facial expressions and simple orientations reveals competition for resources at multiple levels of the visual hierarchy. J Vis 2019; 19:8. [PMID: 30897626] [PMCID: PMC6432740] [DOI: 10.1167/19.3.8]
Abstract
Many studies of visual working memory have tested humans' ability to reproduce primary visual features of simple objects, such as the orientation of a grating or the hue of a color patch, following a delay. A consistent finding of such studies is that precision of responses declines as the number of items in memory increases. Here we compared visual working memory for primary features and high-level objects. We presented participants with memory arrays consisting of oriented gratings, facial expressions, or a mixture of both. Precision of reproduction for all facial expressions declined steadily as the memory load was increased from one to five faces. For primary features, this decline and the specific distributions of error observed have been parsimoniously explained in terms of neural population codes. We adapted the population coding model for circular variables to the non-circular and bounded parameter space used for expression estimation. Total population activity was held constant according to the principle of normalization and the intensity of expression was decoded by drawing samples from the Bayesian posterior distribution. The model fit the data well, showing that principles of population coding can be applied to model memory representations at multiple levels of the visual hierarchy. When both gratings and faces had to be remembered, an asymmetry was observed. Increasing the number of faces decreased precision of orientation recall, but increasing the number of gratings did not affect recall of expression, suggesting that memorizing faces involves the automatic encoding of low-level features, in addition to higher-level expression information.
Affiliation(s)
- Viljami R Salmela
- Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland; Department of Psychology, University of Cambridge, Cambridge, UK.
- Kaisu Ölander
- Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland.
- Ilkka Muukkonen
- Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland.
- Paul M Bays
- Department of Psychology, University of Cambridge, Cambridge, UK.
9
Representation of facial identity includes expression variability. Vision Res 2019; 157:123-131. [DOI: 10.1016/j.visres.2018.05.004]
10
11
Cortes DS, Laukka P, Lindahl C, Fischer H. Memory for faces and voices varies as a function of sex and expressed emotion. PLoS One 2017; 12:e0178423. [PMID: 28570691] [PMCID: PMC5453523] [DOI: 10.1371/journal.pone.0178423]
Abstract
We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection ("remember" hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and an own-sex bias whereby female participants displayed a memory advantage for female faces and face-voice combinations. Results further suggest that own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, results show that memory for faces and voices may be influenced by the expressions that they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy.
Affiliation(s)
- Diana S. Cortes
- Department of Psychology, Stockholm University, Stockholm, Sweden.
- Petri Laukka
- Department of Psychology, Stockholm University, Stockholm, Sweden.
- Håkan Fischer
- Department of Psychology, Stockholm University, Stockholm, Sweden.
12
Sava AA, Krolak-Salmon P, Delphin-Combe F, Cloarec M, Chainay H. Memory for faces with emotional expressions in Alzheimer's disease and healthy older participants: positivity effect is not only due to familiarity. Aging Neuropsychol Cogn 2016; 24:1-28. [PMID: 26873302] [DOI: 10.1080/13825585.2016.1143444]
Abstract
Young individuals memorize initially seen faces better when those faces have emotional rather than neutral expressions. Healthy older participants and Alzheimer's disease (AD) patients show better memory for faces with positive expressions. The socioemotional selectivity theory postulates that this positivity effect in memory reflects a general age-related preference for positive stimuli, subserving emotion regulation. Another explanation might be that older participants use compensatory strategies, often considering happy faces as previously seen. The question about the existence of this effect in tasks not permitting such compensatory strategies is still open. Thus, we compared the performance of healthy participants and AD patients for positive, neutral, and negative faces in such tasks. Healthy older participants and AD patients showed a positivity effect in memory, but there was no difference between emotional and neutral faces in young participants. Our results suggest that the positivity effect in memory is not entirely due to the sense of familiarity for smiling faces.
Affiliation(s)
- Alina-Alexandra Sava
- Institut de Psychologie, Laboratoire d'Etude de Mécanismes Cognitifs (EMC), Université Lumière Lyon 2, Lyon, France.
- Pierre Krolak-Salmon
- INSERM U1028 - CNRS UMR5292, Centre de Recherche en Neurosciences de Lyon, Bron, France; Hospices civils de Lyon, CM2R, Hôpital gériatrique des Charpennes, Villeurbanne, France.
- Floriane Delphin-Combe
- Hospices civils de Lyon, CM2R, Hôpital gériatrique des Charpennes, Villeurbanne, France.
- Morgane Cloarec
- Institut de Psychologie, Laboratoire d'Etude de Mécanismes Cognitifs (EMC), Université Lumière Lyon 2, Lyon, France.
- Hanna Chainay
- Institut de Psychologie, Laboratoire d'Etude de Mécanismes Cognitifs (EMC), Université Lumière Lyon 2, Lyon, France.
13
de Gelder B, Huis in ‘t Veld EMJ, Van den Stock J. The Facial Expressive Action Stimulus Test. A test battery for the assessment of face memory, face and object perception, configuration processing, and facial expression recognition. Front Psychol 2015; 6:1609. [PMID: 26579004] [PMCID: PMC4624856] [DOI: 10.3389/fpsyg.2015.01609]
Abstract
There are many ways to assess face perception skills. In this study, we describe a novel task battery, the FEAST (Facial Expressive Action Stimulus Test), developed to test recognition of identity and expressions of human faces as well as control stimulus categories. The FEAST consists of a neutral and emotional face memory task, a face and shoe identity matching task, a face and house part-to-whole matching task, and a human and animal facial expression matching task. The identity and part-to-whole matching tasks contain both upright and inverted conditions. The results provide reference data from a healthy sample of controls in two age groups for future users of the FEAST.
Affiliation(s)
- Beatrice de Gelder
- Department of Cognitive Neuroscience, Maastricht University, Maastricht, Netherlands.
- Department of Psychiatry and Mental Health, University of Cape Town, Cape Town, South Africa.
- Elisabeth M. J. Huis in ‘t Veld
- Department of Cognitive Neuroscience, Maastricht University, Maastricht, Netherlands.
- Department of Medical and Clinical Psychology, Tilburg University, Tilburg, Netherlands.
- Jan Van den Stock
- Laboratory for Translational Neuropsychiatry, Department of Neurosciences, KU Leuven, Leuven, Belgium.
- Old Age Psychiatry, University Hospitals Leuven, Leuven, Belgium.
14
Quaglino V, De Wever E, Maurage P. Relations between cognitive abilities, drinking characteristics, and emotional recognition in alcohol dependence: A preliminary exploration. Alcohol Clin Exp Res 2015; 39:2032-8. [PMID: 26332272] [DOI: 10.1111/acer.12841]
Abstract
BACKGROUND Alcohol dependence is characterized by wide-ranging cognitive impairments, but also by deficits in the recognition of emotional facial expressions (EFEs). Although they play a crucial role both in the development and in the maintenance of the disease, cognitive and emotional disorders have up to now been mostly explored separately. As a result, not much is known regarding their interactions. This study thus aims at exploring the relations between cognition and emotion in alcohol dependence, and more specifically between cognitive performance, drinking characteristics, and EFE recognition. METHODS Twenty-six recently detoxified alcohol-dependent individuals and 26 matched controls were tested for cognitive abilities (by means of a standardized neuropsychological battery) and for EFE recognition. RESULTS Alcohol-dependent individuals simultaneously presented altered performances for executive abilities and EFE recognition (particularly for disgust recognition). Moreover, a regression analysis showed that EFE performance was centrally related to episodic memory and cognitive flexibility. CONCLUSIONS These results clarify the relations between EFE recognition, cognitive abilities, and drinking characteristics in alcohol dependence and clearly suggest that cognitive factors should be taken into account in future studies exploring emotional processes in alcohol dependence. Specific cognitive programs should be developed to rehabilitate cognitive and emotional abilities simultaneously.
Affiliation(s)
- Véronique Quaglino
- Research Center in Psychology: Cognition, Psyche and Organizations (CRP-CPO EA7273), Department of Psychology, University of Picardie Jules Verne, Amiens, France.
- Elodie De Wever
- Research Center in Psychology: Cognition, Psyche and Organizations (CRP-CPO EA7273), Department of Psychology, University of Picardie Jules Verne, Amiens, France.
- Pierre Maurage
- Laboratory for Experimental Psychopathology, Psychological Sciences Research Institute, Université Catholique de Louvain, Louvain-la-Neuve, Belgium.
15
Chen W, Liu CH, Li H, Tong K, Ren N, Fu X. Facial expression at retrieval affects recognition of facial identity. Front Psychol 2015; 6:780. [PMID: 26106355] [PMCID: PMC4460307] [DOI: 10.3389/fpsyg.2015.00780]
Abstract
It is well known that memory can be modulated by emotional stimuli at the time of encoding and consolidation. For example, happy faces create better identity recognition than faces with certain other expressions. However, the influence of facial expression at the time of retrieval remains unknown in the literature. To separate the potential influence of expression at retrieval from its effects at earlier stages, we had participants learn neutral faces but manipulated facial expression at the time of memory retrieval in a standard old/new recognition task. The results showed a clear effect of facial expression, where happy test faces were identified more successfully than angry test faces. This effect is unlikely to be due to greater image similarity between the neutral training faces and the happy test faces, because image analysis showed that the happy test faces are in fact less similar to the neutral training faces relative to the angry test faces. In the second experiment, we investigated whether this emotional effect is affected by the expression at the time of learning. We employed angry or happy faces as learning stimuli, and angry, happy, and neutral faces as test stimuli. The results showed that the emotional effect at retrieval is robust across different encoding conditions with happy or angry expressions. These findings indicate that emotional expressions do not only affect the stages of encoding and consolidation, but also the retrieval process in identity recognition.
Affiliation(s)
- Wenfeng Chen
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China.
- Chang Hong Liu
- Department of Psychology, Bournemouth University, Poole, UK.
- Huiyun Li
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; University of Chinese Academy of Sciences, Beijing, China.
- Ke Tong
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; University of Chinese Academy of Sciences, Beijing, China.
- Naixin Ren
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; University of Chinese Academy of Sciences, Beijing, China.
- Xiaolan Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China.