1
Siklos-Whillans J, Itier RJ. Effects of Inversion and Fixation Location on the Processing of Face and House Stimuli - A Mass Univariate Analysis. Brain Topogr 2024; 37:972-992. PMID: 39042323. DOI: 10.1007/s10548-024-01068-w.
Abstract
Most event-related potential (ERP) studies investigating the time course of visual processing have focused on the N170 component. Stimulus orientation affects the N170 amplitude for faces but not for objects, a finding interpreted as reflecting holistic/configural processing for faces and featural processing for objects. Furthermore, while recent studies suggest that where on the face people fixate impacts the N170, fixation location effects have not been investigated for objects. A data-driven mass univariate analysis (all time points and electrodes) was used to investigate the time course of inversion and fixation location effects on the neural processing of faces and houses. Strong and widespread orientation effects were found for both faces and houses from 100 to 350 ms post-stimulus onset, encompassing the P1 and N170 components and later activity, a finding arguing against a lack of holistic processing for houses. While no clear fixation effect was found for houses, fixation location strongly impacted face processing early on, reflecting retinotopic mapping around the C2 and P1 components, and during the N170-P2 interval. Face inversion effects were also largest for nasion fixation around 120 ms. The results support the view that facial feature integration (1) depends on which feature is being fixated and where the other features are situated in the visual field, (2) occurs maximally during the P1-N170 interval when fixation is on the nasion, and (3) continues past 200 ms, suggesting the N170 peak, where weak effects were found, might be an inflection point between processes rather than the end of a process integrating features into a whole.
Affiliation(s)
- James Siklos-Whillans
- Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada
- Roxane J Itier
- Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada
2
Wang G, Ma L, Wang L, Pang W. Independence Threat or Interdependence Threat? The Focusing Effect on Social or Physical Threat Modulates Brain Activity. Brain Sci 2024; 14:368. PMID: 38672018. PMCID: PMC11047893. DOI: 10.3390/brainsci14040368.
Abstract
OBJECTIVE The neural basis of threat perception has mostly been examined separately for social and physical threats. However, most threats encountered in everyday life are complex, and it remains unclear how social and physical threats interact under different attentional conditions. METHOD The present study explored this issue with an attention-guided paradigm based on ERP techniques. Social threats (face threats) and physical threats (action threats) were displayed together on screen, and participants were instructed to attend to only one type of threat, allowing us to characterize the resulting brain activation. RESULTS Action threats did not affect the processing of face threats in the face-attention condition: the electrophysiological evidence resembled that obtained when face threats are processed alone, with larger N170 and EPN (Early Posterior Negativity) amplitudes for angry than for neutral expressions. In the action-attention condition, however, processing was affected by face threats, as evidenced by a greater N190 elicited by stimuli containing threatening facial emotions, regardless of whether the action was threatening or not; the same trend was reflected in the EPN. CONCLUSIONS The current study reveals important similarities and differences between physical and social threats, suggesting that the brain has a greater processing advantage for social threats.
Affiliation(s)
- Guan Wang
- The School of Psychology and Cognitive Science, East China Normal University, Shanghai 200062, China
- School of Education Science, Huaiyin Normal University, Huaian 223300, China
- Lian Ma
- School of Computer Science and Technology, Huaiyin Normal University, Huaian 223300, China
- Lili Wang
- School of Education Science, Huaiyin Normal University, Huaian 223300, China
- Weiguo Pang
- The School of Psychology and Cognitive Science, East China Normal University, Shanghai 200062, China
3
Luo X, Zhao D, Gao Y, Yang Z, Wang D, Mei G. Implicit weight bias: shared neural substrates for overweight and angry facial expressions revealed by cross-adaptation. Cereb Cortex 2024; 34:bhae128. PMID: 38566513. DOI: 10.1093/cercor/bhae128.
Abstract
The perception of facial expression plays a crucial role in social communication, and it is known to be influenced by various facial cues. Previous studies have reported both positive and negative biases toward overweight individuals, but it is unclear whether facial cues such as facial weight bias facial expression perception. Combining psychophysics and event-related potential techniques, the current study adopted a cross-adaptation paradigm to examine this issue. The psychophysical results of Experiments 1A and 1B revealed a bidirectional cross-adaptation effect between overweight and angry faces. Adapting to overweight faces decreased the likelihood of perceiving ambiguous emotional expressions as angry compared to adapting to normal-weight faces. Likewise, exposure to angry faces subsequently caused normal-weight faces to appear thinner. These findings were corroborated by bidirectional event-related potential results, showing that adaptation to overweight faces relative to normal-weight faces modulated the event-related potential responses to emotionally ambiguous facial expressions (Experiment 2A); vice versa, adaptation to angry faces relative to neutral faces modulated the event-related potential responses to faces ambiguous in weight (Experiment 2B). Our study provides direct evidence linking overweight faces with facial expression, suggesting at least partially shared neural substrates for the perception of overweight and angry faces.
Affiliation(s)
- Xu Luo
- School of Psychology, Guizhou Normal University, Huaxi University Town, Guian New District, Guiyang 550025, China
- Danning Zhao
- School of Psychology, Guizhou Normal University, Huaxi University Town, Guian New District, Guiyang 550025, China
- Yi Gao
- School of Psychology, Georgia Institute of Technology, 654 Cherry St NW, Atlanta, GA 30332, United States
- Zhihao Yang
- School of Psychology, Guizhou Normal University, Huaxi University Town, Guian New District, Guiyang 550025, China
- Da Wang
- School of Psychology, Guizhou Normal University, Huaxi University Town, Guian New District, Guiyang 550025, China
- Gaoxing Mei
- School of Psychology, Guizhou Normal University, Huaxi University Town, Guian New District, Guiyang 550025, China
4
Itier RJ, Durston AJ. Mass-univariate analysis of scalp ERPs reveals large effects of gaze fixation location during face processing that only weakly interact with face emotional expression. Sci Rep 2023; 13:17022. PMID: 37813928. PMCID: PMC10562468. DOI: 10.1038/s41598-023-44355-5.
Abstract
Decoding others' facial expressions is critical for social functioning. To clarify the neural correlates of expression perception depending on where we look on the face, three combined gaze-contingent ERP experiments were analyzed using robust mass-univariate statistics. Regardless of task, fixation location impacted face processing from 50 to 350 ms, maximally around 120 ms, reflecting retinotopic mapping around the C2 and P1 components. Fixation location also strongly impacted the N170-P2 interval, while only weak effects were seen at the face-sensitive N170 peak. The results question the widespread assumption that faces are processed holistically into an indecomposable perceptual whole around the N170. Rather, face processing is a complex and view-dependent process that continues well beyond the N170. Expression and fixation location interacted weakly during the P1-N170 interval, supporting a role for the mouth and left eye in decoding fearful and happy expressions. Expression effects were weakest at the N170 peak but strongest around the P2, especially for fear, reflecting task-independent affective processing. The results suggest the N170 reflects a transition between processes rather than the maximum of a holistic face processing stage. Focus on this peak should be replaced by data-driven analyses of the whole epoch using robust statistics to fully unravel the early visual processing of faces and their affective content.
Affiliation(s)
- Roxane J Itier
- Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada
- Amie J Durston
- Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada
5
Righi S, Benedetti V, Giganti F, Turano MT, Raduazzo G, Viggiano MP. Anxiety is not the right choice! Individual differences in trait anxiety modulate biases in pseudoneglect. Front Hum Neurosci 2023; 17:1201898. PMID: 37600557. PMCID: PMC10434218. DOI: 10.3389/fnhum.2023.1201898.
Abstract
Pseudoneglect, the tendency to display a leftward perceptual bias, is consistently observed in line bisection tasks. Some studies have shown that pseudoneglect is sensitive to emotion. This emotion-related modulation is likely related to valence-dependent hemispheric lateralization, although results have been inconsistent. A possible explanation for these inconsistencies could be individual differences in emotional tone: given that negative and positive emotions produce different baseline activations of the two hemispheres, emotional characteristics of the subjects, such as trait anxiety, could modulate the pseudoneglect phenomenon. To test this, high- and low-anxiety participants were asked to centrally bisect horizontal lines delimited by neutral or emotional (happy and sad) faces. In line with previous studies, the results showed a decrease in the leftward bisection error in the presence of happy faces, indicating greater involvement of the left hemisphere in processing positive emotional stimuli. In addition, trait anxiety influenced the magnitude of the visual bias: high-anxiety subjects, compared to low-anxiety subjects, showed a general bias in visual attention toward the left space as a function of emotional valence. Results are discussed within the framework of valence-dependent hemispheric specialization and the relative degree of hemispheric activation. In sum, our data highlight the relevance of considering individual emotional differences when studying the pseudoneglect phenomenon.
Affiliation(s)
- Stefania Righi
- Department of Neurofarba, University of Florence, Florence, Italy
- Viola Benedetti
- Department of Neurofarba, University of Florence, Florence, Italy
- Fiorenza Giganti
- Department of Neurofarba, University of Florence, Florence, Italy
- Greta Raduazzo
- Department of Neurofarba, University of Florence, Florence, Italy
6
Impact of face outline, parafoveal feature number and feature type on early face perception in a gaze-contingent paradigm: A mass-univariate re-analysis of ERP data. NeuroImage: Reports 2022. DOI: 10.1016/j.ynirp.2022.100148.
7
Naumann S, Bayer M, Dziobek I. Preschoolers' Sensitivity to Negative and Positive Emotional Facial Expressions: An ERP Study. Front Psychol 2022; 13:828066. PMID: 35712205. PMCID: PMC9197498. DOI: 10.3389/fpsyg.2022.828066.
Abstract
The study examined processing differences for facial expressions (happy, angry, or neutral) and their repetition with early (P1, N170) and late (P3) event-related potentials (ERPs) in young children (N = 33). EEG was recorded while children observed sequentially presented pairs of facial expressions, which were either the same (repeated trials) or differed in their emotion (novel trials). We also correlated ERP amplitude differences with parental and child measures of socio-emotional competence (emotion recognition, empathy). P1 amplitudes were increased for angry and happy as compared to neutral expressions. We also detected larger P3 amplitudes for angry expressions as compared to happy or neutral expressions. Repetition effects were evident at early and late processing stages, marked by reduced P1 amplitudes for repeated vs. novel happy expressions but enhanced P3 amplitudes for repeated vs. novel facial expressions. N170 amplitudes were modulated neither by facial expressions nor by their repetition. None of the repetition effects were associated with measures of socio-emotional competence. Taken together, negative facial expressions led to increased neural activation at early and late processing stages, indicative of the enhanced saliency of potentially threatening stimuli in young children. Processing of repeated facial expressions appears to differ between early and late neural stages: reduced activation was detected at early processing stages, particularly for happy faces, indicative of efficient processing of the emotion most familiar within this age range. Contrary to our hypothesis, enhanced activity for repeated vs. novel expressions, independent of the particular emotion, was detected at later processing stages, which may be linked to the creation of new memory traces. Early and late repetition effects are discussed in light of developmental and perceptual differences as well as task-specific load.
Affiliation(s)
- Sandra Naumann
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany
- Department of Psychology, Institute of Life Sciences, Humboldt-Universität zu Berlin, Berlin, Germany
- Mareike Bayer
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany
- Department of Psychology, Institute of Life Sciences, Humboldt-Universität zu Berlin, Berlin, Germany
- Isabel Dziobek
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany
- Department of Psychology, Institute of Life Sciences, Humboldt-Universität zu Berlin, Berlin, Germany
8
Ren J, Yao Q, Tian M, Li F, Chen Y, Chen Q, Xiang J, Shi J. Altered effective connectivity in migraine patients during emotional stimuli: a multi-frequency magnetoencephalography study. J Headache Pain 2022; 23:6. PMID: 35032999. PMCID: PMC8903691. DOI: 10.1186/s10194-021-01379-4.
Abstract
BACKGROUND Migraine is a common and disabling primary headache associated with a wide range of psychiatric comorbidities, yet the mechanisms of emotion processing in migraine are not fully understood. The present study aimed to investigate the neural network engaged by neutral, positive, and negative emotional stimuli in migraine patients. METHODS A total of 24 migraine patients and 24 age- and sex-matched healthy controls were enrolled in this study. Neuromagnetic brain activity was recorded using a whole-head magnetoencephalography (MEG) system during exposure to human facial expression stimuli, and the MEG data were analyzed across multiple frequency ranges from 1 to 100 Hz. RESULTS The migraine patients exhibited a significant enhancement of effective connectivity from the prefrontal lobe to the temporal cortex during negative emotional stimuli in the gamma band (30-90 Hz). Graph theory analysis revealed that the migraine patients had an increased degree and clustering coefficient of connectivity in the delta band (1-4 Hz) upon exposure to positive emotional stimuli, and an increased degree of connectivity in the same band upon exposure to negative emotional stimuli. Clinical correlation analysis showed that disease history, attack frequency, attack duration, and neuropsychological scale scores of the migraine patients correlated negatively with network parameters in certain frequency ranges. CONCLUSIONS The results suggest that individuals with migraine show altered effective connectivity when viewing human facial expressions across multiple frequency bands. The prefrontal-temporal pathway might be related to altered negative emotional modulation in migraine, and migraine might be characterized by more broadly altered cerebral processing of negative stimuli. Since the significant results in this study were frequency-specific, independent replications are needed to confirm them and to elucidate the neurocircuitry underlying the association between migraine and emotional conditions.
Affiliation(s)
- Jing Ren
- Department of Neurology, The Affiliated Brain Hospital of Nanjing Medical University, Nanjing, 210029, Jiangsu, China
- Qun Yao
- Department of Neurology, The Affiliated Brain Hospital of Nanjing Medical University, Nanjing, 210029, Jiangsu, China
- Minjie Tian
- Department of Neurology, The Affiliated Brain Hospital of Nanjing Medical University, Nanjing, 210029, Jiangsu, China
- Feng Li
- Department of Neurology, The Affiliated Brain Hospital of Nanjing Medical University, Nanjing, 210029, Jiangsu, China
- Yueqiu Chen
- Department of Neurology, The Affiliated Brain Hospital of Nanjing Medical University, Nanjing, 210029, Jiangsu, China
- Qiqi Chen
- MEG Center, The Affiliated Brain Hospital of Nanjing Medical University, Nanjing, 210029, Jiangsu, China
- Jing Xiang
- MEG Center, Division of Neurology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, 45220, USA
- Jingping Shi
- Department of Neurology, The Affiliated Brain Hospital of Nanjing Medical University, Nanjing, 210029, Jiangsu, China
9
Human face and gaze perception is highly context specific and involves bottom-up and top-down neural processing. Neurosci Biobehav Rev 2021; 132:304-323. PMID: 34861296. DOI: 10.1016/j.neubiorev.2021.11.042.
Abstract
This review summarizes human perception and processing of face and gaze signals, which are important means of non-verbal social communication. The review highlights that: (1) some evidence suggests that the perception and processing of facial information starts in the prenatal period; (2) the perception and processing of face identity, expression, and gaze direction is highly context specific, the effects of race and culture being a case in point; through experiential shaping and social categorization, culture affects the way in which information on face and gaze is collected and perceived; (3) face and gaze processing occurs in the so-called 'social brain'. Accumulating evidence suggests that the processing of facial identity, facial emotional expression, and gaze involves two parallel and interacting pathways: a fast and crude subcortical route and a slower cortical pathway. The flow of information is bi-directional and includes bottom-up and top-down processing. The cortical networks particularly include the fusiform gyrus, superior temporal sulcus (STS), intraparietal sulcus, temporoparietal junction, and medial prefrontal cortex.
10
Matt S, Dzhelyova M, Maillard L, Lighezzolo-Alnot J, Rossion B, Caharel S. The rapid and automatic categorization of facial expression changes in highly variable natural images. Cortex 2021; 144:168-184. PMID: 34666300. DOI: 10.1016/j.cortex.2021.08.005.
Abstract
Emotional expressions are quickly and automatically read from human faces under natural viewing conditions. Yet, categorization of facial expressions is typically measured in experimental contexts with homogeneous sets of face stimuli. Here we evaluated how the six basic facial emotions (Fear, Disgust, Happiness, Anger, Surprise, and Sadness) can be rapidly and automatically categorized with faces varying in head orientation, lighting condition, identity, gender, age, ethnic origin, and background context. High-density electroencephalography was recorded in 17 participants viewing 50 s sequences of natural, variable images of neutral-expression faces alternating at a 6 Hz rate. Every fifth stimulus (1.2 Hz), variable natural images of one of the six basic expressions were presented. Despite the wide physical variability across images, a significant F/5 = 1.2 Hz response and its harmonics (e.g., 2F/5 = 2.4 Hz, etc.) were observed for all expression changes at the group level and in every individual participant. Facial expression categorization responses were found mainly over occipito-temporal sites, with distinct hemispheric lateralization and cortical topographies for the different expressions. Specifically, a stronger response was found for Sadness categorization, especially over the left hemisphere, as compared to Fear and Happiness, together with a right-hemispheric dominance for categorization of Fearful faces. Importantly, these differences were specific to upright faces, ruling out the contribution of low-level visual cues. Overall, these observations point to robust, rapid, and automatic facial expression categorization processes in the human brain.
Affiliation(s)
- Stéphanie Matt
- Université de Lorraine, 2LPN, Nancy, France; Université de Lorraine, Laboratoire INTERPSY, Nancy, France
- Milena Dzhelyova
- Université Catholique de Louvain, Institute of Research in Psychological Science, Louvain-la-Neuve, Belgium
- Louis Maillard
- Université de Lorraine, CNRS, CRAN, Nancy, France; Université de Lorraine, CHRU-Nancy, Service de Neurologie, Nancy, France
- Bruno Rossion
- Université Catholique de Louvain, Institute of Research in Psychological Science, Louvain-la-Neuve, Belgium; Université de Lorraine, CNRS, CRAN, Nancy, France; Université de Lorraine, CHRU-Nancy, Service de Neurologie, Nancy, France
- Stéphanie Caharel
- Université de Lorraine, 2LPN, Nancy, France; Institut Universitaire de France, Paris, France
11
The early processing of fearful and happy facial expressions is independent of task demands - Support from mass univariate analyses. Brain Res 2021; 1765:147505. PMID: 33915164. DOI: 10.1016/j.brainres.2021.147505.
Abstract
Most ERP studies on facial expressions of emotion have yielded inconsistent results regarding the time course of emotion effects and their possible modulation by task demands. Most have used classical statistical methods with a high likelihood of type I and type II errors, a likelihood that can be reduced with mass univariate statistics. FMUT and LIMO are currently the only two available toolboxes for mass univariate analysis of ERP data, and they rely on different underlying statistics; yet no direct comparison of their output has been performed on the same dataset. Given the current push to transition to robust statistics to increase the replicability of results, here we compared the output of these toolboxes on data previously analyzed using classic approaches (Itier & Neath-Tavares, 2017). The early (0-352 ms) processing of fearful, happy, and neutral faces was investigated under three tasks in a within-subject design that also controlled gaze fixation location. Both toolboxes revealed main effects of emotion and task but neither yielded an interaction between the two, confirming that the early processing of fearful and happy expressions is largely independent of task demands. Both toolboxes found virtually no difference between neutral and happy expressions, while fearful (compared to neutral and happy) expressions modulated the N170 and EPN but elicited maximum effects after the N170 peak, around 190 ms. Similarities and differences in the spatial and temporal extent of these effects are discussed in comparison to the published classical analysis and the rest of the ERP literature.
12
Mazzi C, Massironi G, Sanchez-Lopez J, De Togni L, Savazzi S. Face Recognition Deficits in a Patient With Alzheimer's Disease: Amnesia or Agnosia? The Importance of Electrophysiological Markers for Differential Diagnosis. Front Aging Neurosci 2021; 12:580609. PMID: 33408626. PMCID: PMC7779478. DOI: 10.3389/fnagi.2020.580609.
Abstract
Face recognition deficits are frequently reported in Alzheimer's disease (AD) and often attributed to memory impairment. However, it has been hypothesized that failure to identify familiar people could also be due to deficits in higher-level perceptual processes, since there is evidence of a reduced inversion effect for faces but not for cars in AD. To address the involvement of these higher processes, we investigated event-related potential (ERP) neural correlates of face processing in a patient with AD (MCG) showing a face recognition deficit; eight healthy participants were tested as a control group. Participants performed different tasks following stimulus presentation. In Experiment 1, they indicated whether the stimulus was a face, a house, or a scrambled image. In Experiments 2 and 3, they discriminated between upright and inverted faces (in Experiment 2, stimuli were faces with neutral or fearful expressions; in Experiment 3, stimuli were famous or unfamiliar faces). The electrophysiological results reveal that the typical face-specific modulation of the N170 component, thought to reflect the structural encoding of faces, was not present in patient MCG, although the component was still affected by the emotional content of the face, which MCG processed implicitly. Conversely, the N400 component, thought to reflect the recruitment of the memory trace of face identity, was found to be implicitly modulated in MCG. These results point to a possible role for gnosic processes in face recognition deficits in AD and suggest the importance of adopting an integrated approach to AD diagnosis that considers electrophysiological markers.
Affiliation(s)
- Chiara Mazzi
- Perception and Awareness (PandA) Lab, University of Verona, Verona, Italy
- Department of Neuroscience, Biomedicine and Movement Sciences, University of Verona, Verona, Italy
- Gloria Massironi
- Center for Cognitive Decline and Dementia, ULSS 9 Scaligera, Verona, Italy
- Javier Sanchez-Lopez
- Centro de Investigacion en Ciencias Cognitivas, Universidad Autonoma del Estado de Morelos, Cuernavaca, Mexico
- Laura De Togni
- Center for Cognitive Decline and Dementia, ULSS 9 Scaligera, Verona, Italy
- Silvia Savazzi
- Perception and Awareness (PandA) Lab, University of Verona, Verona, Italy
- Department of Neuroscience, Biomedicine and Movement Sciences, University of Verona, Verona, Italy
13
Turano MT, Giganti F, Gavazzi G, Lamberto S, Gronchi G, Giovannelli F, Peru A, Viggiano MP. Spatially Filtered Emotional Faces Dominate during Binocular Rivalry. Brain Sci 2020; 10:998. PMID: 33348612. PMCID: PMC7767193. DOI: 10.3390/brainsci10120998.
Abstract
The present investigation explores the role of bottom-up and top-down factors in the recognition of emotional facial expressions during binocular rivalry. We manipulated spatial frequency (SF) and emotive features and asked subjects to indicate whether the emotional or the neutral expression was dominant during binocular rivalry. Controlling bottom-up saliency with a computational model, physically comparable happy and fearful faces were presented dichoptically with neutral faces. The results showed the dominance of emotional faces over neutral ones. In particular, happy faces were reported more frequently as the first dominant percept even in the presence of only coarse information (at a low SF level: 2-6 cycles/degree). In line with current theories of emotion processing, the results provide further support for the influence of positive compared to negative meaning on binocular rivalry and, for the first time, show that individuals perceive the affective quality of happiness even in the absence of details in the visual display. Furthermore, our findings advance knowledge of the association between the high- and low-level mechanisms behind binocular rivalry.
Affiliation(s)
- Maria Teresa Turano
- Department of Neuroscience, Psychology, Drug Research & Child’s Health, University of Florence, 50100 Florence, Italy
- Fondazione Turano Onlus, 00195 Roma, Italy
- Fiorenza Giganti
- Department of Neuroscience, Psychology, Drug Research & Child’s Health, University of Florence, 50100 Florence, Italy
- Gioele Gavazzi
- Diagnostic and Nuclear Research Institute, IRCCS SDN, 80121 Napoli, Italy
- Simone Lamberto
- Department of Neuroscience, Psychology, Drug Research & Child’s Health, University of Florence, 50100 Florence, Italy
- Giorgio Gronchi
- Department of Neuroscience, Psychology, Drug Research & Child’s Health, University of Florence, 50100 Florence, Italy
- Fabio Giovannelli
- Department of Neuroscience, Psychology, Drug Research & Child’s Health, University of Florence, 50100 Florence, Italy
- Andrea Peru
- Department of Neuroscience, Psychology, Drug Research & Child’s Health, University of Florence, 50100 Florence, Italy
- Maria Pia Viggiano
- Department of Neuroscience, Psychology, Drug Research & Child’s Health, University of Florence, 50100 Florence, Italy
- Correspondence: Tel. +39-0552755053
14
|
Consistent behavioral and electrophysiological evidence for rapid perceptual discrimination among the six human basic facial expressions. Cogn Affect Behav Neurosci 2020; 20:928-948. [PMID: 32918269 DOI: 10.3758/s13415-020-00811-7] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/23/2023]
Abstract
The extent to which the six basic human facial expressions perceptually differ from one another remains controversial. For instance, despite the importance of rapidly decoding fearful faces, this expression is often confused with others, such as surprise, in explicit behavioral categorization tasks. We quantified implicit visual discrimination among rapidly presented facial expressions with an oddball periodic visual stimulation approach combined with electroencephalography (EEG), testing for the relationship with explicit behavioral measures of facial emotion discrimination. We report robust facial expression discrimination responses bilaterally over the occipito-temporal cortex for each pairwise expression change. While fearful faces presented as repeated stimuli led to the smallest deviant responses from all other basic expressions, deviant fearful faces were well discriminated overall, and to a larger extent than expressions of sadness and anger. Expressions of happiness did not differ quantitatively as much in EEG as in behavioral subjective judgments, suggesting that the clear dissociation between happy and other expressions typically observed in behavioral studies reflects higher-order processes. However, this expression differed from all others in terms of scalp topography, pointing to a qualitative rather than quantitative difference. Despite this difference, overall, we report for the first time a tight relationship between the similarity matrices across facial expressions obtained from implicit EEG responses and explicit behavioral measures collected under the same temporal constraints, paving the way for new approaches to understanding facial expression discrimination in developmental, intercultural, and clinical populations.
|
15
|
Roberge A, Duncan J, Fiset D, Brisson B. Dual-Task Interference on Early and Late Stages of Facial Emotion Detection Is Revealed by Human Electrophysiology. Front Hum Neurosci 2019; 13:391. [PMID: 31780912 PMCID: PMC6856761 DOI: 10.3389/fnhum.2019.00391] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/03/2019] [Accepted: 10/21/2019] [Indexed: 11/29/2022] Open
Abstract
Rapid and accurate processing of potential social threats is paramount to social thriving, and provides a clear evolutionary advantage. Though automatic processing of facial expressions has been assumed for some time, some researchers now question the extent to which this is the case. Here, we provide electrophysiological data from a psychological refractory period (PRP) dual-task paradigm in which participants had to decide whether a target face exhibited a neutral or fearful expression, as overlap with a concurrent auditory tone categorization task was experimentally manipulated. Specifically, we focused on four event-related potentials (ERP) linked to emotional face processing, covering distinct processing stages and topography: the early posterior negativity (EPN), early frontal positivity (EFP), late positive potential (LPP), and also the face-sensitive N170. As expected, there was an emotion modulation of each ERP. Most importantly, there was a significant attenuation of this emotional response proportional to the degree of task overlap for each component, except the N170. In fact, when the central overlap was greatest, this emotion-specific amplitude was statistically null for the EFP and LPP, and only marginally different from zero for the EPN. N170 emotion modulation was, on the other hand, unaffected by central overlap. Thus, our results show that emotion-specific ERPs for three out of four processing stages—i.e., perceptual encoding (EPN), emotion detection (EFP), or content evaluation (LPP)—are attenuated and even eliminated by central resource scarcity. Models assuming automatic processing should be revised to account for these results.
Affiliation(s)
- Amélie Roberge
- Département de Psychologie, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada
- Département de Psychoéducation et de Psychologie, Université du Québec en Outaouais, Gatineau, QC, Canada
- Justin Duncan
- Département de Psychoéducation et de Psychologie, Université du Québec en Outaouais, Gatineau, QC, Canada
- Département de Psychologie, Université du Québec à Montréal, Montreal, QC, Canada
- Daniel Fiset
- Département de Psychoéducation et de Psychologie, Université du Québec en Outaouais, Gatineau, QC, Canada
- Benoit Brisson
- Département de Psychologie, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada
- Correspondence: Benoit Brisson
|
16
|
Burra N, Kerzel D. Task Demands Modulate Effects of Threatening Faces on Early Perceptual Encoding. Front Psychol 2019; 10:2400. [PMID: 31708839 PMCID: PMC6821787 DOI: 10.3389/fpsyg.2019.02400] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/30/2019] [Accepted: 10/08/2019] [Indexed: 12/03/2022] Open
Abstract
The threat capture hypothesis states that threatening stimuli are automatically processed with higher priority than non-threatening stimuli, irrespective of observer intentions or focus of attention. We evaluated the threat capture hypothesis with respect to the early perceptual stages of face processing. We focused on an electrophysiological marker of face processing (the lateralized N170) in response to neutral, happy, and angry facial expressions displayed in competition with a non-face stimulus (a house). We evaluated how effects of facial expression on the lateralized N170 were modulated by task demands. In the gender task, participants were required to identify the gender of the face, which made the face task-relevant and entailed structural encoding of the face stimulus. In the pixel task, participants identified the location of a missing pixel in the fixation cross, which made the face task-irrelevant and placed it outside the focus of attention. When faces were relevant, the lateralized N170 to angry faces was enhanced compared to happy and neutral faces. When faces were irrelevant, facial expression had no effect. These results reveal the critical role of task demands in the preference for threatening faces, indicating that top-down, voluntary processing modulates the prioritization of threat.
Affiliation(s)
- Nicolas Burra
- Faculté de Psychologie et des Sciences de l'Éducation, Université de Genève, Geneva, Switzerland
- Dirk Kerzel
- Faculté de Psychologie et des Sciences de l'Éducation, Université de Genève, Geneva, Switzerland
|
17
|
de Lissa P, McArthur G, Hawelka S, Palermo R, Mahajan Y, Degno F, Hutzler F. Peripheral preview abolishes N170 face-sensitivity at fixation: Using fixation-related potentials to investigate dynamic face processing. Vis Cogn 2019. [DOI: 10.1080/13506285.2019.1676855] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/25/2022]
Affiliation(s)
- Peter de Lissa
- iBMLab, Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Department of Cognitive Science, Macquarie University, Sydney, Australia
- Genevieve McArthur
- Department of Cognitive Science, Macquarie University, Sydney, Australia
- Stefan Hawelka
- Centre for Cognitive Neuroscience, Salzburg University, Salzburg, Austria
- Romina Palermo
- School of Psychological Science, University of Western Australia, Perth, Australia
- Yatin Mahajan
- The MARCS Institute, University of Western Sydney, Australia
- Federica Degno
- School of Psychology, University of Central Lancashire, Preston, UK
- Florian Hutzler
- Centre for Cognitive Neuroscience, Salzburg University, Salzburg, Austria
|
18
|
Stoll C, Rodger H, Lao J, Richoz AR, Pascalis O, Dye M, Caldara R. Quantifying Facial Expression Intensity and Signal Use in Deaf Signers. J Deaf Stud Deaf Educ 2019; 24:346-355. [PMID: 31271428 DOI: 10.1093/deafed/enz023] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/09/2018] [Revised: 04/30/2019] [Accepted: 05/03/2019] [Indexed: 06/09/2023]
Abstract
We live in a world of rich dynamic multisensory signals. Hearing individuals rapidly and effectively integrate multimodal signals to decode biologically relevant facial expressions of emotion. Yet, it remains unclear how facial expressions are decoded by deaf adults in the absence of an auditory sensory channel. We thus compared early and profoundly deaf signers (n = 46) with hearing nonsigners (n = 48) on a psychophysical task designed to quantify their recognition performance for the six basic facial expressions of emotion. Using neutral-to-expression image morphs and noise-to-full signal images, we quantified the intensity and signal levels required by observers to achieve expression recognition. Using Bayesian modeling, we found that deaf observers require more signal and intensity to recognize disgust, while reaching comparable performance for the remaining expressions. Our results provide a robust benchmark for the intensity and signal use in deafness and novel insights into the differential coding of facial expressions of emotion between hearing and deaf individuals.
Affiliation(s)
- Chloé Stoll
- Laboratoire de Psychologie et de Neurocognition (CNRS-UMR5105), Université Grenoble-Alpes
- Laboratory for Investigative Neurophysiology, Centre Hospitalier Universitaire Vaudois and University of Lausanne
- Helen Rodger
- Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg
- Junpeng Lao
- Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg
- Anne-Raphaëlle Richoz
- Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg
- Olivier Pascalis
- Laboratoire de Psychologie et de Neurocognition (CNRS-UMR5105), Université Grenoble-Alpes
- Matthew Dye
- National Technical Institute for the Deaf, Rochester Institute of Technology
- Roberto Caldara
- Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg
|
19
|
Zhu C, Yin M, Chen X, Zhang J, Liu D. Ecological micro-expression recognition characteristics of young adults with subthreshold depression. PLoS One 2019; 14:e0216334. [PMID: 31042784 PMCID: PMC6493753 DOI: 10.1371/journal.pone.0216334] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/02/2018] [Accepted: 04/18/2019] [Indexed: 11/19/2022] Open
Abstract
The micro-expression (ME) processing characteristics of patients with depression have been studied but have not been investigated in people with subthreshold depression. Accordingly, this study adopted the ecological ME recognition paradigm to explore ME recognition in people with subthreshold depression. A 4 (background expression: happy, neutral, sad, and fearful) × 4 (ME: happy, neutral, sad, and fearful) design was used; two groups of participants (an experimental group with subthreshold depression and a healthy control group, 32 participants each) completed the ecological ME recognition task, and the corresponding accuracy (ACC) and reaction time (RT) were analyzed. Results: (1) Under all background conditions, happy MEs were recognized with the highest ACC and the shortest RT. (2) There was no significant difference in ACC or RT between the experimental and control groups. (3) Across contexts, individuals with subthreshold depression tended to misjudge neutral, sad, and fearful MEs as happy, while neutral MEs were also misjudged as sad or fearful. (4) The performance of individuals with subthreshold depression on the ecological ME recognition task was influenced by ME type: they showed the highest ACC and shortest RT when recognizing happy MEs (vs. the other MEs). Conclusions: (1) Ecological ME recognition was influenced by the background expression, underscoring the need for ecological ME recognition paradigms. (2) Individuals with subthreshold depression showed normal ecological ME recognition ability. (3) In terms of misjudgments, individuals with subthreshold depression showed both positive and negative biases when completing the ecological ME recognition task. (4) Compared with the other MEs, happy MEs showed a recognition advantage for individuals with subthreshold depression completing the ecological ME recognition task.
Affiliation(s)
- Chuanlin Zhu
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu, China
- Ming Yin
- Department of Criminal Investigation, Jiangsu Police Institute, Nanjing, Jiangsu, China
- Xinyun Chen
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu, China
- Jianxin Zhang
- School of Humanities, Jiangnan University, Wuxi, Jiangsu, China
- Dianzhi Liu
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu, China
|
20
|
Smith FW, Smith ML. Decoding the dynamic representation of facial expressions of emotion in explicit and incidental tasks. Neuroimage 2019; 195:261-271. [PMID: 30940611 DOI: 10.1016/j.neuroimage.2019.03.065] [Citation(s) in RCA: 24] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/17/2018] [Revised: 03/18/2019] [Accepted: 03/27/2019] [Indexed: 11/24/2022] Open
Abstract
Faces transmit a wealth of important social signals. While previous studies have elucidated the network of cortical regions important for perception of facial expression, and the associated temporal components such as the P100, N170 and EPN, it is still unclear how task constraints may shape the representation of facial expression (or other face categories) in these networks. In the present experiment, we used Multivariate Pattern Analysis (MVPA) with EEG to investigate the neural information available across time about two important face categories (expression and identity) when those categories are perceived under either explicit (e.g. decoding facial expression category from the EEG when the task is on expression) or incidental task contexts (e.g. decoding facial expression category from the EEG when the task is on identity). Decoding of both face categories, across both task contexts, peaked in time-windows spanning 91-170 ms (across posterior electrodes). Peak decoding of expression, however, was not affected by task context, whereas peak decoding of identity was significantly reduced under incidental processing conditions. In addition, errors in EEG decoding correlated with errors in behavioral categorization under explicit processing for both expression and identity; under incidental conditions, however, only errors in EEG decoding of expression correlated with behavior. Furthermore, decoding time-courses and the spatial pattern of informative electrodes showed consistently better decoding of identity under explicit conditions at later time periods, with weak evidence for similar effects for decoding of expression at isolated time-windows. Taken together, these results reveal differences and commonalities in the processing of face categories under explicit vs. incidental task contexts and suggest that facial expressions are processed to a richer degree under incidental processing conditions, consistent with prior work indicating the relative automaticity with which emotion is processed. Our work further demonstrates the utility of applying multivariate decoding analyses to EEG for revealing the dynamics of face perception.
Affiliation(s)
- Fraser W Smith
- School of Psychology, University of East Anglia, Norwich, UK
- Marie L Smith
- School of Psychological Sciences, Birkbeck College, University of London, London, UK
|
21
|
Szabó E, Galambos A, Kocsel N, Édes AE, Pap D, Zsombók T, Kozák LR, Bagdy G, Kökönyei G, Juhász G. Association between migraine frequency and neural response to emotional faces: An fMRI study. Neuroimage Clin 2019; 22:101790. [PMID: 31146320 PMCID: PMC6462777 DOI: 10.1016/j.nicl.2019.101790] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 09/20/2018] [Revised: 02/12/2019] [Accepted: 03/23/2019] [Indexed: 01/03/2023]
Abstract
Previous studies have demonstrated that migraine is associated with enhanced perception and altered cerebral processing of sensory stimuli. More recently, it has been suggested that this sensory hypersensitivity might reflect a more general enhanced response to aversive emotional stimuli. Using functional magnetic resonance imaging and emotional face stimuli (fearful, happy and sad faces), we compared whole-brain activation between 41 migraine patients without aura in the interictal period and 49 healthy controls. Migraine patients showed increased neural activation to fearful faces compared to neutral faces in the right middle frontal gyrus and frontal pole relative to healthy controls. We also found that higher attack frequency in migraine patients was related to increased activation mainly in the right primary somatosensory cortex (corresponding to the face area) to fearful expressions and in the right dorsal striatal regions to happy faces. In both analyses, activation differences remained significant after controlling for anxiety and depressive symptoms. These findings indicate that an enhanced response to emotional stimuli might explain the migraine-triggering effect of psychosocial stressors, which gradually leads to an increased somatosensory response to emotional cues and thus contributes to the progression or chronification of migraine.
Highlights
- First fMRI study to explore neural response to emotional faces in migraine patients
- Migraine patients showed increased activation to fear in the right frontal regions
- Migraine frequency was related to enhanced activation to fearful and happy faces
- Activation in the right S1 and dorsal striatum was linked to migraine frequency
- Sensitivity to emotional stimuli might have a role in triggering migraine
Affiliation(s)
- Edina Szabó
- Doctoral School of Psychology, ELTE Eötvös Loránd University, Izabella street 46, H-1064 Budapest, Hungary; Institute of Psychology, ELTE Eötvös Loránd University, Izabella street 46, H-1064 Budapest, Hungary; MTA-SE Neuropsychopharmacology and Neurochemistry Research Group, Hungarian Academy of Sciences, Semmelweis University, Üllői Street 26, H-1085 Budapest, Hungary
- Attila Galambos
- Doctoral School of Psychology, ELTE Eötvös Loránd University, Izabella street 46, H-1064 Budapest, Hungary; Institute of Psychology, ELTE Eötvös Loránd University, Izabella street 46, H-1064 Budapest, Hungary; MTA-SE Neuropsychopharmacology and Neurochemistry Research Group, Hungarian Academy of Sciences, Semmelweis University, Üllői Street 26, H-1085 Budapest, Hungary
- Natália Kocsel
- Doctoral School of Psychology, ELTE Eötvös Loránd University, Izabella street 46, H-1064 Budapest, Hungary; Institute of Psychology, ELTE Eötvös Loránd University, Izabella street 46, H-1064 Budapest, Hungary; SE-NAP2 Genetic Brain Imaging Migraine Research Group, Hungarian Academy of Sciences, Semmelweis University, Üllői Street 26, H-1085 Budapest, Hungary; Department of Pharmacodynamics, Faculty of Pharmacy, Semmelweis University, Nagyvárad square 4, H-1089 Budapest, Hungary
- Andrea Edit Édes
- SE-NAP2 Genetic Brain Imaging Migraine Research Group, Hungarian Academy of Sciences, Semmelweis University, Üllői Street 26, H-1085 Budapest, Hungary; Department of Pharmacodynamics, Faculty of Pharmacy, Semmelweis University, Nagyvárad square 4, H-1089 Budapest, Hungary
- Dorottya Pap
- Department of Neurology, Faculty of Medicine, Semmelweis University, Balassa street 6, H-1083 Budapest, Hungary
- Terézia Zsombók
- MR Research Center, Semmelweis University, Balassa street 6, H-1083 Budapest, Hungary
- Lajos Rudolf Kozák
- Neuroscience and Psychiatry Unit, The University of Manchester and Manchester Academic Health Sciences Centre, Stopford Building, Oxford Road, Manchester, United Kingdom
- György Bagdy
- MTA-SE Neuropsychopharmacology and Neurochemistry Research Group, Hungarian Academy of Sciences, Semmelweis University, Üllői Street 26, H-1085 Budapest, Hungary; Department of Pharmacodynamics, Faculty of Pharmacy, Semmelweis University, Nagyvárad square 4, H-1089 Budapest, Hungary
- Gyöngyi Kökönyei
- Institute of Psychology, ELTE Eötvös Loránd University, Izabella street 46, H-1064 Budapest, Hungary; SE-NAP2 Genetic Brain Imaging Migraine Research Group, Hungarian Academy of Sciences, Semmelweis University, Üllői Street 26, H-1085 Budapest, Hungary; Department of Pharmacodynamics, Faculty of Pharmacy, Semmelweis University, Nagyvárad square 4, H-1089 Budapest, Hungary
- Gabriella Juhász
- SE-NAP2 Genetic Brain Imaging Migraine Research Group, Hungarian Academy of Sciences, Semmelweis University, Üllői Street 26, H-1085 Budapest, Hungary; Department of Pharmacodynamics, Faculty of Pharmacy, Semmelweis University, Nagyvárad square 4, H-1089 Budapest, Hungary; Neuroscience and Psychiatry Unit, The University of Manchester and Manchester Academic Health Sciences Centre, Stopford Building, Oxford Road, Manchester, United Kingdom
|
22
|
Richoz AR, Lao J, Pascalis O, Caldara R. Tracking the recognition of static and dynamic facial expressions of emotion across the life span. J Vis 2018; 18:5. [PMID: 30208425 DOI: 10.1167/18.9.5] [Citation(s) in RCA: 25] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022] Open
Abstract
The effective transmission and decoding of dynamic facial expressions of emotion is omnipresent and critical for adapted social interactions in everyday life. Common intuition would therefore suggest an advantage for dynamic facial expression recognition (FER) over the static snapshots routinely used in most experiments. However, although many studies have reported an advantage in the recognition of dynamic over static expressions in clinical populations, results obtained from healthy participants are mixed. To clarify this issue, we conducted a large cross-sectional study investigating FER across the life span to determine whether age is a critical factor accounting for such discrepancies. More than 400 observers (age range 5-96) performed recognition tasks of the six basic expressions in static, dynamic, and shuffled (temporally randomized frames) conditions, normalized for the amount of energy sampled over time. We applied a Bayesian hierarchical step-linear model to capture the nonlinear relationship between age and FER for the different viewing conditions. While replicating the typical accuracy profiles of FER, we determined the age at which peak efficiency was reached for each expression and found greater accuracy for most dynamic expressions across the life span. This advantage in the elderly population was driven by a significant decrease in performance for static images, which was twice as large as that observed in young adults. Our data suggest that dynamic stimuli are critical in the assessment of FER in the elderly population, inviting caution when drawing conclusions from the sole use of static face images to this aim.
Affiliation(s)
- Anne-Raphaëlle Richoz
- Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Fribourg, Switzerland; LPNC, University of Grenoble Alpes, Grenoble, France
- Junpeng Lao
- Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Roberto Caldara
- Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Fribourg, Switzerland
|