1. Itier RJ, Durston AJ. Mass-univariate analysis of scalp ERPs reveals large effects of gaze fixation location during face processing that only weakly interact with face emotional expression. Sci Rep 2023; 13:17022. [PMID: 37813928 PMCID: PMC10562468 DOI: 10.1038/s41598-023-44355-5]
Abstract
Decoding others' facial expressions is critical for social functioning. To clarify the neural correlates of expression perception depending on where we look on the face, three combined gaze-contingent ERP experiments were analyzed using robust mass-univariate statistics. Regardless of task, fixation location impacted face processing from 50 to 350 ms, maximally around 120 ms, reflecting retinotopic mapping around C2 and P1 components. Fixation location also strongly impacted the N170-P2 interval, while only weak effects were seen at the face-sensitive N170 peak. Results question the widespread assumption that faces are processed holistically into an indecomposable perceptual whole around the N170. Rather, face processing is a complex and view-dependent process that continues well beyond the N170. Expression and fixation location interacted weakly during the P1-N170 interval, supporting a role for the mouth and left eye in fearful and happy expression decoding. Expression effects were weakest at the N170 peak but strongest around P2, especially for fear, reflecting task-independent affective processing. Results suggest the N170 reflects a transition between processes rather than the maximum of a holistic face processing stage. Focus on this peak should be replaced by data-driven analyses of the epoch using robust statistics to fully unravel the early visual processing of faces and their affective content.
Affiliation(s)
- Roxane J Itier: Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada.
- Amie J Durston: Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada.
2. Shi Y, Kang J, Sommer W, Cao X. The development of processing second-order spatial relations of faces in Chinese preschoolers. J Exp Child Psychol 2023; 232:105678. [PMID: 37004264 DOI: 10.1016/j.jecp.2023.105678]
Abstract
Second-order relational information processing is the perception of the relative distance between facial features. Previous studies ignored the effect of different spatial manipulations on second-order sensitivity in face processing, and little is known about its developmental trajectory in East Asian populations, who have stronger holistic face processing than Western populations. We addressed these gaps in the literature through an experiment with four groups of Chinese preschool children (aged 3-6 years; n = 157) and a group of adults (n = 25). The participants were presented with face pairs displaying features with various spatial distance manipulations (Change 1: changes in the spacing between eyes; Change 2: nose-mouth spacing changes; Change 3: a combination of Changes 1 and 2) using a simultaneous two-alternative forced-choice task. Second-order sensitivity was already present in 3-year-old children across all manipulations and became more pronounced in 4-year-old children. Second-order sensitivity to the spatial distance between the eyes (i.e., Changes 1 and 3) among 4-year-olds was higher than that of 3-year-olds and was similar to that of adults, suggesting a key increase of this sensitivity from 3 to 4 years of age. Regarding the Change 2 condition, preschoolers aged 5 and 6 years had higher sensitivity than 3-year-olds; however, all preschoolers' sensitivity was inferior to that of adults. These findings show that the development of Chinese preschoolers' sensitivity for detecting spatial relations between the eyes might be faster than that for detecting nose-mouth spacing, supporting the importance of eyes in face processing.
3. Bruchmann M, Mertens L, Schindler S, Straube T. Potentiated early neural responses to fearful faces are not driven by specific face parts. Sci Rep 2023; 13:4613. [PMID: 36944705 PMCID: PMC10030637 DOI: 10.1038/s41598-023-31752-z]
Abstract
Prioritized processing of fearful compared to neutral faces is reflected in increased amplitudes of components of the event-related potential (ERP). It is unknown whether specific face parts drive these modulations. Here, we investigated the contributions of face parts to ERPs elicited by task-irrelevant fearful and neutral faces, using an ERP-dependent facial decoding technique and a large sample of participants (N = 83). Classical ERP analyses showed typical and robust increases of N170 and EPN amplitudes by fearful relative to neutral faces. Facial decoding further showed that the absolute amplitude of these components, as well as the P1, was driven by the low-frequency contrast of specific face parts. However, the difference between fearful and neutral faces was not driven by any specific face part, as supported by Bayesian statistics. Furthermore, there were no correlations between trait anxiety and main effects or interactions. These results suggest that increased N170 and EPN amplitudes to task-irrelevant fearful compared to neutral faces are not driven by specific facial regions but represent a holistic face processing effect.
Affiliation(s)
- Maximilian Bruchmann: Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Von-Esmarch-Str. 52, 48149, Münster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany.
- Léa Mertens: Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Von-Esmarch-Str. 52, 48149, Münster, Germany.
- Sebastian Schindler: Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Von-Esmarch-Str. 52, 48149, Münster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany.
- Thomas Straube: Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Von-Esmarch-Str. 52, 48149, Münster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany.
4. Effect of perceived eye gaze on the N170 component – A systematic review. Neurosci Biobehav Rev 2022; 143:104913. [DOI: 10.1016/j.neubiorev.2022.104913]
5. Impact of face outline, parafoveal feature number and feature type on early face perception in a gaze-contingent paradigm: A mass-univariate re-analysis of ERP data. NeuroImage: Reports 2022. [DOI: 10.1016/j.ynirp.2022.100148]
6. Li S, Ding R, Zhao D, Zhou X, Zhan B, Luo W. Processing of emotions expressed through eye regions attenuates attentional blink. Int J Psychophysiol 2022; 182:1-11. [DOI: 10.1016/j.ijpsycho.2022.07.010]
7. Coss RG, Charles EP. The Saliency of Snake Scales and Leopard Rosettes to Infants: Its Relevance to Graphical Patterns Portrayed in Prehistoric Art. Front Psychol 2021; 12:763436. [PMID: 34880813 PMCID: PMC8645795 DOI: 10.3389/fpsyg.2021.763436]
Abstract
Geometrically arranged spots and crosshatched incised lines are frequently portrayed in prehistoric cave and mobiliary art. Two experiments examined the saliency to infants of snake scales and leopard rosettes, patterns perceptually analogous to this art. Experiment 1 examined the investigative behavior of 23 infants at three daycare facilities. Four plastic jars (15 × 14.5 cm) with snake scales, leopard rosettes, geometric plaid, and plain patterns printed on yellowish-orange paper inside were placed individually on the floor on separate days during playtime. Fourteen 7–15-month-old infants approached each jar hesitantly and poked it before handling it five times, the criterion selected for statistical analyses of poking frequency. The jars with snake scales and leopard rosettes yielded reliably higher poking frequencies than the geometric plaid and plain jars. The second experiment examined the gaze and grasping behavior of 15 infants (spanning 5 months of age) seated on their mothers' laps in front of a table. For paired comparisons, the experimenter pushed two of four upright plastic cylinders (13.5 × 5.5 cm) with virtually the same colored patterns simultaneously toward each infant for 6 s. Video recordings indicated that infants gazed significantly longer at the cylinders with snake scales and leopard rosettes than at the geometric plaid and plain cylinders prior to grasping them. Logistic regression of gaze duration predicting cylinder choice for grasping indicated that seven of 24 paired comparisons were not significant, all of which involved choices of cylinders with snake scales and leopard rosettes that diverted attention before reaching. Evidence that these biological patterns are salient to infants during an early period of brain development might characterize the integration of subcortical and neocortical visual processes known to be involved in snake recognition. In older individuals, memorable encounters with snakes and leopards, coupled with the saliency of snake scales and leopard rosettes, possibly biased artistic renditions of similar patterns during prehistoric times.
Affiliation(s)
- Richard G Coss: Psychology Department, University of California, Davis, Davis, CA, United States.
- Eric P Charles: Psychology Department, University of California, Davis, Davis, CA, United States.
8. Human face and gaze perception is highly context specific and involves bottom-up and top-down neural processing. Neurosci Biobehav Rev 2021; 132:304-323. [PMID: 34861296 DOI: 10.1016/j.neubiorev.2021.11.042]
Abstract
This review summarizes human perception and processing of face and gaze signals, which are important means of non-verbal social communication. The review highlights that: (1) some evidence suggests that the perception and processing of facial information starts in the prenatal period; (2) the perception and processing of face identity, expression and gaze direction is highly context specific, the effect of race and culture being a case in point. Through experiential shaping and social categorization, culture affects the way in which information on face and gaze is collected and perceived; (3) face and gaze processing occurs in the so-called 'social brain'. Accumulating evidence suggests that the processing of facial identity, facial emotional expression and gaze involves two parallel and interacting pathways: a fast and crude subcortical route and a slower cortical pathway. The flow of information is bi-directional and includes bottom-up and top-down processing. The cortical networks particularly include the fusiform gyrus, superior temporal sulcus (STS), intraparietal sulcus, temporoparietal junction and medial prefrontal cortex.
9. Hudson A, Durston AJ, McCrackin SD, Itier RJ. Emotion, Gender and Gaze Discrimination Tasks do not Differentially Impact the Neural Processing of Angry or Happy Facial Expressions-a Mass Univariate ERP Analysis. Brain Topogr 2021; 34:813-833. [PMID: 34596796 DOI: 10.1007/s10548-021-00873-x]
Abstract
Facial expression processing is a critical component of social cognition, yet whether it is influenced by task demands at the neural level remains controversial. Past ERP studies have found mixed results with classic statistical analyses, known to increase both Type I and Type II errors, which Mass Univariate statistics (MUS) control better. However, MUS open-access toolboxes can use different fundamental statistics, which may lead to inconsistent results. Here, we compared the output of two MUS toolboxes, LIMO and FMUT, on the same data recorded during the processing of angry and happy facial expressions investigated under three tasks in a within-subjects design. Both toolboxes revealed main effects of emotion during the N170 timing and main effects of task during later time points typically associated with the LPP component. Neither toolbox yielded an interaction between the two factors at the group level, nor at the individual level in LIMO, confirming that the neural processing of these two face expressions is largely independent from task demands. Behavioural data revealed main effects of task on reaction time and accuracy, but no influence of expression or an interaction between the two. Expression processing and task demands are discussed in the context of the consistencies and discrepancies between the two toolboxes and the existing literature.
Affiliation(s)
- Anna Hudson: Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada.
- Amie J Durston: Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada.
- Sarah D McCrackin: Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada.
- Roxane J Itier: Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada.
10. McCrackin SD, Itier RJ. I can see it in your eyes: Perceived gaze direction impacts ERP and behavioural measures of affective theory of mind. Cortex 2021; 143:205-222. [PMID: 34455372 DOI: 10.1016/j.cortex.2021.05.024]
Abstract
Looking at someone's eyes is thought to be important for affective theory of mind (aTOM), our ability to infer their emotional state. However, it is unknown whether an individual's gaze direction influences our aTOM judgements and what the time course of this influence might be. We presented participants with sentences describing individuals in positive, negative or neutral scenarios, followed by direct or averted gaze neutral face pictures of those individuals. Participants made aTOM judgements about each person's mental state, including their affective valence and arousal, and we investigated whether the face gaze direction impacted those judgements. Participants rated that gazers were feeling more positive when they displayed direct gaze as opposed to averted gaze, and that they were feeling more aroused during negative contexts when gaze was averted as opposed to direct. Event-related potentials associated with face perception and affective processing were examined using mass-univariate analyses to track the time-course of this eye-gaze and affective processing interaction at a neural level. Both positive and negative trials were differentiated from neutral trials at many stages of processing. This included the early N200 and EPN components, believed to reflect automatic activation of emotion areas and attentional selection, respectively. This also included the later P300 and LPP components, thought to reflect elaborative cognitive appraisal of emotional content. Critically, sentence valence and gaze direction interacted over these later components, which may reflect the incorporation of eye-gaze in the cognitive evaluation of another's emotional state. The results suggest that gaze perception directly impacts aTOM processes, and that altered eye-gaze processing in clinical populations may contribute to associated aTOM impairments.
Affiliation(s)
- Roxane J Itier: Department of Psychology, University of Waterloo, Waterloo, Canada.
11. Stephenson LJ, Edwards SG, Bayliss AP. From Gaze Perception to Social Cognition: The Shared-Attention System. Perspect Psychol Sci 2021; 16:553-576. [PMID: 33567223 PMCID: PMC8114330 DOI: 10.1177/1745691620953773]
Abstract
When two people look at the same object in the environment and are aware of each other's attentional state, they find themselves in a shared-attention episode. This can occur through intentional or incidental signaling and, in either case, causes an exchange of information between the two parties about the environment and each other's mental states. In this article, we give an overview of what is known about the building blocks of shared attention (gaze perception and joint attention) and focus on bringing to bear new findings on the initiation of shared attention that complement knowledge about gaze following and incorporate new insights from research into the sense of agency. We also present a neurocognitive model, incorporating first-, second-, and third-order social cognitive processes (the shared-attention system, or SAS), building on previous models and approaches. The SAS model aims to encompass perceptual, cognitive, and affective processes that contribute to and follow on from the establishment of shared attention. These processes include fundamental components of social cognition such as reward, affective evaluation, agency, empathy, and theory of mind.
12. The early processing of fearful and happy facial expressions is independent of task demands - Support from mass univariate analyses. Brain Res 2021; 1765:147505. [PMID: 33915164 DOI: 10.1016/j.brainres.2021.147505]
Abstract
Most ERP studies on facial expressions of emotion have yielded inconsistent results regarding the time course of emotion effects and their possible modulation by task demands. Most studies have used classical statistical methods with a high likelihood of type I and type II errors, which Mass Univariate statistics can limit. FMUT and LIMO are currently the only two available toolboxes for Mass Univariate analysis of ERP data and use different fundamental statistics. Yet, no direct comparison of their output has been performed on the same dataset. Given the current push to transition to robust statistics to increase the replicability of results, here we compared the output of these toolboxes on data previously analyzed using classic approaches (Itier & Neath-Tavares, 2017). The early (0-352 ms) processing of fearful, happy, and neutral faces was investigated under three tasks in a within-subject design that also controlled gaze fixation location. Both toolboxes revealed main effects of emotion and task but neither yielded an interaction between the two, confirming that the early processing of fearful and happy expressions is largely independent of task demands. Both toolboxes found virtually no difference between neutral and happy expressions, while fearful (compared to neutral and happy) expressions modulated the N170 and EPN but elicited maximum effects after the N170 peak, around 190 ms. Similarities and differences in the spatial and temporal extent of these effects are discussed in comparison to the published classical analysis and the rest of the ERP literature.
13. Liu D, Wang Y, Lu F, Shu D, Zhang J, Zhu C, Luo W. Emotional valence modulates arithmetic strategy execution in priming paradigm: an event-related potential study. Exp Brain Res 2021; 239:1151-1163. [PMID: 33555381 DOI: 10.1007/s00221-021-06048-1]
Abstract
Using a priming paradigm, the present study explored the influence of emotion (anger, fear, happiness, and neutral) on the execution of multiplication estimation. Participants were asked to complete a two-digit multiplication estimation task using the down-up strategy (e.g., computing 20 × 80 = 1600 for 24 × 79). Behavioral results showed that reaction times for the multiplication estimation task were shorter under happy conditions than under angry and fearful conditions, and shorter under neutral than under fearful conditions. The ERP results showed that about 100 ms after task onset, strategy execution in the context of happiness (vs. neutral) elicited smaller P1 amplitudes; about 170 ms after task onset, the N170 amplitudes elicited by strategy execution did not differ significantly across emotional priming conditions. These results indicate that emotional priming dynamically unfolds over time as participants use a specified strategy to complete the multiplication estimation task. The present study revealed that emotional valence modulates arithmetic strategy execution, suggesting the role of different emotions should be fully considered in similar studies.
Affiliation(s)
- Dianzhi Liu: Department of Psychology, School of Education, Soochow University, Suzhou, 215123, China.
- Yun Wang: School of Foreign Languages, Suzhou University of Science and Technology, Suzhou, 215009, China.
- Feng Lu: Department of Psychology, School of Education, Soochow University, Suzhou, 215123, China.
- Deming Shu: Department of Psychology, School of Education, Soochow University, Suzhou, 215123, China.
- Jianxin Zhang: School of Humanities, Jiangnan University, Wuxi, 214122, Jiangsu, China.
- Chuanlin Zhu: School of Educational Science, Yangzhou University, Yangzhou, 225002, China.
- Wenbo Luo: Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China.
14. Schindler S, Bruchmann M, Gathmann B, Moeck R, Straube T. Effects of low-level visual information and perceptual load on P1 and N170 responses to emotional expressions. Cortex 2020; 136:14-27. [PMID: 33450599 DOI: 10.1016/j.cortex.2020.12.011]
Abstract
Emotional facial expressions lead to modulations of early event-related potentials (ERPs). However, it has so far remained unclear to what extent these modulations represent face-specific effects rather than differences in low-level visual features, and to what extent they depend on available processing resources. To examine these questions, we conducted two preregistered independent experiments (N = 40 in each experiment) using different variants of a novel task that manipulates peripheral perceptual load across levels but keeps overall visual stimulation constant. At the display center, we presented task-irrelevant angry, neutral, and happy faces and their Fourier phase-scrambled versions, which preserved low-level visual features. The results of both studies showed load-independent P1 and N170 emotional expression effects. Importantly, by using Bayesian analyses we could confirm that these facial expression effects were face-independent for the P1 but not for the N170 component. We conclude that, firstly, ERP modulations during the P1 interval strongly depend on low-level visual information, while the N170 modulation requires the processing of figural facial expression features. Secondly, both P1 and N170 modulations appear to be immune to a large range of variations in perceptual load.
Affiliation(s)
- Sebastian Schindler: Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Germany.
- Maximilian Bruchmann: Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Germany.
- Bettina Gathmann: Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Germany.
- Robert Moeck: Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Germany.
- Thomas Straube: Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Germany.
15. Feeling through another's eyes: Perceived gaze direction impacts ERP and behavioural measures of positive and negative affective empathy. Neuroimage 2020; 226:117605. [PMID: 33271267 DOI: 10.1016/j.neuroimage.2020.117605]
Abstract
Looking at the eyes informs us about the thoughts and emotions of those around us, and impacts our own emotional state. However, it is unknown how perceiving direct and averted gaze impacts our ability to share the gazer's positive and negative emotions, abilities referred to as positive and negative affective empathy. We presented 44 participants with contextual sentences describing positive, negative and neutral events happening to other people (e.g. "Her newborn was saved/killed/fed yesterday afternoon."). These were designed to elicit positive, negative, or little to no empathy, and were followed by direct or averted gaze images of the individuals described. Participants rated their affective empathy for the individual and their own emotional valence on each trial. Event-related potentials time-locked to face-onset and associated with empathy and emotional processing were recorded to investigate whether they were modulated by gaze direction. Relative to averted gaze, direct gaze was associated with increased positive valence in the positive and neutral conditions and with increased positive empathy ratings. A similar pattern was found at the neural level, using robust mass-univariate statistics. The N100, thought to reflect an automatic activation of emotion areas, was modulated by gaze in the affective empathy conditions, with opposite effect directions in the positive and negative conditions. The P200, an ERP component sensitive to positive stimuli, was modulated by gaze direction only in the positive empathy condition. Positive and negative trials were processed similarly at the early N200 processing stage, but later diverged, with only negative trials modulating the EPN, P300 and LPP components. These results suggest that positive and negative affective empathy are associated with distinct time-courses, and that perceived gaze direction uniquely modulates positive empathy, highlighting the importance of studying empathy with face stimuli.
16. Schindler S, Bruchmann M, Steinweg AL, Moeck R, Straube T. Attentional conditions differentially affect early, intermediate and late neural responses to fearful and neutral faces. Soc Cogn Affect Neurosci 2020; 15:765-774. [PMID: 32701163 PMCID: PMC7511883 DOI: 10.1093/scan/nsaa098]
Abstract
The processing of fearful facial expressions is prioritized by the human brain. This priority is maintained across various information processing stages as evident in early, intermediate and late components of event-related potentials (ERPs). However, emotional modulations are inconsistently reported for these different processing stages. In this pre-registered study, we investigated how feature-based attention differentially affects ERPs to fearful and neutral faces in 40 participants. The tasks required the participants to discriminate either the orientation of lines overlaid onto the face, the sex of the face or the face's emotional expression, increasing attention to emotion-related features. We found main effects of emotion for the N170, early posterior negativity (EPN) and late positive potential (LPP). While N170 emotional modulations were task-independent, interactions of emotion and task were observed for the EPN and LPP. While EPN emotion effects were found in the sex and emotion tasks, the LPP emotion effect was mainly driven by the emotion task. This study shows that early responses to fearful faces are task-independent (N170) and likely based on low-level and configural information, while during later processing stages attention to the face (EPN) or, more specifically, to the face's emotional expression (LPP) is crucial for reliably amplified processing of emotional faces.
Collapse
Affiliation(s)
- Sebastian Schindler
- Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Münster D-48149, Germany
- Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster D-48149, Germany
- Maximilian Bruchmann
- Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Münster D-48149, Germany
- Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster D-48149, Germany
- Anna-Lena Steinweg
- Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Münster D-48149, Germany
- Robert Moeck
- Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Münster D-48149, Germany
- Thomas Straube
- Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Münster D-48149, Germany
- Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster D-48149, Germany
17
Schindler S, Bublatzky F. Attention and emotion: An integrative review of emotional face processing as a function of attention. Cortex 2020; 130:362-386. [DOI: 10.1016/j.cortex.2020.06.010]
18
Time-dependent effects of perceptual load on processing fearful and neutral faces. Neuropsychologia 2020; 146:107529. [DOI: 10.1016/j.neuropsychologia.2020.107529]
19
Wilcockson TD, Burns EJ, Xia B, Tree J, Crawford TJ. Atypically heterogeneous vertical first fixations to faces in a case series of people with developmental prosopagnosia. Visual Cognition 2020. [DOI: 10.1080/13506285.2020.1797968]
Affiliation(s)
- Thomas D.W. Wilcockson
- School of Sport, Exercise and Health Sciences, Loughborough University, Loughborough, UK
- Psychology Department, Lancaster University, Lancaster, UK
- Baiqiang Xia
- Faculty of Information Technology and Electrical Engineering, University of Oulu, Finland
- Jeremy Tree
- Psychology Department, Swansea University, Swansea, UK
20
Rösler L, Rubo M, Gamer M. Artificial Faces Predict Gaze Allocation in Complex Dynamic Scenes. Front Psychol 2020; 10:2877. [PMID: 31920893] [PMCID: PMC6930810] [DOI: 10.3389/fpsyg.2019.02877]
Abstract
Both low-level physical saliency and social information, as conveyed by human heads or bodies, are known to drive gaze behavior in free-viewing tasks. Researchers have previously used a great variety of face stimuli, ranging from photographs of real humans to schematic faces, frequently without systematically differentiating between the two. In the current study, we used a Generalized Linear Mixed Model (GLMM) approach to investigate to what extent schematic artificial faces can predict gaze when they are presented alone or in competition with real human faces. GLMMs indicated substantial effects of both real and artificial faces in all conditions, but relative differences in predictive power emerged: artificial faces were less predictive than real human faces, yet still contributed significantly to gaze allocation. These results help to further our understanding of how social information guides gaze in complex naturalistic scenes.
Affiliation(s)
- Lara Rösler
- Department of Psychology, Julius-Maximilians-Universität Würzburg, Würzburg, Germany
- Marius Rubo
- Department of Psychology, Julius-Maximilians-Universität Würzburg, Würzburg, Germany
- Matthias Gamer
- Department of Psychology, Julius-Maximilians-Universität Würzburg, Würzburg, Germany
21
From eye to face: The impact of face outline, feature number, and feature saliency on the early neural response to faces. Brain Res 2019; 1722:146343. [PMID: 31336099] [DOI: 10.1016/j.brainres.2019.146343]
Abstract
The LIFTED model of early face perception postulates that the face-sensitive N170 event-related potential may reflect underlying neural inhibition mechanisms which serve to regulate holistic and featural processing. It remains unclear, however, what specific factors impact these neural inhibition processes. Here, N170 peak responses were recorded whilst adults maintained fixation on a single eye using a gaze-contingent paradigm, and the presence/absence of a face outline, as well as the number and type of parafoveal features within the outline, were manipulated. N170 amplitudes and latencies were reduced when a single eye was fixated within a face outline compared to fixation on the same eye in isolation, demonstrating that the simple presence of a face outline is sufficient to elicit a shift towards a more face-like neural response. A monotonic decrease in the N170 amplitude and latency was observed with increasing numbers of parafoveal features, and the type of feature(s) present in parafovea further modulated this early face response. These results support the idea of neural inhibition exerted by parafoveal features onto the foveated feature as a function of the number, and possibly the nature, of parafoveal features. Specifically, the results suggest the use of a feature saliency framework (eyes > mouth > nose) at the neural level, such that the parafoveal eye may play a role in down-regulating the response to the other eye (in fovea) more so than the nose or the mouth. These results confirm the importance of parafoveal features and the face outline in the neural inhibition mechanism, and provide further support for a feature saliency mechanism guiding early face perception.
22
McCrackin SD, Itier RJ. Perceived Gaze Direction Differentially Affects Discrimination of Facial Emotion, Attention, and Gender - An ERP Study. Front Neurosci 2019; 13:517. [PMID: 31178686] [PMCID: PMC6543003] [DOI: 10.3389/fnins.2019.00517]
Abstract
The perception of eye-gaze is thought to be a key component of our everyday social interactions. While the neural correlates of direct and averted gaze processing have been investigated, there is little consensus about how these gaze directions may be processed differently as a function of the task being performed. In a within-subject design, we examined how perception of direct and averted gaze affected performance on tasks requiring participants to use directly available facial cues to infer the individuals' emotional state (emotion discrimination), direction of attention (attention discrimination) and gender (gender discrimination). Neural activity was recorded throughout the three tasks using EEG, and ERPs time-locked to face onset were analyzed. Participants were most accurate at discriminating emotions with direct gaze faces, but most accurate at discriminating attention with averted gaze faces, while gender discrimination was not affected by gaze direction. At the neural level, direct and averted gaze elicited different patterns of activation depending on the task over frontal sites, from approximately 220-290 ms. More positive amplitudes were seen for direct than averted gaze in the emotion discrimination task. In contrast, more positive amplitudes were seen for averted gaze than for direct gaze in the gender discrimination task. These findings are among the first direct evidence that perceived gaze direction modulates neural activity differently depending on task demands, and that at the behavioral level, specific gaze directions functionally overlap with emotion and attention discrimination, precursors to more elaborated theory of mind processes.
Affiliation(s)
- Roxane J. Itier
- Department of Psychology, University of Waterloo, Waterloo, ON, Canada
23
Joint Modulation of Facial Expression Processing by Contextual Congruency and Task Demands. Brain Sci 2019; 9:brainsci9050116. [PMID: 31109022] [PMCID: PMC6562852] [DOI: 10.3390/brainsci9050116]
Abstract
Faces showing expressions of happiness or anger were presented together with sentences that described happiness-inducing or anger-inducing situations. Two main variables were manipulated: (i) congruency between contexts and expressions (congruent/incongruent) and (ii) the task assigned to the participant, discriminating the emotion shown by the target face (emotion task) or judging whether the expression shown by the face was congruent or not with the context (congruency task). Behavioral and electrophysiological results (event-related potentials (ERP)) showed that processing facial expressions was jointly influenced by congruency and task demands. ERP results revealed task effects at frontal sites, with larger positive amplitudes between 250–450 ms in the congruency task, reflecting the higher cognitive effort required by this task. Effects of congruency appeared at latencies and locations corresponding to the early posterior negativity (EPN) and late positive potential (LPP) components that have previously been found to be sensitive to emotion and affective congruency. The magnitude and spatial distribution of the congruency effects varied depending on the task and the target expression. These results are discussed in terms of the modulatory role of context on facial expression processing and the different mechanisms underlying the processing of expressions of positive and negative emotions.