1. Siklos-Whillans J, Itier RJ. Effects of Inversion and Fixation Location on the Processing of Face and House Stimuli - A Mass Univariate Analysis. Brain Topogr 2024; 37:972-992. [PMID: 39042323] [DOI: 10.1007/s10548-024-01068-w]
Abstract
Most event-related potential (ERP) studies investigating the time course of visual processing have focused mainly on the N170 component. Stimulus orientation affects the N170 amplitude for faces but not for objects, a finding interpreted as reflecting holistic/configural processing for faces and featural processing for objects. Furthermore, while recent studies suggest that where on the face people fixate impacts the N170, fixation location effects have not been investigated for objects. A data-driven mass univariate analysis (all time points and electrodes) was used to investigate the time course of inversion and fixation location effects on the neural processing of faces and houses. Strong and widespread orientation effects were found for both faces and houses from 100-350 ms post-stimulus onset, spanning the P1 and N170 components and later time points, a finding arguing against the view that houses are not processed holistically. While no clear fixation effect was found for houses, fixation location strongly impacted face processing early, reflecting retinotopic mapping around the C2 and P1 components, and during the N170-P2 interval. Face inversion effects were also largest for nasion fixation around 120 ms. The results support the view that facial feature integration (1) depends on which feature is being fixated and where the other features are situated in the visual field, (2) occurs maximally during the P1-N170 interval when fixation is on the nasion, and (3) continues past 200 ms, suggesting that the N170 peak, where weak effects were found, might be an inflexion point between processes rather than the end of feature integration into a whole.
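As a concrete illustration of the mass-univariate approach described in this abstract (a statistical test at every electrode and time point, corrected for multiple comparisons), here is a minimal sketch on simulated data. The array sizes, the injected effect, and the use of a Benjamini-Hochberg FDR correction are assumptions made for the example, not the authors' actual pipeline.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated ERPs: 20 subjects x 64 electrodes x 300 time points, two
# conditions (upright vs. inverted); sizes are illustrative only.
n_sub, n_elec, n_time = 20, 64, 300
upright = rng.normal(0.0, 1.0, (n_sub, n_elec, n_time))
inverted = rng.normal(0.0, 1.0, (n_sub, n_elec, n_time))
inverted[:, :10, 100:150] += 2.0  # inject an "inversion effect"

# Mass-univariate step: a paired t-test at every electrode/time point
t, p = stats.ttest_rel(upright, inverted, axis=0)

# Benjamini-Hochberg FDR correction across all 64 x 300 tests
p_flat = p.ravel()
order = np.argsort(p_flat)
m = p_flat.size
bh_thresh = np.arange(1, m + 1) / m * 0.05
passed = p_flat[order] <= bh_thresh
k = np.nonzero(passed)[0].max() + 1 if passed.any() else 0
sig = np.zeros(m, dtype=bool)
sig[order[:k]] = True
sig = sig.reshape(p.shape)
print(f"{sig.sum()} significant electrode/time samples")
```

With the injected effect, the significant samples cluster in the simulated effect window, mimicking the data-driven detection of effect topographies and latencies.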
Affiliation(s)
- James Siklos-Whillans
- Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada
- Roxane J Itier
- Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada
2. Del Bianco T, Lai MC, Mason L, Johnson MH, Charman T, Loth E, Banaschewski T, Buitelaar J, Murphy DGM, Jones EJH. Sex differences in social brain neural responses in autism: temporal profiles of configural face-processing within data-driven time windows. Sci Rep 2024; 14:14038. [PMID: 38890406] [PMCID: PMC11189412] [DOI: 10.1038/s41598-024-64387-9]
Abstract
Face-processing timing differences may underlie differences in visual social attention between autistic and non-autistic people, and between males and females. This study investigates the timing of the effects of neurotype and sex on face processing, and their dependence on age. We analysed EEG data recorded while 492 participants from the Longitudinal European Autism Project viewed upright and inverted photographs of faces (141 neurotypical males, 76 neurotypical females, 202 autistic males, 73 autistic females; age 6-30 years). We detected the timing of sex/diagnosis effects on event-related potential amplitudes at the posterior-temporal channel P8 with bootstrapped cluster-based permutation analysis and conducted growth curve analysis (GCA) to investigate the time course of the neural signals and its dependence on age. The periods of influence of neurotype and sex overlapped but differed in onset (260 and 310 ms post-stimulus, respectively), with sex effects lasting longer. GCA revealed a smaller and later amplitude peak in autistic female children compared to non-autistic female children; this difference decreased in adolescence and was not significant in adulthood. No age-dependent neurotype difference was significant in males. These findings indicate that sex and neurotype influence longer-latency face processing and implicate cognitive rather than perceptual processing. Sex may have more overarching effects than neurotype on configural face processing.
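The cluster-based permutation logic used above to define data-driven time windows can be sketched as follows. This toy example uses one channel and two groups; the group sizes, the cluster-forming threshold (|t| > 2), and 500 permutations are illustrative choices, not the study's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated single-channel ERPs for two groups, with a group difference
# injected in a mid-latency window (numbers are illustrative).
n_a, n_b, n_time = 25, 25, 200
a = rng.normal(0, 1, (n_a, n_time))
b = rng.normal(0, 1, (n_b, n_time))
b[:, 80:120] += 1.2

def t_vals(x, y):
    # Welch-style two-sample t statistic at each time point
    return (x.mean(0) - y.mean(0)) / np.sqrt(
        x.var(0, ddof=1) / len(x) + y.var(0, ddof=1) / len(y))

def max_cluster_mass(t, thresh=2.0):
    # largest summed |t| over a contiguous supra-threshold run
    best = run = 0.0
    for v in np.abs(t):
        run = run + v if v > thresh else 0.0
        best = max(best, run)
    return best

observed = max_cluster_mass(t_vals(a, b))

# Permutation null distribution: shuffle group labels, keep the maximum
# cluster mass of each shuffle (this controls for multiple comparisons)
pooled = np.vstack([a, b])
null = np.empty(500)
for i in range(500):
    idx = rng.permutation(len(pooled))
    null[i] = max_cluster_mass(t_vals(pooled[idx[:n_a]], pooled[idx[n_a:]]))

p_cluster = (np.sum(null >= observed) + 1) / (len(null) + 1)
print(f"observed cluster mass {observed:.1f}, p = {p_cluster:.3f}")
```

Because only the maximum cluster mass per shuffle enters the null distribution, any cluster exceeding it is significant without a point-by-point correction — the basis for the "data-driven time windows" in the title.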
Affiliation(s)
- Teresa Del Bianco
- Centre for Brain and Cognitive Development, Henry Wellcome Building, Birkbeck University of London, Malet Street, London, WC1E 7HX, UK.
- School of Social Sciences and Professions, London Metropolitan University, London, UK.
- Meng-Chuan Lai
- Margaret and Wallace McCain Centre for Child, Youth & Family Mental Health and Azrieli Adult Neurodevelopmental Centre, Campbell Family Mental Health Research Institute, Centre for Addiction and Mental Health, Toronto, ON, Canada
- Department of Psychiatry, Temerty Faculty of Medicine, University of Toronto, Toronto, ON, Canada
- Department of Psychiatry, The Hospital for Sick Children, Toronto, ON, Canada
- Autism Research Centre, Department of Psychiatry, University of Cambridge, Cambridge, UK
- Department of Psychiatry, National Taiwan University Hospital and College of Medicine, Taipei, Taiwan
- Luke Mason
- Centre for Brain and Cognitive Development, Henry Wellcome Building, Birkbeck University of London, Malet Street, London, WC1E 7HX, UK
- Institute of Psychiatry, Psychology & Neuroscience, King's College London, London, UK
- Mark H Johnson
- Department of Psychology, University of Cambridge, Cambridge, UK
- Tony Charman
- Institute of Psychiatry, Psychology & Neuroscience, King's College London, London, UK
- Eva Loth
- Institute of Psychiatry, Psychology & Neuroscience, King's College London, London, UK
- Jan Buitelaar
- Donders Center of Medical Neurosciences, Radboud University, Nijmegen, The Netherlands
- Declan G M Murphy
- Institute of Psychiatry, Psychology & Neuroscience, King's College London, London, UK
- Emily J H Jones
- Centre for Brain and Cognitive Development, Henry Wellcome Building, Birkbeck University of London, Malet Street, London, WC1E 7HX, UK
3. Itier RJ, Durston AJ. Mass-univariate analysis of scalp ERPs reveals large effects of gaze fixation location during face processing that only weakly interact with face emotional expression. Sci Rep 2023; 13:17022. [PMID: 37813928] [PMCID: PMC10562468] [DOI: 10.1038/s41598-023-44355-5]
Abstract
Decoding others' facial expressions is critical for social functioning. To clarify the neural correlates of expression perception depending on where we look on the face, three combined gaze-contingent ERP experiments were analyzed using robust mass-univariate statistics. Regardless of task, fixation location impacted face processing from 50 to 350 ms, maximally around 120 ms, reflecting retinotopic mapping around the C2 and P1 components. Fixation location also had a major impact on the N170-P2 interval, while only weak effects were seen at the face-sensitive N170 peak. The results question the widespread assumption that faces are processed holistically into an indecomposable perceptual whole around the N170. Rather, face processing is a complex and view-dependent process that continues well beyond the N170. Expression and fixation location interacted weakly during the P1-N170 interval, supporting a role for the mouth and left eye in decoding fearful and happy expressions. Expression effects were weakest at the N170 peak but strongest around the P2, especially for fear, reflecting task-independent affective processing. The results suggest that the N170 reflects a transition between processes rather than the maximum of a holistic face-processing stage. Focus on this peak should be replaced by data-driven analyses of the whole epoch using robust statistics to fully unravel the early visual processing of faces and their affective content.
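"Robust mass-univariate statistics" of the kind referred to here typically replace means with trimmed means and build percentile-bootstrap confidence intervals at each electrode/time sample. A single-sample sketch on simulated paired differences (the 20% trim and 2000 bootstrap resamples are conventional choices, not necessarily the study's):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Simulated paired differences (e.g. fixation A minus fixation B) at one
# electrode/time sample, with a few outlier subjects mixed in.
d = rng.normal(0.6, 1.0, 30)
d[:3] += rng.normal(0, 8, 3)       # heavy-tailed contamination

tm = stats.trim_mean(d, 0.2)       # 20% trimmed mean resists the outliers

# Percentile-bootstrap 95% CI for the trimmed mean
boot = np.array([
    stats.trim_mean(rng.choice(d, size=d.size, replace=True), 0.2)
    for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"trimmed mean = {tm:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

In a full mass-univariate pipeline this robust test is repeated at every electrode and time point, with a multiple-comparisons correction applied across the whole epoch.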
Affiliation(s)
- Roxane J Itier
- Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada.
- Amie J Durston
- Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada
4. Impact of face outline, parafoveal feature number and feature type on early face perception in a gaze-contingent paradigm: A mass-univariate re-analysis of ERP data. NeuroImage: Reports 2022. [DOI: 10.1016/j.ynirp.2022.100148]
5. Hudson A, Durston AJ, McCrackin SD, Itier RJ. Emotion, Gender and Gaze Discrimination Tasks do not Differentially Impact the Neural Processing of Angry or Happy Facial Expressions - a Mass Univariate ERP Analysis. Brain Topogr 2021; 34:813-833. [PMID: 34596796] [DOI: 10.1007/s10548-021-00873-x]
Abstract
Facial expression processing is a critical component of social cognition, yet whether it is influenced by task demands at the neural level remains controversial. Past ERP studies have found mixed results with classic statistical analyses, which are known to increase both Type I and Type II errors and which mass univariate statistics (MUS) control better. However, open-access MUS toolboxes can rely on different underlying statistics, which may lead to inconsistent results. Here, we compared the output of two MUS toolboxes, LIMO and FMUT, on the same data recorded during the processing of angry and happy facial expressions investigated under three tasks in a within-subjects design. Both toolboxes revealed main effects of emotion during the N170 time window and main effects of task during later time points typically associated with the LPP component. Neither toolbox yielded an interaction between the two factors at the group level, nor at the individual level in LIMO, confirming that the neural processing of these two facial expressions is largely independent of task demands. Behavioural data revealed main effects of task on reaction time and accuracy, but no influence of expression and no interaction between the two. Expression processing and task demands are discussed in the context of the consistencies and discrepancies between the two toolboxes and the existing literature.
Affiliation(s)
- Anna Hudson
- Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada
- Amie J Durston
- Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada
- Sarah D McCrackin
- Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada
- Roxane J Itier
- Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1, Canada.
6. McCrackin SD, Itier RJ. I can see it in your eyes: Perceived gaze direction impacts ERP and behavioural measures of affective theory of mind. Cortex 2021; 143:205-222. [PMID: 34455372] [DOI: 10.1016/j.cortex.2021.05.024]
Abstract
Looking at someone's eyes is thought to be important for affective theory of mind (aTOM), our ability to infer another person's emotional state. However, it is unknown whether an individual's gaze direction influences our aTOM judgements and what the time course of this influence might be. We presented participants with sentences describing individuals in positive, negative or neutral scenarios, followed by direct- or averted-gaze neutral face pictures of those individuals. Participants made aTOM judgements about each person's mental state, including their affective valence and arousal, and we investigated whether face gaze direction impacted those judgements. Participants rated gazers as feeling more positive when they displayed direct gaze as opposed to averted gaze, and as feeling more aroused during negative contexts when gaze was averted as opposed to direct. Event-related potentials associated with face perception and affective processing were examined using mass-univariate analyses to track the time course of this interaction between eye gaze and affective processing at the neural level. Both positive and negative trials were differentiated from neutral trials at many stages of processing. This included the early N200 and EPN components, believed to reflect automatic activation of emotion areas and attentional selection, respectively. It also included the later P300 and LPP components, thought to reflect elaborative cognitive appraisal of emotional content. Critically, sentence valence and gaze direction interacted over these later components, which may reflect the incorporation of eye gaze into the cognitive evaluation of another's emotional state. The results suggest that gaze perception directly impacts aTOM processes, and that altered eye-gaze processing in clinical populations may contribute to associated aTOM impairments.
Affiliation(s)
| | - Roxane J Itier
- Department of Psychology, University of Waterloo, Waterloo, Canada.
7. Alzueta E, Kessel D, Capilla A. The upside-down self: One's own face recognition is affected by inversion. Psychophysiology 2021; 58:e13919. [PMID: 34383323] [DOI: 10.1111/psyp.13919]
Abstract
One's own face is recognized more efficiently than any other face, although the neural mechanisms underlying this phenomenon remain poorly understood. Considering the extensive visual experience we have with our own face, some authors have proposed that self-face recognition involves a more analytical perceptual strategy (i.e., based on face features) than recognition of other familiar faces, which are commonly processed holistically (i.e., as a whole). However, this hypothesis had not yet been tested with brain activity data. In the present study, we employed an inversion paradigm combined with event-related potential (ERP) recordings to investigate whether the self-face is processed more analytically. Sixteen healthy participants were asked to identify their own face and a familiar face regardless of orientation (upright or inverted). ERP analysis revealed an enhanced amplitude and a delayed latency of the N170 component when faces were presented in an inverted orientation. Critically, the self-face and the familiar face were equally vulnerable to the inversion effect, suggesting that the self-face is not processed more analytically than a familiar face. In addition, we replicated the recent finding that the attention-related P200 component is a specific neural index of self-face recognition. Overall, our results suggest that the advantage for self-face processing might be better explained by the engagement of self-related attentional mechanisms than by the use of a more analytical visuoperceptual strategy.
Affiliation(s)
- Elisabet Alzueta
- Departamento de Psicología Biológica y de la Salud, Universidad Autónoma de Madrid, Madrid, Spain; Center for Health Sciences, SRI International, Menlo Park, California, USA
- Dominique Kessel
- Departamento de Psicología Biológica y de la Salud, Universidad Autónoma de Madrid, Madrid, Spain
- Almudena Capilla
- Departamento de Psicología Biológica y de la Salud, Universidad Autónoma de Madrid, Madrid, Spain
8. Wehrman J, Sorensen S, De Lissa P, Badcock N. EPOC outside the shield: comparing the performance of a consumer-grade EEG device in shielded and unshielded environments. Biomed Phys Eng Express 2021; 7. [PMID: 33482647] [DOI: 10.1088/2057-1976/abdf37]
Abstract
Low-cost, portable electroencephalography (EEG) devices have become commercially available in the last 10 years. One such system, Emotiv's EPOC, has been modified to allow event-related potential (ERP) research. Although the EPOC has been shown to provide data comparable to research-grade equipment and has been used in real-world settings, how it performs without the electrical shielding commonly used in research-grade laboratories had yet to be systematically tested. In the current article we address this gap by conducting a simple EEG experiment in shielded and unshielded contexts. Participants (n = 13, mean age = 23.2 years, SD = 7.9) monitored the presentation of human versus wristwatch faces, responding as to whether the images were inverted or not. This method elicited the face-sensitive N170 ERP. In both shielded and unshielded contexts, the N170 amplitude was larger when participants viewed human faces, and it peaked later when a human face was inverted. More importantly, Bayesian analysis showed no difference between the N170 measured in the shielded and unshielded contexts. Further, the signals recorded in the two contexts were highly correlated. The EPOC thus appears to reliably record EEG signals without a purpose-built electrically shielded room.
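The two analyses mentioned (a correlation between recordings and a Bayesian test of "no difference") can be sketched as follows. The waveform shapes, subject count, and the BIC-based Bayes factor approximation are assumptions made for illustration; the article's own Bayesian method may differ.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated grand-average waveforms: the "unshielded" recording is the
# "shielded" one plus extra noise (illustrative numbers only).
t = np.linspace(0, 0.4, 200)                                    # seconds
shielded = -4.0 * np.exp(-((t - 0.17) ** 2) / (2 * 0.02 ** 2))  # N170-like dip
unshielded = shielded + rng.normal(0, 0.2, t.size)

# 1) Similarity of the two recordings: Pearson correlation
r = np.corrcoef(shielded, unshielded)[0, 1]

# 2) Evidence for "no mean difference": BIC-approximation Bayes factor
def bf01_bic(d):
    # BF01 ~ exp((BIC_alt - BIC_null) / 2) for a one-sample test on d
    n = d.size
    ss0 = np.sum(d ** 2)               # null model: mean fixed at 0
    ss1 = np.sum((d - d.mean()) ** 2)  # alternative: free mean
    return np.exp((n * np.log(ss1 / ss0) + np.log(n)) / 2.0)

diffs = rng.normal(0, 0.5, 13)         # 13 subjects, no true difference
print(f"r = {r:.3f}, BF01 = {bf01_bic(diffs):.2f}")
```

A BF01 above 1 favours the null (no shielded/unshielded difference), which is how "no difference" can be positively supported rather than merely not rejected.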
Affiliation(s)
- Jordan Wehrman
- Department of Cognitive Science, Macquarie University, Sydney, New South Wales, Australia
- Sidsel Sorensen
- Macquarie University, Sydney, New South Wales, Australia
- Peter De Lissa
- University of Fribourg, Fribourg, Switzerland
- Nicholas Badcock
- University of Western Australia, Perth, Western Australia, Australia
9. Anzures G, Mildort M. Do perceptual expertise and implicit racial bias predict early face-sensitive ERP responses? Brain Cogn 2020; 147:105671. [PMID: 33360041] [DOI: 10.1016/j.bandc.2020.105671]
Abstract
Studies examining the visual perception of face race have revealed mixed findings regarding the presence and direction of effects on early face-sensitive event-related potential (ERP) components. Few studies have examined how these early components are influenced by individual differences in the bottom-up and top-down processes involved in face perception, and how such factors might interact has yet to be investigated. Thus, the current study examined whether P100, N170, and P200 responses can be predicted by individual differences in own- and other-race face recognition, implicit racial bias, and their interaction. Race effects were observed in the P100, N170, and P200 responses. Other-race face recognition, implicit racial bias, and their interaction explained a significant amount of unique variability in N170 responses when viewing other-race faces. Responses to own-race faces were minimally influenced, with only implicit racial bias predicting a significant amount of unique variability in N170 latency when viewing own-race faces. Neither face recognition, implicit racial bias, nor their interaction predicted P100 responses. The current findings suggest that face recognition ability and its interaction with implicit racial bias modulate the early stages of other-race face processing.
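The kind of regression described here (predicting an ERP measure from two individual-difference scores and their interaction) can be sketched with ordinary least squares. All variable names and effect sizes below are made up for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated per-participant scores (names are hypothetical): other-race
# recognition accuracy, implicit bias, and an N170 amplitude they predict.
n = 60
recognition = rng.normal(0, 1, n)
bias = rng.normal(0, 1, n)
n170 = (-2.0 + 0.8 * recognition + 0.5 * bias
        + 0.6 * recognition * bias + rng.normal(0, 0.5, n))

# Design matrix: intercept, main effects, and the interaction term
X = np.column_stack([np.ones(n), recognition, bias, recognition * bias])
beta, *_ = np.linalg.lstsq(X, n170, rcond=None)

pred = X @ beta
r2 = 1 - np.sum((n170 - pred) ** 2) / np.sum((n170 - n170.mean()) ** 2)
print("betas:", np.round(beta, 2), " R^2 =", round(r2, 2))
```

The interaction column (the product of the two centred predictors) is what lets the model test whether recognition ability and implicit bias jointly explain unique variance, as in the abstract.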
Affiliation(s)
- Gizelle Anzures
- Department of Psychology, Florida Atlantic University, Boca Raton, FL 33431, USA; FAU Brain Institute, Florida Atlantic University, Boca Raton, FL 33431, USA; Center for Complex Systems and Brain Sciences, Florida Atlantic University, Boca Raton, FL 33431, USA.
- Melissa Mildort
- Department of Psychology, Florida Atlantic University, Boca Raton, FL 33431, USA
10. Feeling through another's eyes: Perceived gaze direction impacts ERP and behavioural measures of positive and negative affective empathy. Neuroimage 2020; 226:117605. [PMID: 33271267] [DOI: 10.1016/j.neuroimage.2020.117605]
Abstract
Looking at the eyes informs us about the thoughts and emotions of those around us, and impacts our own emotional state. However, it is unknown how perceiving direct and averted gaze impacts our ability to share the gazer's positive and negative emotions, abilities referred to as positive and negative affective empathy. We presented 44 participants with contextual sentences describing positive, negative and neutral events happening to other people (e.g. "Her newborn was saved/killed/fed yesterday afternoon."). These were designed to elicit positive, negative, or little to no empathy, and were followed by direct or averted gaze images of the individuals described. Participants rated their affective empathy for the individual and their own emotional valence on each trial. Event-related potentials time-locked to face onset and associated with empathy and emotional processing were recorded to investigate whether they were modulated by gaze direction. Relative to averted gaze, direct gaze was associated with increased positive valence in the positive and neutral conditions and with increased positive empathy ratings. A similar pattern was found at the neural level, using robust mass-univariate statistics. The N100, thought to reflect an automatic activation of emotion areas, was modulated by gaze in the affective empathy conditions, with opposite effect directions in the positive and negative conditions. The P200, an ERP component sensitive to positive stimuli, was modulated by gaze direction only in the positive empathy condition. Positive and negative trials were processed similarly at the early N200 processing stage, but later diverged, with only negative trials modulating the EPN, P300 and LPP components. These results suggest that positive and negative affective empathy are associated with distinct time courses, and that perceived gaze direction uniquely modulates positive empathy, highlighting the importance of studying empathy with face stimuli.
11. Attention is prioritised for proximate and approaching fearful faces. Cortex 2020; 134:52-64. [PMID: 33249300] [DOI: 10.1016/j.cortex.2020.10.011]
Abstract
Attention is an important function that allows us to selectively enhance the processing of relevant stimuli in our environment. Fittingly, a number of studies have revealed that potentially threatening/fearful stimuli capture attention more efficiently. Interestingly, in separate fMRI studies, threatening stimuli situated close to viewers were found to enhance brain activity in fear-relevant areas more than stimuli that were further away. Despite these observations, few studies have examined the effect of personal distance on attentional capture by emotional stimuli. Using electroencephalography (EEG), the current investigation addressed this question by examining attentional capture by emotional faces that were either looming/receding or situated at different distances from the viewer. In Experiment 1, participants carried out an incidental task while looming or receding fearful and neutral faces were presented bilaterally. A significant lateralised N170 and N2pc were found for a looming upright fearful face, whereas no significant components were found for a looming upright neutral face or for inverted fearful and neutral faces. In Experiment 2, participants made gender judgements of emotional faces that appeared on a screen situated within or beyond peripersonal space (50 cm or 120 cm, respectively). Although response times did not differ, significantly more errors were made when faces appeared in near as opposed to far space. Importantly, ERPs revealed a significant N2pc for fearful faces presented at the peripersonal distance compared to the far distance. Our findings show that personal distance markedly affects neural responses to emotional stimuli, with increased attention towards fearful upright faces that appear at close distance.
12. de Lissa P, McArthur G, Hawelka S, Palermo R, Mahajan Y, Degno F, Hutzler F. Peripheral preview abolishes N170 face-sensitivity at fixation: Using fixation-related potentials to investigate dynamic face processing. Visual Cognition 2019. [DOI: 10.1080/13506285.2019.1676855]
Affiliation(s)
- Peter de Lissa
- iBMLab, Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Department of Cognitive Science, Macquarie University, Sydney, Australia
- Genevieve McArthur
- Department of Cognitive Science, Macquarie University, Sydney, Australia
- Stefan Hawelka
- Centre for Cognitive Neuroscience, Salzburg University, Salzburg, Austria
- Romina Palermo
- School of Psychological Science, University of Western Australia, Perth, Australia
- Yatin Mahajan
- The MARCS Institute, University of Western Sydney, Australia
- Federica Degno
- School of Psychology, University of Central Lancashire, Preston, UK
- Florian Hutzler
- Centre for Cognitive Neuroscience, Salzburg University, Salzburg, Austria
13. From eye to face: The impact of face outline, feature number, and feature saliency on the early neural response to faces. Brain Res 2019; 1722:146343. [PMID: 31336099] [DOI: 10.1016/j.brainres.2019.146343]
Abstract
The LIFTED model of early face perception postulates that the face-sensitive N170 event-related potential may reflect underlying neural inhibition mechanisms which serve to regulate holistic and featural processing. It remains unclear, however, what specific factors impact these neural inhibition processes. Here, N170 peak responses were recorded whilst adults maintained fixation on a single eye using a gaze-contingent paradigm, and the presence/absence of a face outline, as well as the number and type of parafoveal features within the outline, were manipulated. N170 amplitudes and latencies were reduced when a single eye was fixated within a face outline compared to fixation on the same eye in isolation, demonstrating that the simple presence of a face outline is sufficient to elicit a shift towards a more face-like neural response. A monotonic decrease in the N170 amplitude and latency was observed with increasing numbers of parafoveal features, and the type of feature(s) present in parafovea further modulated this early face response. These results support the idea of neural inhibition exerted by parafoveal features onto the foveated feature as a function of the number, and possibly the nature, of parafoveal features. Specifically, the results suggest the use of a feature saliency framework (eyes > mouth > nose) at the neural level, such that the parafoveal eye may play a role in down-regulating the response to the other eye (in fovea) more so than the nose or the mouth. These results confirm the importance of parafoveal features and the face outline in the neural inhibition mechanism, and provide further support for a feature saliency mechanism guiding early face perception.
14. Huber-Huber C, Buonocore A, Dimigen O, Hickey C, Melcher D. The peripheral preview effect with faces: Combined EEG and eye-tracking suggests multiple stages of trans-saccadic predictive and non-predictive processing. Neuroimage 2019; 200:344-362. [PMID: 31260837] [DOI: 10.1016/j.neuroimage.2019.06.059]
Abstract
The world appears stable despite saccadic eye-movements. One possible explanation for this phenomenon is that the visual system predicts upcoming input across saccadic eye-movements based on peripheral preview of the saccadic target. We tested this idea using concurrent electroencephalography (EEG) and eye-tracking. Participants made cued saccades to peripheral upright or inverted face stimuli that changed orientation (invalid preview) or maintained orientation (valid preview) while the saccade was completed. Experiment 1 demonstrated better discrimination performance and a reduced fixation-locked N170 component (fN170) with valid than with invalid preview, demonstrating integration of pre- and post-saccadic information. Moreover, the early fixation-related potentials (FRP) showed a preview face inversion effect suggesting that some pre-saccadic input was represented in the brain until around 170 ms post fixation-onset. Experiment 2 replicated Experiment 1 and manipulated the proportion of valid and invalid trials to test whether the preview effect reflects context-based prediction across trials. A whole-scalp Bayes factor analysis showed that this manipulation did not alter the fN170 preview effect but did influence the face inversion effect before the saccade. The pre-saccadic inversion effect declined earlier in the mostly invalid block than in the mostly valid block, which is consistent with the notion of pre-saccadic expectations. In addition, in both studies, we found strong evidence for an interaction between the pre-saccadic preview stimulus and the post-saccadic target as early as 50 ms (Experiment 2) or 90 ms (Experiment 1) into the new fixation. These findings suggest that visual stability may involve three temporal stages: prediction about the saccadic target, integration of pre-saccadic and post-saccadic information at around 50-90 ms post fixation onset, and post-saccadic facilitation of rapid categorization.
Affiliation(s)
- Christoph Huber-Huber
- Center for Mind/Brain Sciences (CIMeC), University of Trento, Corso Bettini 31, Rovereto, TN, 38068, Italy.
- Antimo Buonocore
- Werner Reichardt Centre for Integrative Neuroscience, Tuebingen University, Otfried-Müller-Straße 25, Tuebingen, 72076, Germany; Hertie Institute for Clinical Brain Research, Tuebingen University, Tuebingen, 72076, Germany
- Olaf Dimigen
- Department of Psychology, Humboldt-Universität zu Berlin, Unter Den Linden 6, 10099, Berlin, Germany
- Clayton Hickey
- Center for Mind/Brain Sciences (CIMeC), University of Trento, Corso Bettini 31, Rovereto, TN, 38068, Italy
- David Melcher
- Center for Mind/Brain Sciences (CIMeC), University of Trento, Corso Bettini 31, Rovereto, TN, 38068, Italy
15
McCrackin SD, Itier RJ. Perceived Gaze Direction Differentially Affects Discrimination of Facial Emotion, Attention, and Gender - An ERP Study. Front Neurosci 2019; 13:517. [PMID: 31178686 PMCID: PMC6543003 DOI: 10.3389/fnins.2019.00517] [Citation(s) in RCA: 21] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/30/2018] [Accepted: 05/06/2019] [Indexed: 12/16/2022] Open
Abstract
The perception of eye-gaze is thought to be a key component of our everyday social interactions. While the neural correlates of direct and averted gaze processing have been investigated, there is little consensus about how these gaze directions may be processed differently as a function of the task being performed. In a within-subject design, we examined how perception of direct and averted gaze affected performance on tasks requiring participants to use directly available facial cues to infer the individuals' emotional state (emotion discrimination), direction of attention (attention discrimination) and gender (gender discrimination). Neural activity was recorded throughout the three tasks using EEG, and ERPs time-locked to face onset were analyzed. Participants were most accurate at discriminating emotions with direct gaze faces, but most accurate at discriminating attention with averted gaze faces, while gender discrimination was not affected by gaze direction. At the neural level, direct and averted gaze elicited different patterns of activation depending on the task over frontal sites, from approximately 220-290 ms. More positive amplitudes were seen for direct than averted gaze in the emotion discrimination task. In contrast, more positive amplitudes were seen for averted gaze than for direct gaze in the gender discrimination task. These findings provide some of the first direct evidence that perceived gaze direction modulates neural activity differently depending on task demands, and that at the behavioral level, specific gaze directions functionally overlap with emotion and attention discrimination, precursors to more elaborate theory of mind processes.
Affiliation(s)
- Roxane J. Itier
- Department of Psychology, University of Waterloo, Waterloo, ON, Canada
16
Joint Modulation of Facial Expression Processing by Contextual Congruency and Task Demands. Brain Sci 2019; 9:brainsci9050116. [PMID: 31109022 PMCID: PMC6562852 DOI: 10.3390/brainsci9050116] [Citation(s) in RCA: 20] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/15/2019] [Revised: 05/10/2019] [Accepted: 05/15/2019] [Indexed: 11/16/2022] Open
Abstract
Faces showing expressions of happiness or anger were presented together with sentences that described happiness-inducing or anger-inducing situations. Two main variables were manipulated: (i) congruency between contexts and expressions (congruent/incongruent) and (ii) the task assigned to the participant, discriminating the emotion shown by the target face (emotion task) or judging whether the expression shown by the face was congruent or not with the context (congruency task). Behavioral and electrophysiological results (event-related potentials (ERP)) showed that processing facial expressions was jointly influenced by congruency and task demands. ERP results revealed task effects at frontal sites, with larger positive amplitudes between 250–450 ms in the congruency task, reflecting the higher cognitive effort required by this task. Effects of congruency appeared at latencies and locations corresponding to the early posterior negativity (EPN) and late positive potential (LPP) components that have previously been found to be sensitive to emotion and affective congruency. The magnitude and spatial distribution of the congruency effects varied depending on the task and the target expression. These results are discussed in terms of the modulatory role of context on facial expression processing and the different mechanisms underlying the processing of expressions of positive and negative emotions.
17
Stacchi L, Liu-Shuang J, Ramon M, Caldara R. Reliability of individual differences in neural face identity discrimination. Neuroimage 2019; 189:468-475. [DOI: 10.1016/j.neuroimage.2019.01.023] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/07/2018] [Revised: 12/19/2018] [Accepted: 01/09/2019] [Indexed: 11/27/2022] Open
18
Hashemi A, Pachai MV, Bennett PJ, Sekuler AB. The role of horizontal facial structure on the N170 and N250. Vision Res 2019; 157:12-23. [DOI: 10.1016/j.visres.2018.02.006] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/13/2017] [Revised: 02/01/2018] [Accepted: 02/03/2018] [Indexed: 10/17/2022]
19
Neural Representations of Faces Are Tuned to Eye Movements. J Neurosci 2019; 39:4113-4123. [PMID: 30867260 DOI: 10.1523/jneurosci.2968-18.2019] [Citation(s) in RCA: 17] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/22/2018] [Revised: 02/07/2019] [Accepted: 03/05/2019] [Indexed: 01/23/2023] Open
Abstract
Eye movements provide a functional signature of how human vision is achieved. Many recent studies have consistently reported robust idiosyncratic visual sampling strategies during face recognition. Whether these interindividual differences are mirrored by idiosyncratic neural responses remains unknown. To this end, we first tracked eye movements of male and female observers during face recognition. Additionally, for every observer we obtained an objective index of neural face discrimination through EEG that was recorded while they fixated different facial information. We found that foveation of facial features fixated longer during face recognition elicited stronger neural face discrimination responses across all observers. This relationship occurred independently of interindividual differences in preferential facial information sampling (e.g., eye vs. mouth lookers), and started as early as the first fixation. Our data show that eye movements play a functional role during face processing by providing the neural system with the information that is diagnostic to a specific observer. The effective processing of identity involves idiosyncratic, rather than universal, face representations.
SIGNIFICANCE STATEMENT: When engaging in face recognition, observers deploy idiosyncratic fixation patterns to sample facial information. Whether these individual differences concur with idiosyncratic face-sensitive neural responses remains unclear. To address this issue, we recorded observers' fixation patterns, as well as their neural face discrimination responses elicited during fixation of 10 different locations on the face, corresponding to different types of facial information. Our data reveal a clear interplay between individuals' face-sensitive neural responses and their idiosyncratic eye-movement patterns during identity processing, which emerges as early as the first fixation. Collectively, our findings favor the existence of idiosyncratic, rather than universal, face representations.
20
Dupuis-Roy N, Faghel-Soubeyrand S, Gosselin F. Time course of the use of chromatic and achromatic facial information for sex categorization. Vision Res 2018; 157:36-43. [PMID: 30201473 DOI: 10.1016/j.visres.2018.08.004] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/29/2018] [Revised: 07/29/2018] [Accepted: 08/29/2018] [Indexed: 11/27/2022]
Abstract
The most useful facial features for sex categorization are the eyes, the eyebrows, and the mouth. Dupuis-Roy et al. reported a large positive correlation between the use of the mouth region and rapid correct answers [Journal of Vision 9 (2009) 1-8]. Given the chromatic information in this region, they hypothesized that the extraction of chromatic and achromatic cues may have different time courses. Here, we tested this hypothesis directly: 110 participants categorized the sex of 300 face images whose chromatic and achromatic content was partially revealed through time (200 ms) and space using randomly located spatio-temporal Gaussian apertures (i.e., the Bubbles technique). This also allowed us to directly compare, for the first time, the relative importance of chromatic and achromatic facial cues for sex categorization. Results showed that face-sex categorization relies mostly on achromatic (luminance) information concentrated in the eye and eyebrow regions, especially the left eye and eyebrow. Additional analyses indicated that chromatic information located in the mouth/philtrum region was used earlier (peaking as early as 35 ms after stimulus onset) than achromatic information in the eye regions (peaking between 165 and 176 ms after stimulus onset), as was speculated by Dupuis-Roy et al. A non-linear analysis failed to support Yip and Sinha's proposal that processing of chromatic variations can improve subsequent processing of achromatic spatial cues, possibly via surface segmentation [Perception 31 (2002) 995-1003]. Instead, we argue that the brain prioritizes chromatic information to compensate for the sluggishness of chromatic processing in early visual areas, allowing chromatic and achromatic information to reach higher-level visual areas simultaneously.
Affiliation(s)
- N Dupuis-Roy
- Département de psychologie, Université de Montréal, Canada
- F Gosselin
- Département de psychologie, Université de Montréal, Canada.
21
One versus two eyes makes a difference! Early face perception is modulated by featural fixation and feature context. Cortex 2018; 109:35-49. [PMID: 30286305 DOI: 10.1016/j.cortex.2018.08.025] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/13/2017] [Revised: 07/11/2018] [Accepted: 08/30/2018] [Indexed: 11/23/2022]
Abstract
The N170 event-related potential component is an early marker of face perception that is particularly sensitive to isolated eye regions and to eye fixations within a face. Here, this eye sensitivity was tested further by measuring the N170 to isolated facial features and to the same features fixated within a face, using a gaze-contingent procedure. The neural response to single isolated eyes and eye regions (two eyes) was also compared. Pixel intensity and contrast were controlled at the global (image) and local (featural) levels. Consistent with previous findings, larger N170 amplitudes were elicited when the left or right eye was fixated within a face, compared to the mouth or nose, demonstrating that the N170 eye sensitivity reflects higher-order perceptual processes and not merely low-level perceptual effects. The N170 was also largest and most delayed for isolated features, compared to equivalent fixations within a face. Specifically, mouth fixation yielded the largest amplitude difference, and nose fixation yielded the largest latency difference between these two contexts, suggesting the N170 may reflect a complex interplay between holistic and featural processes. Critically, eye regions elicited consistently larger and shorter N170 responses compared to single eyes, with enhanced responses for contralateral eye content, irrespective of eye or nasion fixation. These results confirm the importance of the eyes in early face perception, and provide novel evidence of an increased sensitivity to the presence of two symmetric eyes compared to only one eye, consistent with a neural eye region detector rather than an eye detector per se.
22
Itier RJ, Preston F. Increased Early Sensitivity to Eyes in Mouthless Faces: In Support of the LIFTED Model of Early Face Processing. Brain Topogr 2018; 31:972-984. [PMID: 29987641 DOI: 10.1007/s10548-018-0663-6] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/28/2018] [Accepted: 07/06/2018] [Indexed: 10/28/2022]
Abstract
The N170 ERP component is a central neural marker of early face perception usually thought to reflect holistic processing. However, it is also highly sensitive to eyes presented in isolation and to fixation on the eyes within a full face. The lateral inhibition face template and eye detector (LIFTED) model (Nemrodov et al. in NeuroImage 97:81-94, 2014) integrates these views by proposing a neural inhibition mechanism that perceptually glues features into a whole, in parallel to the activity of an eye detector that accounts for the eye sensitivity. The LIFTED model was derived from a large number of results obtained with intact and eyeless faces presented upright and inverted. The present study provided a control condition to the original design by replacing eyeless with mouthless faces, thereby enabling testing of specific predictions derived from the model. Using the same gaze-contingent approach, we replicated the N170 eye sensitivity regardless of face orientation. Furthermore, when eyes were fixated in upright faces, the N170 was larger for mouthless compared to intact faces, while inverted mouthless faces elicited smaller amplitude than intact inverted faces when fixation was on the mouth and nose. The results are largely in line with the LIFTED model, in particular with the idea of an inhibition mechanism involved in holistic processing of upright faces and the lack of such inhibition in processing inverted faces. Some modifications to the original model are also proposed based on these results.
Affiliation(s)
- Roxane J Itier
- Department of Psychology, University of Waterloo, 200 University Ave W, Waterloo, ON, N2L 3G1, Canada.
- Frank Preston
- Department of Psychology, University of Waterloo, 200 University Ave W, Waterloo, ON, N2L 3G1, Canada
23
McCrackin SD, Itier RJ. Is it about me? Time-course of self-relevance and valence effects on the perception of neutral faces with direct and averted gaze. Biol Psychol 2018. [DOI: 10.1016/j.biopsycho.2018.03.003] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/03/2023]
24
Robinson JE, Breakspear M, Young AW, Johnston PJ. Dose-dependent modulation of the visually evoked N1/N170 by perceptual surprise: a clear demonstration of prediction-error signalling. Eur J Neurosci 2018; 52:4442-4452. [DOI: 10.1111/ejn.13920] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2017] [Revised: 03/15/2018] [Accepted: 03/20/2018] [Indexed: 11/29/2022]
Affiliation(s)
- Jonathan E. Robinson
- Queensland University of Technology, Victoria Park Road, Kelvin Grove, Qld 4059, Australia
- QIMR Berghofer Medical Research Institute Herston Qld Australia
- Patrick J. Johnston
- Queensland University of Technology, Victoria Park Road, Kelvin Grove, Qld 4059, Australia
- QIMR Berghofer Medical Research Institute Herston Qld Australia
25
Itier RJ, Neath-Tavares KN. Effects of task demands on the early neural processing of fearful and happy facial expressions. Brain Res 2017; 1663:38-50. [PMID: 28315309 PMCID: PMC5756067 DOI: 10.1016/j.brainres.2017.03.013] [Citation(s) in RCA: 43] [Impact Index Per Article: 6.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/15/2016] [Revised: 02/16/2017] [Accepted: 03/10/2017] [Indexed: 11/17/2022]
Abstract
Task demands shape how we process environmental stimuli, but their impact on the early neural processing of facial expressions remains unclear. In a within-subject design, ERPs were recorded to the same fearful, happy and neutral facial expressions presented during a gender discrimination task, an explicit emotion discrimination task and an oddball detection task, the three most studied tasks in the field. Using an eye tracker, fixation on the nose of the face was enforced using a gaze-contingent presentation. Task demands modulated amplitudes from 200 to 350 ms at occipito-temporal sites spanning the EPN component. Amplitudes were more negative for fearful than neutral expressions starting on the N170, from 150 to 350 ms, with a temporo-occipital distribution, whereas no clear effect of happy expressions was seen. Task and emotion effects never interacted in any time window or for the ERP components analyzed (P1, N170, EPN). Thus, whether emotion is explicitly discriminated or irrelevant for the task at hand, neural correlates of fearful and happy facial expressions seem immune to these task demands during the first 350 ms of visual processing.
26
Neath-Tavares KN, Itier RJ. Neural processing of fearful and happy facial expressions during emotion-relevant and emotion-irrelevant tasks: A fixation-to-feature approach. Biol Psychol 2016; 119:122-40. [PMID: 27430934 DOI: 10.1016/j.biopsycho.2016.07.013] [Citation(s) in RCA: 30] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/14/2015] [Revised: 07/12/2016] [Accepted: 07/12/2016] [Indexed: 10/21/2022]
Abstract
Research suggests an important role of the eyes and mouth for discriminating facial expressions of emotion. A gaze-contingent procedure was used to test the impact of fixation to facial features on the neural response to fearful, happy and neutral facial expressions in an emotion discrimination (Exp.1) and an oddball detection (Exp.2) task. The N170 was the only eye-sensitive ERP component, and this sensitivity did not vary across facial expressions. In both tasks, compared to neutral faces, responses to happy expressions were seen as early as 100-120ms occipitally, while responses to fearful expressions started around 150ms, on or after the N170, at both occipital and lateral-posterior sites. Analyses of scalp topographies revealed different distributions of these two emotion effects across most of the epoch. Emotion processing interacted with fixation location at different times between tasks. Results suggest a role of both the eyes and mouth in the neural processing of fearful expressions and of the mouth in the processing of happy expressions, before 350ms.
27
daSilva EB, Crager K, Geisler D, Newbern P, Orem B, Puce A. Something to sink your teeth into: The presence of teeth augments ERPs to mouth expressions. Neuroimage 2016; 127:227-241. [DOI: 10.1016/j.neuroimage.2015.12.020] [Citation(s) in RCA: 27] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/13/2015] [Revised: 12/11/2015] [Accepted: 12/12/2015] [Indexed: 01/11/2023] Open
28
Neath KN, Itier RJ. Fixation to features and neural processing of facial expressions in a gender discrimination task. Brain Cogn 2015; 99:97-111. [PMID: 26277653 DOI: 10.1016/j.bandc.2015.05.007] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/16/2014] [Revised: 05/06/2015] [Accepted: 05/19/2015] [Indexed: 11/27/2022]
Abstract
Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation to the eyes. We investigated whether this sensitivity varies with facial expressions of emotion and whether it can also be seen on other ERP components such as the P1 and EPN. Using eye-tracking to manipulate fixation on facial features, we found the N170 to be the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1 that likely reflected general sensitivity to face position. An early effect of emotion (∼120 ms) for happy faces was seen at occipital sites and was sustained until ∼350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms followed by a later effect appearing at ∼150 ms until ∼300 ms at lateral posterior sites. Results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times.
29
Simpson EA, Suomi SJ, Paukner A. Evolutionary relevance and experience contribute to face discrimination in infant macaques (Macaca mulatta). JOURNAL OF COGNITION AND DEVELOPMENT 2015; 17:285-299. [PMID: 27212893 DOI: 10.1080/15248372.2015.1048863] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/23/2022]
Abstract
In human children and adults, familiar face types (typically own-age and own-species faces) are discriminated better than other face types; however, human infants do not appear to exhibit an own-age bias, but instead better discriminate adult faces, which they see more often. There are two possible explanations for this pattern: perceptual attunement, which predicts advantages in discrimination for the most-experienced face types; additionally or alternatively, there may be an experience-independent bias for infants to discriminate own-species faces, an adaptation for evolutionarily relevant faces. These possibilities have not been disentangled in studies thus far, which did not control infants' early experiences with faces. In the present study, we tested these predictions in infant macaques (Macaca mulatta) reared under controlled environments and not exposed to adult conspecifics. We measured newborns' (15-25 days; n = 27) and 6- to 7-month-olds' (n = 35) discrimination of human and macaque faces of three ages (young infants, old infants, and adults) in a visual paired comparison task. We found that 6- to 7-month-olds were best at discriminating adult macaque faces; however, in the first few seconds of looking, they additionally discriminated familiar face types (same-aged peers and adult human faces), highlighting the importance of experience with certain face categories. The present data suggest that macaque infants possess both experience-independent and experientially tuned face biases. In human infants, early face skills may likewise be driven by both experience and evolutionary relevance; future studies should consider both of these factors.
Affiliation(s)
- Elizabeth A Simpson
- Laboratory of Comparative Ethology, Eunice Kennedy Shriver National Institute of Child Health and Human Development, National Institutes of Health, Department of Health and Human Services, Poolesville, MD, USA; Dipartimento di Neuroscienze, Università di Parma, Parma, Italy
- Stephen J Suomi
- Laboratory of Comparative Ethology, Eunice Kennedy Shriver National Institute of Child Health and Human Development, National Institutes of Health, Department of Health and Human Services, Poolesville, MD, USA
- Annika Paukner
- Laboratory of Comparative Ethology, Eunice Kennedy Shriver National Institute of Child Health and Human Development, National Institutes of Health, Department of Health and Human Services, Poolesville, MD, USA
| |
Collapse
|
30
|
Fisher K, Towler J, Eimer M. Effects of contrast inversion on face perception depend on gaze location: Evidence from the N170 component. Cogn Neurosci 2015; 7:128-37. [DOI: 10.1080/17588928.2015.1053441] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/23/2022]
Affiliation(s)
- Katie Fisher
- Department of Psychological Sciences, Birkbeck College, University of London, London, UK
- John Towler
- Department of Psychological Sciences, Birkbeck College, University of London, London, UK
- Martin Eimer
- Department of Psychological Sciences, Birkbeck College, University of London, London, UK
31
de Lissa P, Sörensen S, Badcock N, Thie J, McArthur G. Measuring the face-sensitive N170 with a gaming EEG system: A validation study. J Neurosci Methods 2015; 253:47-54. [PMID: 26057115 DOI: 10.1016/j.jneumeth.2015.05.025] [Citation(s) in RCA: 35] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/02/2015] [Revised: 05/09/2015] [Accepted: 05/26/2015] [Indexed: 11/30/2022]
Abstract
BACKGROUND: The N170 is a "face-sensitive" event-related potential (ERP) that occurs at around 170 ms over occipito-temporal brain regions. The N170's potential to provide insight into the neural processing of faces in certain populations (e.g., children and adults with cognitive impairments) is limited by its measurement in scientific laboratories that can appear threatening to some people.
NEW METHOD: The advent of cheap, easy-to-use portable gaming EEG systems provides an opportunity to record EEG in new contexts and populations. This study tested the validity of the face-sensitive N170 ERP measured with an adapted commercial EEG system (the Emotiv EPOC) that is used at home by gamers.
RESULTS: The N170 recorded through both the gaming EEG system and the research EEG system exhibited face-sensitivity, with larger mean amplitudes in response to the face stimuli than the non-face stimuli, and a delayed N170 peak in response to face inversion.
COMPARISON WITH EXISTING METHOD: The EPOC system produced very similar N170 ERPs to a research-grade Neuroscan system, and was capable of recording face-sensitivity in the N170, validating its use as a research tool in this arena.
CONCLUSIONS: This opens new possibilities for measuring the face-sensitive N170 ERP in people who cannot travel to a traditional ERP laboratory (e.g., elderly people in care), who cannot tolerate laboratory conditions (e.g., people with autism), or who need to be tested in situ for practical or experimental reasons (e.g., children in schools).
Affiliation(s)
- Peter de Lissa
- Department of Cognitive Science, ARC Centre of Excellence in Cognition and its Disorders, Macquarie University, Sydney, NSW, Australia.
- Sidsel Sörensen
- Department of Cognitive Science, ARC Centre of Excellence in Cognition and its Disorders, Macquarie University, Sydney, NSW, Australia
- Nicholas Badcock
- Department of Cognitive Science, ARC Centre of Excellence in Cognition and its Disorders, Macquarie University, Sydney, NSW, Australia
- Johnson Thie
- School of Electrical and Information Engineering, University of Sydney, Sydney, NSW, Australia
- Genevieve McArthur
- Department of Cognitive Science, ARC Centre of Excellence in Cognition and its Disorders, Macquarie University, Sydney, NSW, Australia
32
The N170 and face perception in psychiatric and neurological disorders: A systematic review. Clin Neurophysiol 2014; 126:1141-1158. [PMID: 25306210 DOI: 10.1016/j.clinph.2014.09.015] [Citation(s) in RCA: 81] [Impact Index Per Article: 8.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/10/2014] [Revised: 09/08/2014] [Accepted: 09/10/2014] [Indexed: 12/17/2022]
Abstract
OBJECTIVE: To systematically evaluate evidence for configural and affective face processing abnormalities, as measured by the N170 and Vertex Positive Potential (VPP) event-related potential components and the analogous M170 magnetoencephalography (MEG) component, in neurological and psychiatric disorders.
METHODS: 1251 unique articles were identified using the PsychINFO and PubMed databases. Sixty-seven studies were selected for review, which employed various tasks to measure the N170, M170 or VPP; the 13 neurological/psychiatric conditions were Attention-Deficit Hyperactivity Disorder (ADHD), Alcohol Dependence, Alzheimer's Disease, Autism Spectrum Disorders (ASDs), Bipolar Disorder, Bulimia Nervosa, Fibromyalgia, Huntington's Disease, Major Depressive Disorder, Parkinson's Disease, Prosopagnosia, Schizophrenia and Social Phobia.
RESULTS: Smaller N170 and VPP amplitudes to faces compared to healthy controls were consistently reported in Schizophrenia but not in ASDs. In Schizophrenia, N170 and VPP measures were not correlated with clinical symptoms. Findings from other disorders were highly inconsistent; however, reported group differences were almost always smaller amplitudes or slower latencies to emotional faces in disordered groups, regardless of diagnosis.
CONCLUSIONS: Results suggest that N170/VPP abnormalities index non-specific facial affect processing dysfunction in these neurological and psychiatric conditions, reflecting social impairments that are broadly characteristic of these groups.
SIGNIFICANCE: The N170 and analogous components hold promise as diagnostic and treatment-monitoring biomarkers for social dysfunction.