1
Cassidy LC, Bethell EJ, Brockhausen RR, Boretius S, Treue S, Pfefferle D. The Dot-Probe Attention Bias Task as a Method to Assess Psychological Well-Being after Anesthesia: A Study with Adult Female Long-Tailed Macaques (Macaca fascicularis). Eur Surg Res 2023; 64:37-53. PMID: 34915502; PMCID: PMC9909723; DOI: 10.1159/000521440.
Abstract
Understanding the impact that routine research and laboratory procedures have on animals is crucial to improving their well-being and to the success and reproducibility of the research they are involved in. Cognitive measures of welfare offer insight into animals' internal psychological state, but require validation. Attention bias - the tendency to attend to one type of information over another - is a cognitive phenomenon documented in humans and animals that is known to be modulated by affective state (i.e., emotions). Hence, changes in attention bias may offer researchers a deeper perspective on their animals' psychological well-being. The dot-probe task is an established method for quantifying attention bias in humans (by measuring reaction time to a dot-probe replacing pairs of stimuli), but has yet to be validated in animals. We developed a dot-probe task for long-tailed macaques (Macaca fascicularis) to determine whether the task can detect changes in attention bias following anesthesia, a context known to modulate attention and trigger physiological arousal in macaques. Our task included the following features: stimulus pairs of threatening and neutral facial expressions of conspecifics and their scrambled counterparts, two stimulus durations (100 and 1,000 ms), and counterbalancing of the dot-probe's position on the touchscreen (left and right) and its location relative to the threatening stimulus. We tested 8 group-housed adult females on different days relative to being anesthetized (baseline and 1, 3, 7, and 14 days after). At baseline, monkeys were vigilant to threatening content when stimulus pairs were presented for 100 ms, but not 1,000 ms. On the day immediately following anesthesia, we found evidence that attention bias changed to an avoidance of threatening content. Attention bias returned to threat vigilance by the third day postanesthesia and remained so up to the last day of testing (14 days after anesthesia). We also found that attention bias was independent of the type of stimulus pair (i.e., whole faces vs. their scrambled counterparts), suggesting that the scrambled stimuli retained aspects of the original stimuli. Nevertheless, whole faces were more salient to the monkeys, as responses on these trials were generally slower than on scrambled-pair trials. Overall, our study suggests it is feasible to detect changes in attention bias following anesthesia using the dot-probe task in nonhuman primates. Our results also reveal important aspects of stimulus preparation and experimental design.
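The dot-probe bias measure described in this abstract can be sketched numerically: the bias score is the mean reaction time when the probe replaces the neutral stimulus minus the mean reaction time when it replaces the threatening one. The following is a minimal illustration, not the authors' analysis code; the function name and reaction times are hypothetical:

```python
from statistics import mean

def attention_bias_score(rt_congruent, rt_incongruent):
    """Dot-probe bias score in ms: mean RT on incongruent trials (probe
    replaces the neutral stimulus) minus mean RT on congruent trials
    (probe replaces the threatening stimulus). Positive values indicate
    vigilance toward threat; negative values indicate avoidance."""
    return mean(rt_incongruent) - mean(rt_congruent)

# Hypothetical reaction times in ms (not data from the study)
congruent = [412, 398, 430, 405]    # probe appeared where the threat was
incongruent = [445, 460, 438, 452]  # probe appeared where the neutral face was
print(attention_bias_score(congruent, incongruent))  # positive -> threat vigilance
```

A score computed this way flips sign under threat avoidance, which is the pattern the study reports on the day after anesthesia.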
Affiliation(s)
- Lauren C Cassidy
- Welfare and Cognition Group, Cognitive Neuroscience Laboratory, German Primate Center-Leibniz Institute for Primate Research, Goettingen, Germany
- Leibniz-Science Campus Primate Cognition, German Primate Center, University of Goettingen, Goettingen, Germany
- Emily J Bethell
- Liverpool John Moores University, Research Centre in Evolutionary Anthropology and Palaeoecology, Liverpool, UK
- Liverpool John Moores University, Research Centre in Brain and Behaviour, Liverpool, UK
- Ralf R Brockhausen
- Welfare and Cognition Group, Cognitive Neuroscience Laboratory, German Primate Center-Leibniz Institute for Primate Research, Goettingen, Germany
- Susann Boretius
- Leibniz-Science Campus Primate Cognition, German Primate Center, University of Goettingen, Goettingen, Germany
- Functional Imaging Laboratory, German Primate Center-Leibniz Institute for Primate Research, Goettingen, Germany
- Stefan Treue
- Welfare and Cognition Group, Cognitive Neuroscience Laboratory, German Primate Center-Leibniz Institute for Primate Research, Goettingen, Germany
- Leibniz-Science Campus Primate Cognition, German Primate Center, University of Goettingen, Goettingen, Germany
- Dana Pfefferle
- Welfare and Cognition Group, Cognitive Neuroscience Laboratory, German Primate Center-Leibniz Institute for Primate Research, Goettingen, Germany
- Leibniz-Science Campus Primate Cognition, German Primate Center, University of Goettingen, Goettingen, Germany
2
Naples AJ, Foss-Feig JH, Wolf JM, Srihari VH, McPartland JC. Predictability modulates neural response to eye contact in ASD. Mol Autism 2022; 13:42. PMID: 36309762; PMCID: PMC9618208; DOI: 10.1186/s13229-022-00519-0.
Abstract
BACKGROUND Deficits in establishing and maintaining eye-contact are early and persistent vulnerabilities of autism spectrum disorder (ASD), and the neural bases of these deficits remain elusive. A promising hypothesis is that social features of autism may reflect difficulties in making predictions about the social world under conditions of uncertainty. However, no research in ASD has examined how predictability impacts the neural processing of eye-contact in naturalistic interpersonal interactions. METHOD We used eye tracking to facilitate an interactive social simulation wherein onscreen faces established eye-contact when the participant looked at them. In Experiment One, receipt of eye-contact was unpredictable; in Experiment Two, it was predictable. Neural response to eye-contact was measured via the N170 and P300 event-related potentials (ERPs). Experiment One included 23 ASD and 46 typically developing (TD) adult participants; Experiment Two included 25 ASD and 43 TD adult participants. RESULTS When receipt of eye-contact was unpredictable, individuals with ASD showed increased N170 and increased, but non-specific, P300 responses. The magnitude of the N170 response correlated with measures of sensory and anxiety symptomatology, such that increased response to eye-contact was associated with increased symptomatology. However, when receipt of eye-contact was predictable, individuals with ASD, relative to controls, exhibited slower N170s and no differences in the amplitude of the N170 or P300. LIMITATIONS Our ASD sample was composed of adults with IQ > 70 and included only four autistic women. Thus, further research is needed to evaluate how these results generalize across the spectrum of age, sex, and cognitive ability. Additionally, as analyses were exploratory, some findings failed to survive false-discovery rate adjustment. CONCLUSIONS Neural response to eye-contact in ASD ranged from attenuated to hypersensitive depending on the predictability of the social context. These findings suggest that vulnerabilities in eye-contact during social interactions in ASD may arise from differences in anticipation and expectation of eye-contact, in addition to the perception of gaze alone.
Affiliation(s)
- Adam J Naples
- Child Study Center, Yale University School of Medicine, New Haven, CT, USA.
- Jennifer H Foss-Feig
- Department of Psychiatry, Mount Sinai Icahn School of Medicine, New York, NY, USA
- Seaver Autism Center for Research and Treatment, Mount Sinai Icahn School of Medicine, New York, NY, USA
- Julie M Wolf
- Child Study Center, Yale University School of Medicine, New Haven, CT, USA
- Vinod H Srihari
- Department of Psychiatry, Yale University School of Medicine, New Haven, CT, USA
- James C McPartland
- Child Study Center, Yale University School of Medicine, New Haven, CT, USA.
- Center for Brain and Mind Health, Yale University School of Medicine, New Haven, CT, USA.
3
EmBody/EmFace as a new open tool to assess emotion recognition from body and face expressions. Sci Rep 2022; 12:14165. PMID: 35986068; PMCID: PMC9391359; DOI: 10.1038/s41598-022-17866-w.
Abstract
Nonverbal expressions contribute substantially to social interaction by providing information on another person's intentions and feelings. While emotion recognition from dynamic facial expressions has been widely studied, dynamic body expressions and the interplay of emotion recognition from facial and body expressions have attracted less attention, as suitable diagnostic tools are scarce. Here, we provide validation data on a new open-source paradigm enabling the assessment of emotion recognition from both 3D-animated emotional body expressions (Task 1: EmBody) and emotionally corresponding dynamic faces (Task 2: EmFace). Both tasks use visually standardized items depicting three emotional states (angry, happy, neutral) and can be used alone or together. We demonstrate successful psychometric matching of the EmBody/EmFace items in a sample of 217 healthy subjects, with excellent retest reliability and validity (correlations with the Reading-the-Mind-in-the-Eyes Test and Autism-Spectrum Quotient, no correlations with intelligence, and confirmed factorial validity). Taken together, the EmBody/EmFace is a novel, efficient (<5 min per task), highly standardized, and reliable tool to sensitively assess and compare emotion recognition from body and face stimuli. It has a wide range of potential applications in affective, cognitive, and social neuroscience, and in clinical research studying face- and body-specific emotion recognition in patient populations with social interaction deficits, such as autism, schizophrenia, or social anxiety.
4
Abstract
This article demonstrates how researchers from both the sciences and the humanities can learn from Charles Darwin's mixed methodology. We identify two basic challenges facing emotion research in the sciences: a mismatch between experiment design and the complexity of life that we aim to explain, and problematic efforts to bridge that gap, including invalid inferences from constrained study designs and the equivocal use of terms like "sympathy" and "empathy" that poorly reflect such methodological constraints. We argue that Darwin's mixed methodology is a model for addressing these challenges even in laboratory work on emotion, because it shows how close observation of emotional phenomena makes sense only within broader historical contexts. The article concludes with five practical research recommendations.
5
Tillman R, Gordon I, Naples A, Rolison M, Leckman JF, Feldman R, Pelphrey KA, McPartland JC. Oxytocin Enhances the Neural Efficiency of Social Perception. Front Hum Neurosci 2019; 13:71. PMID: 30914935; PMCID: PMC6421852; DOI: 10.3389/fnhum.2019.00071.
Abstract
Face perception is a highly conserved process that directs our attention from infancy and is supported by specialized neural circuitry. Oxytocin (OT) can increase the accuracy and detection of emotional faces, but these effects are mediated by valence, individual differences, and context. We investigated the temporal dynamics of OT's influence on the neural substrates of face perception using event-related potentials (ERPs). In a double-blind, placebo-controlled, within-subject design, 21 healthy male adults inhaled OT or placebo and underwent ERP recording during two face-processing tasks. Experiment 1 investigated effects of OT on neural correlates of fearful vs. neutral facial expressions, and Experiment 2 manipulated point-of-gaze to neutral faces. In Experiment 1, we found that OT reduced N170 latency to fearful faces. In Experiment 2, N170 latency was decreased when participant gaze was directed to the eyes of neutral faces; however, there were no OT-associated effects in response to different facial features. Findings suggest OT modulates early stages of social perception for socially complex information, such as emotional relative to neutral faces. These results are consistent with models suggesting OT impacts the salience of socially informative cues during processing, which leads to downstream effects on behavior. Future work should examine how OT affects neural processes underlying basic components of social behavior (such as face perception) while varying the emotional expression of stimuli or comparing different characteristics of participants (e.g., gender, personality traits).
Affiliation(s)
- Rachael Tillman
- Department of Psychology, University of Maryland, College Park, College Park, MD, United States
- Ilanit Gordon
- Yale Child Study Center, School of Medicine, Yale University, New Haven, CT, United States
- Department of Psychology, Bar-Ilan University, Ramat Gan, Israel
- Adam Naples
- Yale Child Study Center, School of Medicine, Yale University, New Haven, CT, United States
- Max Rolison
- Yale Child Study Center, School of Medicine, Yale University, New Haven, CT, United States
- James F Leckman
- Yale Child Study Center, School of Medicine, Yale University, New Haven, CT, United States
- Ruth Feldman
- Yale Child Study Center, School of Medicine, Yale University, New Haven, CT, United States
- Department of Psychology, Interdisciplinary Center (IDC) Herzliya, Herzliya, Israel
- Kevin A Pelphrey
- Harrison-Wood Jefferson Scholars Foundation Professor, University of Virginia, Charlottesville, VA, United States
- James C McPartland
- Yale Child Study Center, School of Medicine, Yale University, New Haven, CT, United States
6
Naples AJ, Wu J, Mayes LC, McPartland JC. Event-related potentials index neural response to eye contact. Biol Psychol 2017; 127:18-24. PMID: 28396215; DOI: 10.1016/j.biopsycho.2017.04.006.
Abstract
Sensitivity to eye-contact is a foundation upon which social cognition is built. However, there are no known neural markers characterizing the response to reciprocal gaze. Using co-registered EEG and eye tracking, we measured brain activity while participants viewed faces that responded to their looking patterns. Contingent upon participant gaze, onscreen faces opened their eyes or mouths; in this way, we measured brain response to reciprocal eye-contact. We identified two ERP components that were largest in response to reciprocal eye-contact: the N170 and the P300. The magnitude of the components' differences between reciprocal eye-contact and mouth movement predicted self-reported social function: individuals with greater brain response to reciprocal eye-contact reported more normative scores on measures of autistic traits. These results present the first neural markers of eye-contact, revealing that reciprocal eye-contact is identified in less than 500 ms. Furthermore, individual differences in brain response to eye-contact predict meaningful variability in self-reports of social performance.
Affiliation(s)
- Adam J Naples
- Yale University School of Medicine, Child Study Center, 230 South Frontage Road, New Haven, CT, 06520, United States.
- Jia Wu
- Yale University School of Medicine, Child Study Center, 230 South Frontage Road, New Haven, CT, 06520, United States.
- Linda C Mayes
- Yale University School of Medicine, Child Study Center, 230 South Frontage Road, New Haven, CT, 06520, United States.
- James C McPartland
- Yale University School of Medicine, Child Study Center, 230 South Frontage Road, New Haven, CT, 06520, United States.
7
Hakala J, Kätsyri J, Häkkinen J. Stereoscopy Amplifies Emotions Elicited by Facial Expressions. Iperception 2016; 6:2041669515615071. PMID: 27551358; PMCID: PMC4975116; DOI: 10.1177/2041669515615071.
Abstract
Mediated facial expressions do not elicit emotions as strongly as real-life facial expressions, possibly due to the low fidelity of pictorial presentations in typical mediation technologies. In the present study, we investigated the extent to which stereoscopy amplifies emotions elicited by images of neutral, angry, and happy facial expressions. We recorded 40 participants' emotional self-reports of positive valence, negative valence (evaluated separately), and arousal. The magnitude of perceived depth in the stereoscopic images was manipulated by varying the camera base at 15, 40, 65, 90, and 115 mm. The analyses controlled for participants' gender, gender match, emotional empathy, and trait alexithymia. The results indicated that stereoscopy significantly amplified the negative valence and arousal elicited by angry expressions at the most natural (65 mm) camera base, whereas stereoscopy amplified the positive valence elicited by happy expressions in both the narrowed and most natural (15-65 mm) base conditions. Overall, the results indicate that stereoscopy amplifies the emotions elicited by mediated emotional facial expressions when the depth geometry is close to natural. The findings highlight the sensitivity of the visual system to depth and its effect on emotions.
Affiliation(s)
- Jussi Hakala
- Department of Computer Science, Aalto University, Espoo, Finland
- Jari Kätsyri
- Department of Computer Science, Aalto University, Espoo, Finland
- Jukka Häkkinen
- Institute of Behavioural Sciences, University of Helsinki, Finland
8
Abstract
Human faces are fundamentally dynamic, but experimental investigations of face perception have traditionally relied on static images of faces. Although naturalistic videos of actors have been used with success in some contexts, much research in neuroscience and psychophysics demands carefully controlled stimuli. In this article, we describe a novel set of computer-generated, dynamic face stimuli. These grayscale faces are tightly controlled for low- and high-level visual properties. All faces are standardized in terms of size, luminance, location, and the size of facial features. Each face begins with a neutral pose and transitions to an expression over the course of 30 frames. Altogether, 222 stimuli were created, spanning three different categories of movement: (1) an affective movement (fearful face), (2) a neutral movement (close-lipped, puffed cheeks with open eyes), and (3) a biologically impossible movement (upward dislocation of eyes and mouth). To determine whether early brain responses sensitive to low-level visual features differed between the expressions, we measured the occipital P100 event-related potential, which is known to reflect differences in early stages of visual processing, and the N170, which reflects structural encoding of faces. We found no differences between the faces at the P100, indicating that different face categories were well matched on low-level image properties. This database provides researchers with a well-controlled set of dynamic faces, controlled for low-level image characteristics, that are applicable to a range of research questions in social perception.
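The standardization this abstract describes (matching faces on size, luminance, and feature placement) can be illustrated for one of those properties. The sketch below is not the authors' pipeline; the function name and the nested-list image representation are assumptions for illustration:

```python
def match_mean_luminance(image, target_mean):
    """Shift a grayscale image (nested lists of 0-255 pixel values) so its
    mean luminance equals target_mean, clipping to the valid range.
    (Clipping extreme pixels can move the achieved mean slightly off target.)"""
    pixels = [p for row in image for p in row]
    offset = target_mean - sum(pixels) / len(pixels)
    return [[min(255.0, max(0.0, p + offset)) for p in row] for row in image]

# A uniform 2x2 patch at luminance 100 shifted to a target mean of 128
patch = [[100, 100], [100, 100]]
print(match_mean_luminance(patch, 128))
```

Equating stimuli on such low-level properties is what allows the authors to interpret a null P100 difference as evidence that the face categories were well matched.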
9
Shahrestani S, Kemp AH, Guastella AJ. The impact of a single administration of intranasal oxytocin on the recognition of basic emotions in humans: a meta-analysis. Neuropsychopharmacology 2013; 38:1929-36. PMID: 23575742; PMCID: PMC3746698; DOI: 10.1038/npp.2013.86.
Abstract
Many studies have highlighted the potential of oxytocin (OT) to enhance facial affect recognition in healthy humans. However, inconsistencies have emerged with regard to the influence of OT on the recognition of specific emotional expressions (happy, angry, fear, surprise, disgust, and sadness). In this study, we conducted a meta-analysis of seven studies comprising 381 research participants (71 females) examining responses to the basic emotion types to assess whether OT enhances the recognition of emotion from human faces and whether this was influenced by the emotion expression and exposure time of the face. Results showed that intranasal OT administration enhances emotion recognition of faces overall, with a Hedges g effect size of 0.29. When analysis was restricted to facial expression types, significant effects of OT on recognition accuracy were specifically found for the recognition of happy and fear faces. We also found that effect sizes increased to moderate when exposure time of the photograph was restricted to early phase recognition (< 300 ms) for happy and angry faces, or later phase recognition for fear faces (> 300 ms). The results of the meta-analysis further suggest that OT has potential as a treatment to improve the recognition of emotion in faces, allowing individuals to improve their insight into the intentions, desires, and mental states of others.
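The Hedges g effect size reported in this abstract is Cohen's d with a small-sample correction. A minimal sketch of that computation, using illustrative group statistics rather than values from the meta-analysis:

```python
import math

def hedges_g(mean1, mean2, sd1, sd2, n1, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp           # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)    # correction factor J
    return d * j

# Illustrative accuracy means/SDs for OT vs. placebo groups (hypothetical)
print(round(hedges_g(0.75, 0.70, 0.17, 0.17, 190, 191), 2))  # prints 0.29
```

The correction factor J shrinks d slightly, which matters most for the small per-study samples typical of intranasal OT trials.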
Affiliation(s)
- Sara Shahrestani
- Brain and Mind Research Institute, The University of Sydney, Camperdown, NSW, Australia
- Andrew H Kemp
- SCAN Research and Teaching Unit, School of Psychology, The University of Sydney, Sydney, NSW, Australia
- Adam J Guastella
- Brain and Mind Research Institute, The University of Sydney, Camperdown, NSW, Australia
10
Cecchini M, Aceto P, Altavilla D, Palumbo L, Lai C. The role of the eyes in processing an intact face and its scrambled image: a dense array ERP and low-resolution electromagnetic tomography (sLORETA) study. Soc Neurosci 2013; 8:314-25. PMID: 23706064; DOI: 10.1080/17470919.2013.797020.
Abstract
The aim of the present study was to test whether the eyes of an intact face produce a specific brain response compared to the mouth, nose, or hair, and whether this specificity is also maintained in a scrambled face. Fifteen subjects were asked to focus visual attention on global and single elements in intact faces and in their scrambled images. EEG data were recorded with a 256-channel HydroCel Geodesic Sensor Net. Event-related potential (ERP) analyses showed a difference between the intact face and the scrambled face from the N170 component until 600 ms on the occipito-temporal montage, and at 400-600 ms on the frontal montage. Only the eyes showed a difference between conditions (intact/scrambled face) at 500 ms. The most activated source detected by sLORETA was the right middle temporal gyrus (BA21) for both conditions and for all elements. The left BA21 showed significantly more activation in response to eyes in the intact face compared to eyes in the scrambled face at 500 ms. The left BA21 has a central role in high-level visual processing and in understanding others' intentions. These findings suggest a specificity of the eyes and indicate that the eyes play a social and communicative role in comprehending the nonverbal intentions of others only when embedded in an intact face.
Affiliation(s)
- Marco Cecchini
- Department of Dynamic and Clinical Psychology, Sapienza University of Rome, Roma, Italy
11
Kovács-Bálint Z, Bereczkei T, Hernádi I. The telltale face: possible mechanisms behind defector and cooperator recognition revealed by emotional facial expression metrics. Br J Psychol 2012; 104:563-76. PMID: 24094284; DOI: 10.1111/bjop.12007.
Abstract
In this study, we investigated the role of facial cues in cooperator and defector recognition. First, a face-image database was constructed from pairs of full-face portraits of target subjects taken at the moment of decision-making in a prisoner's dilemma game (PDG) and in a preceding neutral task. Image pairs with no deficiencies (n = 67) were standardized for orientation and luminance. Then, confidence in defector and cooperator recognition was tested by image rating in a different group of lay judges (n = 62). Results indicate that (1) defectors were better recognized (58% vs. 47%), (2) they looked different from cooperators (p < .01), (3) males but not females evaluated the images with a relative bias towards the cooperator category (p < .01), and (4) females were more confident in detecting defectors (p < .05). According to facial microexpression analysis, defection was strongly linked with depressed lower lips and less-opened eyes. A significant correlation was found between the intensity of microexpressions and the rating of images along the cooperator-defector dimension. In summary, facial expressions can be considered reliable indicators of momentary social dispositions in the PDG. Females may exhibit an evolutionarily based overestimation bias in detecting social visual cues of the defector face.
12
Carlson CA, Gronlund SD, Weatherford DR, Carlson MA. Processing Differences between Feature-Based Facial Composites and Photos of Real Faces. Appl Cogn Psychol 2012. DOI: 10.1002/acp.2824.
13
McPartland J, Cheung CHM, Perszyk D, Mayes LC. Face-related ERPs are modulated by point of gaze. Neuropsychologia 2010; 48:3657-60. PMID: 20654637; DOI: 10.1016/j.neuropsychologia.2010.07.020.
Abstract
This study examined the influence of gaze fixation on face-sensitive ERPs. A fixation crosshair presented prior to face onset directed visual attention to upper, central, or lower face regions while ERPs were recorded. This manipulation modulated a face-sensitive component (N170) but not an early sensory component (P1). Upper and lower face fixations elicited enhanced N170 amplitude and longer N170 latency. Results expand upon extant hemodynamic research by demonstrating early effects at basic stages of face processing. These findings distinguish attention to facial features in context from attention to isolated features, and they inform electrophysiological studies of face processing in clinical populations.
Affiliation(s)
- James McPartland
- Yale Child Study Center, Yale School of Medicine, New Haven, CT 06520, USA.