1. Ghost on the Windshield: Employing a Virtual Human Character to Communicate Pedestrian Acknowledgement and Vehicle Intention. Information 2022. DOI: 10.3390/info13090420.
Abstract
Pedestrians base their street-crossing decisions on vehicle-centric as well as driver-centric cues. In the future, however, drivers of autonomous vehicles will be preoccupied with non-driving-related activities and will thus be unable to provide pedestrians with relevant communicative cues. External human–machine interfaces (eHMIs) hold promise for filling the expected communication gap by providing information about a vehicle’s situational awareness and intention. In this paper, we present an eHMI concept that employs a virtual human character (VHC) to communicate pedestrian acknowledgement and vehicle intention (non-yielding; cruising; yielding). Pedestrian acknowledgement is communicated via gaze direction, while vehicle intention is communicated via facial expression. The effectiveness of the proposed anthropomorphic eHMI concept was evaluated in a monitor-based laboratory experiment in which the participants performed a crossing intention task (self-paced, two-alternative forced choice) and their accuracy in making appropriate street-crossing decisions was measured. In each trial, they were first presented with a 3D animated sequence of a VHC (male; female) that either looked directly at them or clearly to their right while producing either an emotional (smile; angry expression; surprised expression), a conversational (nod; head shake), or a neutral (neutral expression; cheek puff) facial expression. Then, the participants were asked to imagine they were pedestrians intending to cross a one-way street at a random uncontrolled location when they saw an autonomous vehicle equipped with the eHMI approaching from the right, and to indicate via mouse click whether they would cross the street in front of the oncoming vehicle or not. An implementation of the proposed concept where non-yielding intention is communicated via the VHC producing either an angry expression, a surprised expression, or a head shake; cruising intention is communicated via the VHC puffing its cheeks; and yielding intention is communicated via the VHC nodding, was shown to be highly effective in ensuring the safety of a single pedestrian or even two co-located pedestrians without compromising traffic flow in either case. The implications for the development of intuitive, culture-transcending eHMIs that can support multiple pedestrians in parallel are discussed.
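The signalling scheme reported as most effective reduces to a small lookup from the character's facial expression to a vehicle intention. The following minimal Python sketch illustrates that mapping; the names are invented, and treating acknowledgement (direct gaze) as an additional precondition for crossing is an assumption made for the example, not a rule stated in the abstract.

from enum import Enum

class Intention(Enum):
    NON_YIELDING = "non-yielding"
    CRUISING = "cruising"
    YIELDING = "yielding"

# Implementation reported as highly effective in the study: expression -> intention.
EXPRESSION_TO_INTENTION = {
    "angry": Intention.NON_YIELDING,
    "surprised": Intention.NON_YIELDING,
    "head_shake": Intention.NON_YIELDING,
    "cheek_puff": Intention.CRUISING,
    "nod": Intention.YIELDING,
}

def crossing_decision(expression: str, gaze_on_me: bool) -> bool:
    """True if crossing appears safe: the vehicle is yielding and the virtual
    character's gaze signals that this pedestrian has been acknowledged
    (the gaze precondition is an illustrative assumption)."""
    intention = EXPRESSION_TO_INTENTION.get(expression)
    return intention is Intention.YIELDING and gaze_on_me

print(crossing_decision("nod", gaze_on_me=True))         # True  -> cross
print(crossing_decision("cheek_puff", gaze_on_me=True))  # False -> wait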
2. Type of Task Instruction Enhances the Role of Face and Context in Emotion Perception. Journal of Nonverbal Behavior 2021. DOI: 10.1007/s10919-021-00383-1.
3. Hudac CM, Naples A, DesChamps TD, Coffman MC, Kresse A, Ward T, Mukerji C, Aaronson B, Faja S, McPartland JC. Modeling temporal dynamics of face processing in youth and adults. Soc Neurosci 2021; 16:345-361. PMID: 33882266; PMCID: PMC8324546; DOI: 10.1080/17470919.2021.1920050.
Abstract
A hierarchical model of temporal dynamics was examined in adults (n = 34) and youth (n = 46) across the stages of face processing during the perception of static and dynamic faces. Three ERP components (P100, N170, N250) and spectral power in the mu range were extracted, corresponding to cognitive stages of face processing: low-level visual processing, structural encoding, higher-order processing, and action understanding. Youth and adults exhibited similar yet distinct patterns of hierarchical temporal dynamics such that earlier cognitive stages predicted later stages, directly and indirectly. However, latent factors indicated unique profiles related to behavioral performance for adults and youth, and age as a continuous factor. The application of path analysis to electrophysiological data can yield novel insights into the cortical dynamics of social information processing.
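The hierarchical structure described here, in which earlier ERP stages predict later ones directly and indirectly, can be approximated outside a full latent-variable model as a chain of ordinary regressions. A simplified sketch on simulated data (statsmodels rather than the authors' path-analysis software; mu power and latent factors are omitted):

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 80                                              # hypothetical sample size
p100 = rng.normal(size=n)                           # low-level visual processing
n170 = 0.6 * p100 + rng.normal(scale=0.8, size=n)   # structural encoding
n250 = 0.5 * n170 + rng.normal(scale=0.8, size=n)   # higher-order processing

# Path 1: P100 -> N170.
path1 = sm.OLS(n170, sm.add_constant(p100)).fit()
# Path 2: N170 -> N250, with P100 included so its direct effect on N250 is
# separated from the indirect path running through N170.
path2 = sm.OLS(n250, sm.add_constant(np.column_stack([n170, p100]))).fit()

print(path1.params)   # intercept, P100 coefficient
print(path2.params)   # intercept, N170 coefficient, P100 coefficient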
Affiliation(s)
- Caitlin M Hudac
- Center for Youth Development and Intervention and Department of Psychology, University of Alabama, Tuscaloosa, AL, USA
- Adam Naples
- Yale Child Study Center, Yale University, New Haven, CT, USA
- Trent D DesChamps
- Department of Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA, USA
- Marika C Coffman
- Center for Autism and Brain Development and Department of Psychiatry and Behavioral Sciences, Duke University, Durham, NC, USA
- Anna Kresse
- Department of Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA, USA
- Tracey Ward
- Department of Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA, USA; The Seattle Clinic, Seattle, WA, USA
- Cora Mukerji
- Boston Children's Hospital, Harvard Medical School, Boston, MA, USA
- Benjamin Aaronson
- Department of Pediatrics, University of Washington, Seattle, WA, USA
- Raphael Bernier
- Department of Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA, USA
4. Kala S, Rolison MJ, Trevisan DA, Naples AJ, Pelphrey K, Ventola P, McPartland JC. Brief Report: Preliminary Evidence of the N170 as a Biomarker of Response to Treatment in Autism Spectrum Disorder. Front Psychiatry 2021; 12:709382. PMID: 34267691; PMCID: PMC8275957; DOI: 10.3389/fpsyt.2021.709382.
Abstract
Background: Autism Spectrum Disorder (ASD) is a neurodevelopmental disorder characterized by primary difficulties in social function. Individuals with ASD display slowed neural processing of faces, as indexed by the latency of the N170, a face-sensitive event-related potential. Currently, there are no objective biomarkers of ASD useful in clinical care or research. Efficacy of behavioral treatment is currently evaluated through subjective clinical impressions. To explore whether the N170 might have utility as an objective index of treatment response, we examined the N170 before and after receipt of an empirically validated behavioral treatment in children with ASD. Method: Electroencephalography (EEG) data were obtained on a preliminary cohort of preschool-aged children with ASD before and after a 16-week course of pivotal response treatment (PRT) and in a subset of participants in waitlist control (16 weeks before the start of PRT) and follow-up (16 weeks after the end of PRT). EEG was recorded while participants viewed computer-generated faces with neutral and fearful affect. Results: Significant reductions in N170 latency to faces were observed following 16 weeks of PRT intervention. Change in N170 latency was not observed in the waitlist-control condition. Conclusions: This exploratory study offers suggestive evidence that N170 latency may index response to behavioral treatment. Future, more rigorous, studies in larger samples are indicated to evaluate whether the N170 may be useful as a biomarker of treatment response.
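The outcome measure here is the latency of a negative-going ERP peak in a post-stimulus search window, compared within participants across time points. A rough sketch on synthetic data; the 130-200 ms window and 500 Hz sampling rate are assumptions, not parameters reported in the paper.

import numpy as np
from scipy.stats import ttest_rel

def n170_latency_ms(erp, sfreq=500.0, window_ms=(130, 200)):
    """Latency (ms) of the most negative point within the N170 search window."""
    t = np.arange(erp.size) / sfreq * 1000.0          # ms from stimulus onset
    mask = (t >= window_ms[0]) & (t <= window_ms[1])
    return float(t[mask][np.argmin(erp[mask])])

# Hypothetical per-participant averaged waveforms, pre and post treatment.
rng = np.random.default_rng(1)
pre = rng.normal(size=(20, 300))                      # 20 children x 300 samples
post = rng.normal(size=(20, 300))

lat_pre = np.array([n170_latency_ms(w) for w in pre])
lat_post = np.array([n170_latency_ms(w) for w in post])
print(ttest_rel(lat_pre, lat_post))                   # paired test of latency change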
Affiliation(s)
- Shashwat Kala
- Child Study Center, Yale School of Medicine, New Haven, CT, United States
- Max J. Rolison
- Child Study Center, Yale School of Medicine, New Haven, CT, United States
- Adam J. Naples
- Child Study Center, Yale School of Medicine, New Haven, CT, United States
- Kevin Pelphrey
- Department of Neurology, University of Virginia, Charlottesville, VA, United States
- Pamela Ventola
- Child Study Center, Yale School of Medicine, New Haven, CT, United States
5. Desai A, Foss-Feig JH, Naples AJ, Coffman M, Trevisan DA, McPartland JC. Autistic and alexithymic traits modulate distinct aspects of face perception. Brain Cogn 2019; 137:103616. PMID: 31734588; DOI: 10.1016/j.bandc.2019.103616.
Abstract
Background: Atypical face processing is a prominent feature of autism spectrum disorder (ASD) but is not universal and is subject to individual variability. This heterogeneity could be accounted for by reliable yet unidentified subgroups within the diverse population of individuals with ASD. Alexithymia, which is characterized by difficulties in emotion recognition and identification, serves as a potential grouping factor. Recent research demonstrates that emotion recognition impairments in ASD are predicted by its comorbidity with alexithymia. The current study assessed the relative influence of autistic versus alexithymic traits on neural indices of face and emotion perception. Methods: Capitalizing upon the temporal sensitivity of event-related potentials (ERPs), we investigated the distinct contributions of alexithymic versus autistic traits at specific stages of emotional face processing in 27 typically developing adults (18 female). ERP components reflecting sequential stages of perceptual processing (P100, N170, and N250) were recorded in response to fearful and neutral faces. Results: Autistic traits were associated with structural encoding of faces (N170), whereas alexithymic traits were associated with more complex emotion decoding (N250). Conclusions: These findings have important implications for deconstructing heterogeneity within ASD.
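The reported dissociation amounts to regressing each ERP component on both trait measures and asking which predictor carries the effect. A hypothetical sketch (data and column names are invented; AQ and TAS-20 scores are assumed stand-ins for the trait measures):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 27
df = pd.DataFrame({
    "aq": rng.normal(size=n),         # autistic traits (assumed AQ-like score)
    "tas": rng.normal(size=n),        # alexithymic traits (assumed TAS-20-like score)
    "n170_amp": rng.normal(size=n),   # structural encoding of faces
    "n250_amp": rng.normal(size=n),   # more complex emotion decoding
})

# Each component is modeled with both trait measures as simultaneous predictors.
m_n170 = smf.ols("n170_amp ~ aq + tas", data=df).fit()
m_n250 = smf.ols("n250_amp ~ aq + tas", data=df).fit()
print(m_n170.params)
print(m_n250.params)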
Affiliation(s)
- Aishani Desai
- Yale Child Study Center, Yale University, New Haven, CT 06519, United States; Department of Psychology, Macquarie University, Sydney, Australia
- Jennifer H Foss-Feig
- Yale Child Study Center, Yale University, New Haven, CT 06519, United States; Department of Psychiatry and Seaver Autism Center for Research and Treatment, Icahn School of Medicine at Mount Sinai, New York, NY, United States
- Adam J Naples
- Yale Child Study Center, Yale University, New Haven, CT 06519, United States
- Marika Coffman
- Yale Child Study Center, Yale University, New Haven, CT 06519, United States; Department of Psychology, Virginia Polytechnic Institute and State University, Blacksburg, VA 24060, United States
- Dominic A Trevisan
- Yale Child Study Center, Yale University, New Haven, CT 06519, United States
- James C McPartland
- Yale Child Study Center, Yale University, New Haven, CT 06519, United States
6. Tillman R, Gordon I, Naples A, Rolison M, Leckman JF, Feldman R, Pelphrey KA, McPartland JC. Oxytocin Enhances the Neural Efficiency of Social Perception. Front Hum Neurosci 2019; 13:71. PMID: 30914935; PMCID: PMC6421852; DOI: 10.3389/fnhum.2019.00071.
Abstract
Face perception is a highly conserved process that directs our attention from infancy and is supported by specialized neural circuitry. Oxytocin (OT) can increase accuracy and detection of emotional faces, but these effects are mediated by valence, individual differences, and context. We investigated the temporal dynamics of OT’s influence on the neural substrates of face perception using event-related potentials (ERPs). In a double-blind, placebo-controlled, within-subject design, 21 healthy male adults inhaled OT or placebo and underwent ERP imaging during two face processing tasks. Experiment 1 investigated effects of OT on neural correlates of fearful vs. neutral facial expressions, and Experiment 2 manipulated point-of-gaze to neutral faces. In Experiment 1, we found that OT reduced N170 latency to fearful faces. In Experiment 2, N170 latency was decreased when participant gaze was directed to the eyes of neutral faces; however, there were no OT-associated effects in response to different facial features. Findings suggest OT modulates early stages of social perception for socially complex information such as emotional faces relative to neutral faces. These results are consistent with models suggesting OT impacts the salience of socially informative cues during processing, which leads to downstream effects in behavior. Future work should examine how OT affects neural processes underlying basic components of social behavior (such as face perception) while varying emotional expression of stimuli or comparing different characteristics of participants (e.g., gender, personality traits).
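Analytically, the first experiment is a within-subject comparison of N170 latency across drug condition and facial expression. A toy sketch of the condition summary and the key paired contrast (all data and column names are hypothetical):

import numpy as np
import pandas as pd
from scipy.stats import ttest_rel

rng = np.random.default_rng(3)
subjects = np.repeat(np.arange(21), 4)                         # 21 participants
drug = np.tile(["oxytocin", "placebo"], 42)
emotion = np.tile(["fearful", "fearful", "neutral", "neutral"], 21)
latency = rng.normal(170, 10, size=subjects.size)              # simulated N170 latency (ms)

df = pd.DataFrame({"subject": subjects, "drug": drug,
                   "emotion": emotion, "latency_ms": latency})

# Condition means for the 2 (drug) x 2 (expression) within-subject design.
print(df.pivot_table(index="drug", columns="emotion", values="latency_ms"))

# Paired contrast: oxytocin vs. placebo latency to fearful faces.
fear = df[df.emotion == "fearful"].pivot(index="subject", columns="drug",
                                         values="latency_ms")
print(ttest_rel(fear["oxytocin"], fear["placebo"]))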
Affiliation(s)
- Rachael Tillman
- Department of Psychology, University of Maryland, College Park, College Park, MD, United States
- Ilanit Gordon
- Yale Child Study Center, School of Medicine, Yale University, New Haven, CT, United States; Department of Psychology, Bar-Ilan University, Ramat Gan, Israel
- Adam Naples
- Yale Child Study Center, School of Medicine, Yale University, New Haven, CT, United States
- Max Rolison
- Yale Child Study Center, School of Medicine, Yale University, New Haven, CT, United States
- James F Leckman
- Yale Child Study Center, School of Medicine, Yale University, New Haven, CT, United States
- Ruth Feldman
- Yale Child Study Center, School of Medicine, Yale University, New Haven, CT, United States; Department of Psychology, Interdisciplinary Center (IDC) Herzliya, Herzliya, Israel
- Kevin A Pelphrey
- Harrison-Wood Jefferson Scholars Foundation Professor, University of Virginia, Charlottesville, VA, United States
- James C McPartland
- Yale Child Study Center, School of Medicine, Yale University, New Haven, CT, United States
7. Calvo MG, Fernández-Martín A, Recio G, Lundqvist D. Human Observers and Automated Assessment of Dynamic Emotional Facial Expressions: KDEF-dyn Database Validation. Front Psychol 2018; 9:2052. PMID: 30416473; PMCID: PMC6212581; DOI: 10.3389/fpsyg.2018.02052.
Abstract
Most experimental studies of facial expression processing have used static stimuli (photographs), yet facial expressions in daily life are generally dynamic. In its original photographic format, the Karolinska Directed Emotional Faces (KDEF) has been frequently utilized. In the current study, we validate a dynamic version of this database, the KDEF-dyn. To this end, we applied animation between neutral and emotional expressions (happy, sad, angry, fearful, disgusted, and surprised; 1,033-ms unfolding) to 40 KDEF models, with morphing software. Ninety-six human observers categorized the expressions of the resulting 240 video-clip stimuli, and automated face analysis assessed the evidence for 6 expressions and 20 facial action units (AUs) at 31 intensities. Low-level image properties (luminance, signal-to-noise ratio, etc.) and other purely perceptual factors (e.g., size, unfolding speed) were controlled. Patterns of human recognition performance (accuracy, efficiency, and confusions) were consistent with prior research using static and other dynamic expressions. Automated assessment of expressions and AUs was sensitive to intensity manipulations. Significant correlations emerged between human observers' categorization and automated classification. The KDEF-dyn database aims to provide a balance between experimental control and ecological validity for research on emotional facial expression processing. The stimuli and the validation data are available to the scientific community.
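The dynamic stimuli unfold from a neutral to an emotional photograph of the same model over roughly one second. The sketch below approximates that unfolding with a simple linear cross-dissolve; real morphing software also warps facial geometry, and the 30 fps frame rate is an assumption.

import numpy as np

def unfold_expression(neutral, emotional, duration_ms=1033.0, fps=30.0):
    """Yield frames that blend from the neutral image to the emotional image."""
    n_frames = max(2, int(round(duration_ms / 1000.0 * fps)))
    for i in range(n_frames):
        alpha = i / (n_frames - 1)        # 0.0 = neutral, 1.0 = full emotional expression
        yield ((1.0 - alpha) * neutral + alpha * emotional).astype(neutral.dtype)

# Example with dummy same-size 8-bit grayscale images standing in for KDEF photographs.
neutral = np.zeros((256, 256), dtype=np.uint8)
emotional = np.full((256, 256), 255, dtype=np.uint8)
frames = list(unfold_expression(neutral, emotional))
print(len(frames))                        # about 31 frames for a 1,033-ms clip at 30 fps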
Affiliation(s)
- Manuel G. Calvo
- Department of Cognitive Psychology, Universidad de La Laguna, San Cristóbal de La Laguna, Spain
- Instituto Universitario de Neurociencia (IUNE), Universidad de La Laguna, Santa Cruz de Tenerife, Spain
- Guillermo Recio
- Institute of Psychology, Universität Hamburg, Hamburg, Germany
- Daniel Lundqvist
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
8. Coffman MC, Trubanova A, Richey JA, White SW, Kim-Spoon J, Ollendick TH, Pine DS. Validation of the NIMH-ChEFS adolescent face stimulus set in an adolescent, parent, and health professional sample. Int J Methods Psychiatr Res 2015; 24:275-86. PMID: 26359940; PMCID: PMC5103077; DOI: 10.1002/mpr.1490.
Abstract
Attention to faces is a fundamental psychological process in humans, with atypical attention to faces noted across several clinical disorders. Although many clinical disorders have their onset in adolescence, there is a lack of well-validated stimulus sets containing adolescent faces available for experimental use. Further, the images comprising most available sets are not controlled for high- and low-level visual properties. Here, we present a cross-site validation of the National Institute of Mental Health Child Emotional Faces Picture Set (NIMH-ChEFS), comprising 257 photographs of adolescent faces displaying angry, fearful, happy, sad, and neutral expressions. All of the direct facial images from the NIMH-ChEFS set were adjusted in terms of location of facial features and standardized for luminance, size, and smoothness. Although overall agreement between raters in this study and the original development-site raters was high (89.52%), this differed by group such that agreement was lower for adolescents relative to mental health professionals in the current study. These results suggest that future research using this face set or others of adolescent/child faces should base comparisons on similarly aged validation data.
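The headline figure is a percent-agreement statistic between the current raters and the development-site labels; a chance-corrected coefficient such as Cohen's kappa is a natural companion measure. A minimal sketch with made-up labels:

import numpy as np
from sklearn.metrics import cohen_kappa_score

original = np.array(["angry", "fearful", "happy", "sad", "neutral", "happy"])
new_raters = np.array(["angry", "fearful", "happy", "neutral", "neutral", "happy"])

percent_agreement = np.mean(original == new_raters) * 100      # raw agreement
kappa = cohen_kappa_score(original, new_raters)                # chance-corrected agreement
print(f"{percent_agreement:.2f}% agreement, kappa = {kappa:.2f}")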
Affiliation(s)
- Marika C Coffman
- Department of Psychology, Virginia Polytechnic Institute and State University, Blacksburg, VA, USA
- Andrea Trubanova
- Department of Psychology, Virginia Polytechnic Institute and State University, Blacksburg, VA, USA
- J Anthony Richey
- Department of Psychology, Virginia Polytechnic Institute and State University, Blacksburg, VA, USA
- Susan W White
- Department of Psychology, Virginia Polytechnic Institute and State University, Blacksburg, VA, USA
- Jungmeen Kim-Spoon
- Department of Psychology, Virginia Polytechnic Institute and State University, Blacksburg, VA, USA
- Thomas H Ollendick
- Department of Psychology, Virginia Polytechnic Institute and State University, Blacksburg, VA, USA
- Daniel S Pine
- Section on Development and Affective Neuroscience, Mood, and Anxiety Programs, National Institutes of Mental Health Intramural Research Program, Bethesda, MD, USA
9. Rolison MJ, Naples AJ, McPartland JC. Interactive social neuroscience to study autism spectrum disorder. Yale J Biol Med 2015; 88:17-24. PMID: 25745371; PMCID: PMC4345534.
Abstract
Individuals with autism spectrum disorder (ASD) demonstrate difficulty with social interactions and relationships, but the neural mechanisms underlying these difficulties remain largely unknown. While social difficulties in ASD are most apparent in the context of interactions with other people, most neuroscience research investigating ASD has provided limited insight into the complex dynamics of these interactions. The development of novel, innovative "interactive social neuroscience" methods to study the brain in contexts with two interacting humans is a necessary advance for ASD research. Studies applying an interactive neuroscience approach to study two brains engaging with one another have revealed significant differences in neural processes during interaction compared to observation in brain regions that are implicated in the neuropathology of ASD. Interactive social neuroscience methods are crucial in clarifying the mechanisms underlying the social and communication deficits that characterize ASD.
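As a generic illustration of the kind of measure used in two-person ("hyperscanning") studies, and not a method taken from this review, inter-brain coupling can be estimated as the correlation between simultaneously recorded signals from two interacting participants:

import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
shared = rng.normal(size=600)                        # hypothetical shared interaction dynamics
brain_a = shared + rng.normal(scale=1.0, size=600)   # participant A band-power time series
brain_b = shared + rng.normal(scale=1.0, size=600)   # participant B band-power time series

r, p = pearsonr(brain_a, brain_b)                    # inter-brain coupling estimate
print(f"inter-brain correlation r = {r:.2f} (p = {p:.3g})")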
Affiliation(s)
- James C. McPartland
- Yale Child Study Center, New Haven, Connecticut. Correspondence: James C. McPartland, PhD, Yale Child Study Center, 230 South Frontage Road, New Haven, CT 06520; Tel: 203-785-7179; Fax: 203-737-4197