1
Wang Y, Cao R, Chakravarthula PN, Yu H, Wang S. Atypical neural encoding of faces in individuals with autism spectrum disorder. Cereb Cortex 2024; 34:172-186. PMID: 38696606; PMCID: PMC11065108; DOI: 10.1093/cercor/bhae060.
Abstract
Individuals with autism spectrum disorder (ASD) experience pervasive difficulties in processing social information from faces. However, the behavioral and neural mechanisms underlying social trait judgments of faces in ASD remain largely unclear. Here, we comprehensively addressed this question by employing functional neuroimaging and parametrically generated faces that vary in facial trustworthiness and dominance. Behaviorally, participants with ASD exhibited reduced specificity but increased inter-rater variability in social trait judgments. Neurally, participants with ASD showed hypo-activation across broad face-processing areas. Multivariate analysis based on trial-by-trial face responses could discriminate participant groups in the majority of the face-processing areas. Encoding social traits in ASD engaged vastly different face-processing areas compared to controls, and encoding different social traits engaged different brain areas. Interestingly, the idiosyncratic brain areas encoding social traits in ASD were still flexible and context-dependent, similar to neurotypicals. Additionally, participants with ASD also showed an altered encoding of facial saliency features in the eyes and mouth. Together, our results provide a comprehensive understanding of the neural mechanisms underlying social trait judgments in ASD.
Affiliation(s)
- Yue Wang
- Department of Radiology, Washington University in St. Louis, 4525 Scott Ave, St. Louis, MO 63110, United States
- Runnan Cao
- Department of Radiology, Washington University in St. Louis, 4525 Scott Ave, St. Louis, MO 63110, United States
- Puneeth N Chakravarthula
- Department of Radiology, Washington University in St. Louis, 4525 Scott Ave, St. Louis, MO 63110, United States
- Hongbo Yu
- Department of Psychological & Brain Sciences, University of California Santa Barbara, Santa Barbara, CA 93106, United States
- Shuo Wang
- Department of Radiology, Washington University in St. Louis, 4525 Scott Ave, St. Louis, MO 63110, United States
2
Meng F, Li F, Wu S, Yang T, Xiao Z, Zhang Y, Liu Z, Lu J, Luo X. Machine learning-based early diagnosis of autism according to eye movements of real and artificial faces scanning. Front Neurosci 2023; 17:1170951. PMID: 37795184; PMCID: PMC10545898; DOI: 10.3389/fnins.2023.1170951.
Abstract
Background: Studies on eye movements found that children with autism spectrum disorder (ASD) had abnormal gaze behavior to social stimuli. The current study aimed to investigate whether their eye movement patterns in relation to cartoon characters or real people could be useful in identifying ASD children.
Methods: Eye-tracking tests based on videos of cartoon characters and real people were performed for ASD and typically developing (TD) children aged between 12 and 60 months. A three-level hierarchical structure including participants, events, and areas of interest was used to arrange the data obtained from eye-tracking tests. Random forest was adopted as the feature selection tool and classifier, and the flattened vectors and diagnostic information were used as features and labels. A logistic regression was used to evaluate the impact of the most important features.
Results: A total of 161 children (117 ASD and 44 TD) with a mean age of 39.70 ± 12.27 months were recruited. The overall accuracy, precision, and recall of the model were 0.73, 0.73, and 0.75, respectively. Attention to human-related elements was positively related to the diagnosis of ASD, while fixation time for cartoons was negatively related to the diagnosis.
Conclusion: Using eye-tracking techniques with machine learning algorithms might be promising for identifying ASD. The value of artificial faces, such as cartoon characters, in the field of ASD diagnosis and intervention is worth further exploring.
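The three-level data arrangement described in this abstract (participant → event → area of interest) amounts to flattening each participant's nested eye-tracking record into a fixed-length feature vector before feeding a classifier such as a random forest. A minimal sketch of that flattening step; the event and AOI names and fixation times below are hypothetical, not taken from the study:

```python
# Flatten a three-level eye-tracking record (event -> AOI -> fixation time)
# into a fixed-length feature vector. Events/AOIs a participant never
# visited become 0.0, so every participant yields a same-length vector.

def flatten_record(record, events, aois):
    """record: {event: {aoi: fixation_time}}; returns a flat list ordered
    by (event, aoi), suitable as one row of a classifier's feature matrix."""
    return [record.get(e, {}).get(a, 0.0) for e in events for a in aois]

# Hypothetical example: two events, two areas of interest.
events = ["cartoon_video", "real_person_video"]
aois = ["face", "body"]

participant = {"cartoon_video": {"face": 1.2, "body": 0.3},
               "real_person_video": {"face": 0.4}}

vec = flatten_record(participant, events, aois)
print(vec)  # [1.2, 0.3, 0.4, 0.0]
```

Rows built this way, paired with diagnostic labels, are what a random forest (or the follow-up logistic regression) would consume.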
Affiliation(s)
- Fanchao Meng
- The National Clinical Research Center for Mental Disorder & Beijing Key Laboratory of Mental Disorders, Beijing Anding Hospital, Capital Medical University, Beijing, China
- Department of Psychiatry, and National Clinical Research Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha, Hunan, China
- Fenghua Li
- Key Lab of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Shuxian Wu
- Department of Psychiatry, and National Clinical Research Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha, Hunan, China
- Tingyu Yang
- Department of Psychiatry, and National Clinical Research Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha, Hunan, China
- Zhou Xiao
- Department of Child Psychiatry, Kangning Hospital of Shenzhen, Shenzhen Mental Health Center, Shenzhen, Guangdong, China
- Yujian Zhang
- Sichuan Cancer Hospital & Institute, Sichuan Cancer Center, Chengdu, Sichuan, China
- Zhengkui Liu
- Key Lab of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Jianping Lu
- Department of Child Psychiatry, Kangning Hospital of Shenzhen, Shenzhen Mental Health Center, Shenzhen, Guangdong, China
- Xuerong Luo
- Department of Psychiatry, and National Clinical Research Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha, Hunan, China
3
Wang S, Li X. A revisit of the amygdala theory of autism: Twenty years after. Neuropsychologia 2023; 183:108519. PMID: 36803966; PMCID: PMC10824605; DOI: 10.1016/j.neuropsychologia.2023.108519.
Abstract
The human amygdala has long been thought to play a key role in autism spectrum disorder (ASD). Yet it remains unclear to what extent the amygdala accounts for the social dysfunctions in ASD. Here, we review studies that investigate the relationship between amygdala function and ASD. We focus on studies that employ the same task and stimuli to directly compare people with ASD and patients with focal amygdala lesions, and we also discuss functional data associated with these studies. We show that the amygdala can only account for a limited number of deficits in ASD (primarily face perception tasks but not social attention tasks); a network view is therefore more appropriate. We next discuss atypical brain connectivity in ASD, factors that can explain such atypical brain connectivity, and novel tools to analyze brain connectivity. Lastly, we discuss new opportunities from multimodal neuroimaging with data fusion and human single-neuron recordings that can enable us to better understand the neural underpinnings of social dysfunctions in ASD. Together, the influential amygdala theory of autism should be extended to a broader framework that considers brain connectivity at the global scale, informed by emerging data-driven approaches such as machine learning-based surrogate models.
Affiliation(s)
- Shuo Wang
- Department of Radiology, Washington University in St. Louis, St. Louis, MO 63110, USA; Lane Department of Computer Science and Electrical Engineering, West Virginia University, Morgantown, WV 26506, USA
- Xin Li
- Lane Department of Computer Science and Electrical Engineering, West Virginia University, Morgantown, WV 26506, USA
4
Tang WYF. Application of Eye Tracker to Detect Visual Processing of Children with Autism Spectrum Disorder. Curr Dev Disord Rep 2022. DOI: 10.1007/s40474-022-00252-6.
5
Amaral DG, Nordahl CW. Amygdala Involvement in Autism: Early Postnatal Changes, But What Are the Behavioral Consequences? Am J Psychiatry 2022; 179:522-524. PMID: 35921392; DOI: 10.1176/appi.ajp.20220509.
Affiliation(s)
- David G Amaral
- Department of Psychiatry and Behavioral Sciences, the MIND Institute, University of California, Davis, Sacramento
- Christine Wu Nordahl
- Department of Psychiatry and Behavioral Sciences, the MIND Institute, University of California, Davis, Sacramento
6
Ren X, Duan H, Min X, Zhu Y, Shen W, Wang L, Shi F, Fan L, Yang X, Zhai G. Where are the Children with Autism Looking in Reality? Artif Intell 2022. DOI: 10.1007/978-3-031-20500-2_48.
7
Cao R, Li X, Brandmeir NJ, Wang S. Encoding of facial features by single neurons in the human amygdala and hippocampus. Commun Biol 2021; 4:1394. PMID: 34907323; PMCID: PMC8671411; DOI: 10.1038/s42003-021-02917-1.
Abstract
Faces are salient social stimuli that attract a stereotypical pattern of eye movement. The human amygdala and hippocampus are involved in various aspects of face processing; however, it remains unclear how they encode the content of fixations when viewing faces. To answer this question, we employed single-neuron recordings with simultaneous eye tracking when participants viewed natural face stimuli. We found a class of neurons in the human amygdala and hippocampus that encoded salient facial features such as the eyes and mouth. With a control experiment using non-face stimuli, we further showed that feature selectivity was specific to faces. We also found another population of neurons that differentiated saccades to the eyes vs. the mouth. Population decoding confirmed our results and further revealed the temporal dynamics of face feature coding. Interestingly, we found that the amygdala and hippocampus played different roles in encoding facial features. Lastly, we revealed two functional roles of feature-selective neurons: (1) they encoded the salient region for face recognition, and (2) they were related to perceived social trait judgments. Together, our results link eye movement with neural face processing and provide important mechanistic insights into human face perception.
Affiliation(s)
- Runnan Cao
- Lane Department of Computer Science and Electrical Engineering, West Virginia University, Morgantown, WV, 26506, USA
- Xin Li
- Lane Department of Computer Science and Electrical Engineering, West Virginia University, Morgantown, WV, 26506, USA
- Nicholas J Brandmeir
- Department of Neurosurgery, West Virginia University, Morgantown, WV, 26506, USA
- Shuo Wang
- Lane Department of Computer Science and Electrical Engineering, West Virginia University, Morgantown, WV, 26506, USA
- Department of Radiology, Washington University in St. Louis, St. Louis, MO, 63110, USA
8
Liu W, Li M, Zou X, Raj B. Discriminative Dictionary Learning for Autism Spectrum Disorder Identification. Front Comput Neurosci 2021; 15:662401. PMID: 34819846; PMCID: PMC8606656; DOI: 10.3389/fncom.2021.662401.
Abstract
Autism Spectrum Disorder (ASD) is a group of lifelong neurodevelopmental disorders with complicated causes. A key symptom in ASD is impaired interpersonal communication. Recent studies show that the face scanning patterns of individuals with ASD often differ from those of typically developing (TD) individuals. This abnormality motivates us to study the feasibility of identifying ASD children based on their face scanning patterns with machine learning methods. In this paper, we consider using the bag-of-words (BoW) model to encode face scanning patterns, and propose a novel dictionary learning method based on dual mode seeking for better BoW representation. Unlike k-means, which is broadly used in conventional BoW models to learn dictionaries, the proposed method captures discriminative information by finding atoms that maximize both the purity and coverage of belonging samples within one class. Compared to the rich literature of ASD studies in psychology and neuroscience, our work marks one of the relatively few attempts to directly identify high-functioning ASD children with machine learning methods. Experiments demonstrate the superior performance of our method, with considerable gains over several baselines. Although the proposed work is too preliminary to directly replace existing autism diagnostic observation schedules in clinical practice, it sheds light on future applications of machine learning methods in early screening of ASD.
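The bag-of-words step this abstract builds on can be sketched as: assign each fixation to its nearest dictionary atom and accumulate a normalized histogram, which becomes the scanpath's feature vector. A minimal illustration; the atom positions and fixations below are hypothetical, and the paper's dual-mode-seeking atom selection is not reproduced here:

```python
import math

def bow_encode(fixations, atoms):
    """Encode a scanpath as a bag-of-words histogram: each (x, y) fixation
    votes for its nearest atom (Euclidean distance); the normalized counts
    form the feature vector fed to a downstream classifier."""
    hist = [0] * len(atoms)
    for fx, fy in fixations:
        nearest = min(range(len(atoms)),
                      key=lambda i: math.dist((fx, fy), atoms[i]))
        hist[nearest] += 1
    total = sum(hist)
    return [h / total for h in hist] if total else hist

# Hypothetical atoms near the left eye, right eye, and mouth of a
# normalized face, plus a short hypothetical scanpath.
atoms = [(0.3, 0.4), (0.7, 0.4), (0.5, 0.8)]
scan = [(0.31, 0.38), (0.68, 0.41), (0.52, 0.79), (0.50, 0.82)]
print(bow_encode(scan, atoms))  # [0.25, 0.25, 0.5]
```

The point of the paper's contribution is precisely how the atoms are chosen; k-means centroids are the conventional baseline this sketch stands in for.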
Affiliation(s)
- Wenbo Liu
- Department of Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh, PA, United States
- School of Electronics and Information Technology, Sun Yat-sen University, Guangzhou, China
- Ming Li
- Data Science Research Center, Duke Kunshan University, Suzhou, China
- School of Computer Science, Wuhan University, Wuhan, China
- Xiaobing Zou
- The Third Affiliated Hospital, Sun Yat-sen University, Guangzhou, China
- Bhiksha Raj
- Department of Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh, PA, United States
- Language Technologies Institute, Carnegie Mellon University, Pittsburgh, PA, United States
9
Intranasal vasopressin like oxytocin increases social attention by influencing top-down control, but additionally enhances bottom-up control. Psychoneuroendocrinology 2021; 133:105412. PMID: 34537624; DOI: 10.1016/j.psyneuen.2021.105412.
Abstract
The respective roles of the neuropeptides arginine vasopressin (AVP) and oxytocin (OXT) in modulating social cognition and for therapeutic intervention in autism spectrum disorder have not been fully established. In particular, while numerous studies have demonstrated effects of oxytocin in promoting social attention, the role of AVP has not been examined. The present study employed a randomized, double-blind, placebo (PLC)-controlled between-subject design to explore the social- and emotion-specific effects of AVP on both bottom-up and top-down attention processing with a validated emotional anti-saccade eye-tracking paradigm in 80 healthy male subjects (PLC = 40, AVP = 40). Our findings showed that AVP increased the error rate for social (angry, fearful, happy, neutral and sad faces) but not non-social (oval shapes) stimuli during the anti-saccade condition and reduced error rates in the pro-saccade condition. Comparison of these findings with a previous study (sample size: PLC = 33, OXT = 33) using intranasal oxytocin revealed similar effects of the two peptides on anti-saccade errors, although with some difference in effects of specific face emotions, but a significantly greater effect of AVP on pro-saccades. Both peptides also produced a post-task anxiolytic effect by reducing state anxiety. Together these findings suggest that both AVP and OXT decrease goal-directed top-down attentional control of salient social stimuli, but that AVP more potently increases bottom-up social attentional processing.
10
Predicting atypical visual saliency for autism spectrum disorder via scale-adaptive inception module and discriminative region enhancement loss. Neurocomputing 2021. DOI: 10.1016/j.neucom.2020.06.125.
11
Tang WYF, Fong KNK, Chung RCK. The Effects of Storytelling With or Without Social Contextual Information Regarding Eye Gaze and Visual Attention in Children with Autistic Spectrum Disorder and Typical Development: A Randomized, Controlled Eye-Tracking Study. J Autism Dev Disord 2021; 52:1257-1267. PMID: 33909213; DOI: 10.1007/s10803-021-05012-w.
Abstract
This study examined the effects of storytelling with or without contextual information on children with autism spectrum disorder (ASD) and with typical development (TD) using an eye tracker. Participants were randomized into two groups, in which the stories either did or did not include social contextual information. Training was delivered in groups, with eight sessions across four weeks, 30 min/session. Participants' fixation duration, visit duration, and fixation count on human faces in 20 photos and a video were recorded. Our findings revealed that storytelling with social contextual information enhanced participants' gaze on eyes/faces in static stimuli (photos) for both children with ASD and TD, but the same advantage was not seen for children with ASD with dynamic stimuli (videos). Clinical Trial Registration Number (URL: http://www.clinicaltrials.gov ): NCT04587557.
Affiliation(s)
- Wilson Y F Tang
- Department of Rehabilitation Sciences, The Hong Kong Polytechnic University, Hong Kong, Hong Kong
- SAHK, Hong Kong, Hong Kong
- Kenneth N K Fong
- Department of Rehabilitation Sciences, The Hong Kong Polytechnic University, Hong Kong, Hong Kong
- Raymond C K Chung
- Department of Rehabilitation Sciences, The Hong Kong Polytechnic University, Hong Kong, Hong Kong
12
Hedger N, Dubey I, Chakrabarti B. Social orienting and social seeking behaviors in ASD. A meta analytic investigation. Neurosci Biobehav Rev 2020; 119:376-395. PMID: 33069686; DOI: 10.1016/j.neubiorev.2020.10.003.
Abstract
Social motivation accounts of autism spectrum disorder (ASD) posit that individuals with ASD find social stimuli less rewarding than neurotypical (NT) individuals. Behaviorally, this is proposed to manifest in reduced social orienting (individuals with ASD direct less attention towards social stimuli) and reduced social seeking (individuals with ASD invest less effort to receive social stimuli). In two meta-analyses, involving data from over 6000 participants, we review the available behavioral studies that assess social orienting and social seeking behaviors in ASD. We found robust evidence for reduced social orienting in ASD, across a range of paradigms, demographic variables and stimulus contexts. The most robust predictor of this effect was interactive content - effects were larger when the stimulus involved an interaction between people. By contrast, the evidence for reduced social seeking indicated weaker evidence for group differences, observed only under specific experimental conditions. The insights gained from this meta-analysis can inform design of relevant task measures for social reward responsivity and promote directions for further study on the ASD phenotype.
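Meta-analytic effects like those summarized above are conventionally pooled by inverse-variance weighting of per-study effect sizes. A minimal fixed-effect sketch (random-effects models add a between-study variance term not shown here); the study values below are hypothetical, not taken from these meta-analyses:

```python
def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling: weight each study's effect
    size by 1/variance and return the weighted mean and its variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)
    return pooled, pooled_var

# Hypothetical standardized mean differences from three studies.
effects = [0.50, 0.30, 0.70]
variances = [0.04, 0.02, 0.08]
est, var = pooled_effect(effects, variances)
print(round(est, 3), round(var, 3))  # 0.414 0.011
```

More precise studies (smaller variances) pull the pooled estimate toward their own effect, which is why the result sits closer to 0.30 than a plain average would.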
Affiliation(s)
- Nicholas Hedger
- Centre for Autism, School of Psychology and Clinical Language Sciences, University of Reading, Reading, RG6 6AL, UK
- Indu Dubey
- School of Applied Social Sciences, De Montfort University, The Gateway, Leicester, LE1 9BH, UK
- Bhismadev Chakrabarti
- Centre for Autism, School of Psychology and Clinical Language Sciences, University of Reading, Reading, RG6 6AL, UK
13
Ruan M, Webster PJ, Li X, Wang S. Deep Neural Network Reveals the World of Autism From a First-Person Perspective. Autism Res 2020; 14:333-342. PMID: 32869953; DOI: 10.1002/aur.2376.
Abstract
People with autism spectrum disorder (ASD) show atypical attention to social stimuli and aberrant gaze when viewing images of the physical world. However, it is unknown how they perceive the world from a first-person perspective. In this study, we used machine learning to classify photos taken in three different categories (people, indoors, and outdoors) as either having been taken by individuals with ASD or by peers without ASD. Our classifier effectively discriminated photos from all three categories, but was particularly successful at classifying photos of people with >80% accuracy. Importantly, visualization of our model revealed critical features that led to successful discrimination and showed that our model adopted a strategy similar to that of ASD experts. Furthermore, for the first time we showed that photos taken by individuals with ASD contained less salient objects, especially in the central visual field. Notably, our model outperformed classification of these photos by ASD experts. Together, we demonstrate an effective and novel method that is capable of discerning photos taken by individuals with ASD and revealing aberrant visual attention in ASD from a unique first-person perspective. Our method may in turn provide an objective measure for evaluations of individuals with ASD.
LAY SUMMARY: People with autism spectrum disorder (ASD) demonstrate atypical visual attention to social stimuli. However, it remains largely unclear how they perceive the world from a first-person perspective. In this study, we employed a deep learning approach to analyze a unique dataset of photos taken by people with and without ASD. Our computer modeling was not only able to discern which photos were taken by individuals with ASD, outperforming ASD experts, but importantly, it revealed critical features that led to successful discrimination, revealing aspects of atypical visual attention in ASD from their first-person perspective.
Affiliation(s)
- Mindi Ruan
- Lane Department of Computer Science and Electrical Engineering, West Virginia University, Morgantown, West Virginia, USA
- Paula J Webster
- Department of Chemical and Biomedical Engineering and Rockefeller Neuroscience Institute, West Virginia University, Morgantown, West Virginia, USA
- Xin Li
- Lane Department of Computer Science and Electrical Engineering, West Virginia University, Morgantown, West Virginia, USA
- Shuo Wang
- Department of Chemical and Biomedical Engineering and Rockefeller Neuroscience Institute, West Virginia University, Morgantown, West Virginia, USA
14
Wang S, Mamelak AN, Adolphs R, Rutishauser U. Abstract goal representation in visual search by neurons in the human pre-supplementary motor area. Brain 2019; 142:3530-3549. PMID: 31549164; PMCID: PMC6821249; DOI: 10.1093/brain/awz279.
Abstract
The medial frontal cortex is important for goal-directed behaviours such as visual search. The pre-supplementary motor area (pre-SMA) plays a critical role in linking higher-level goals to actions, but little is known about the responses of individual cells in this area in humans. Pre-SMA dysfunction is thought to be a critical factor in the cognitive deficits that are observed in diseases such as Parkinson's disease and schizophrenia, making it important to develop a better mechanistic understanding of the pre-SMA's role in cognition. We simultaneously recorded single neurons in the human pre-SMA and eye movements while subjects performed goal-directed visual search tasks. We characterized two groups of neurons in the pre-SMA. First, 40% of neurons changed their firing rate whenever a fixation landed on the search target. These neurons responded to targets in an abstract manner across several conditions and tasks. Responses were invariant to motor output (i.e. button press or not), and to different ways of defining the search target (by instruction or pop-out). Second, ∼50% of neurons changed their response as a function of fixation order. Together, our results show that human pre-SMA neurons carry abstract signals during visual search that indicate whether a goal was reached in an action- and cue-independent manner. This suggests that the pre-SMA contributes to goal-directed behaviour by flexibly signalling goal detection and time elapsed since start of the search, and this process occurs regardless of task. These observations provide insights into how pre-SMA dysfunction might impact cognitive function.
Affiliation(s)
- Shuo Wang
- Department of Chemical and Biomedical Engineering, and Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV, USA
- Division of Biology and Biological Engineering, California Institute of Technology, Pasadena, CA, USA
- Adam N Mamelak
- Department of Neurosurgery, Cedars-Sinai Medical Center, Los Angeles, CA, USA
- Ralph Adolphs
- Division of Biology and Biological Engineering, California Institute of Technology, Pasadena, CA, USA
- Ueli Rutishauser
- Division of Biology and Biological Engineering, California Institute of Technology, Pasadena, CA, USA
- Department of Neurosurgery, Cedars-Sinai Medical Center, Los Angeles, CA, USA
15
Oxytocin reduces top-down control of attention by increasing bottom-up attention allocation to social but not non-social stimuli - A randomized controlled trial. Psychoneuroendocrinology 2019; 108:62-69. PMID: 31229634; DOI: 10.1016/j.psyneuen.2019.06.004.
Abstract
The neuropeptide oxytocin (OXT) may facilitate attention to social stimuli by influencing early-stage bottom-up processing, although findings in relation to different emotional expressions are inconsistent and its influence on top-down cognitive processing mechanisms is unclear. In the current double-blind, placebo (PLC)-controlled, between-subject study, we therefore recruited 71 male subjects (OXT = 34, PLC = 37) to investigate the effects of intranasal OXT (24 IU) on both bottom-up attention allocation and top-down attention inhibition using a prosaccade and antisaccade paradigm incorporating social (neutral, happy, fearful, sad, angry faces) and non-social (oval shape) visual stimuli with concurrent eye movement acquisition. Results revealed a marginally significant interaction between treatment, condition, and task (p = 0.054), with Bonferroni-corrected post-hoc tests indicating that OXT specifically increased antisaccade errors for social stimuli (ps < 0.04, effect sizes 0.46-0.88), but not non-social stimuli. Antisaccades are under volitional control, so this may indicate that OXT treatment reduced top-down inhibition. However, the overall findings are consistent with OXT acting to reduce top-down control of attention by increasing bottom-up early attentional processing of social, but not non-social, stimuli in situations where the two systems are in potential conflict. Marked deficits in bottom-up attention allocation to social stimuli have been reported in autism spectrum disorder; within this context, OXT may have the potential to increase early attention allocation towards social cues.
16
Król ME, Król M. Scanpath similarity measure reveals not only a decreased social preference, but also an increased nonsocial preference in individuals with autism. Autism 2019; 24:374-386. DOI: 10.1177/1362361319865809.
Abstract
We compared scanpath similarity in response to repeated presentations of social and nonsocial images representing natural scenes in a sample of 30 participants with autism spectrum disorder and 32 matched typically developing individuals. We used scanpath similarity (calculated using ScanMatch) as a novel measure of attentional bias or preference, which constrains eye-movement patterns by directing attention to specific visual or semantic features of the image. We found that, compared with the control group, scanpath similarity of participants with autism was significantly higher in response to nonsocial images, and significantly lower in response to social images. Moreover, scanpaths of participants with autism were more similar to scanpaths of other participants with autism in response to nonsocial images, and less similar in response to social images. Finally, we also found that in response to nonsocial images, scanpath similarity of participants with autism did not decline with stimulus repetition to the same extent as in the control group, which suggests more perseverative attention in the autism spectrum disorder group. These results show a preferential fixation on certain elements of social stimuli in typically developing individuals compared with individuals with autism, and on certain elements of nonsocial stimuli in the autism spectrum disorder group, compared with the typically developing group.
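ScanMatch, the measure used above, is based on Needleman-Wunsch global alignment of scanpaths coded as sequences of area-of-interest (AOI) labels. A minimal sketch with a flat match/mismatch/gap scheme; ScanMatch itself adds a spatially informed substitution matrix and temporal binning, and the AOI codes below are hypothetical:

```python
def scanpath_similarity(a, b, match=1.0, mismatch=-1.0, gap=-1.0):
    """Needleman-Wunsch global alignment of two AOI-coded scanpaths,
    normalized by the best possible score so 1.0 means identical."""
    n, m = len(a), len(b)
    # score[i][j] = best alignment score of a[:i] versus b[:j].
    score = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[n][m] / (max(n, m) * match)

# AOI codes: E = eyes, M = mouth, B = background.
print(scanpath_similarity("EEMB", "EEMB"))  # 1.0
print(scanpath_similarity("EEMB", "BMEE") < 1.0)  # True
```

Averaging such pairwise scores within and between groups is one way to obtain the within-group similarity comparisons the study reports.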
Affiliation(s)
- Michał Król
- School of Social Sciences, The University of Manchester, UK
17
Wang S, Chandravadia N, Mamelak AN, Rutishauser U. Simultaneous Eye Tracking and Single-Neuron Recordings in Human Epilepsy Patients. J Vis Exp 2019. PMID: 31259902; DOI: 10.3791/59117.
Abstract
Intracranial recordings from patients with intractable epilepsy provide a unique opportunity to study the activity of individual human neurons during active behavior. Eye tracking is an important tool for quantifying behavior and is indispensable for studying visual attention. However, eye tracking is challenging to use concurrently with invasive electrophysiology, and this approach has consequently seen little use. Here, we present a proven experimental protocol for conducting single-neuron recordings with simultaneous eye tracking in humans. We describe how the systems are connected and the optimal settings for recording neurons and eye movements. To illustrate the utility of this method, we summarize results that were made possible by this setup. These data show how using eye tracking in a memory-guided visual search task allowed us to describe a new class of neurons called target neurons, whose response reflected top-down attention to the current search target. Lastly, we discuss the significance of this setup and solutions to its potential problems. Together, our protocol and results suggest that single-neuron recordings with simultaneous eye tracking in humans are an effective method for studying human brain function. It provides a key missing link between animal neurophysiology and human cognitive neuroscience.
Affiliation(s)
- Shuo Wang
- Department of Chemical and Biomedical Engineering, and Rockefeller Neuroscience Institute, West Virginia University
- Nand Chandravadia
- Departments of Neurosurgery and Neurology, Cedars-Sinai Medical Center
- Adam N Mamelak
- Departments of Neurosurgery and Neurology, Cedars-Sinai Medical Center
- Ueli Rutishauser
- Departments of Neurosurgery and Neurology, Cedars-Sinai Medical Center; Center for Neural Science and Medicine, Department of Biomedical Sciences, Cedars-Sinai Medical Center; Division of Biology and Biological Engineering, California Institute of Technology
18
Waytz A, Cacioppo JT, Hurlemann R, Castelli F, Adolphs R, Paul LK. Anthropomorphizing without Social Cues Requires the Basolateral Amygdala. J Cogn Neurosci 2019; 31:482-496. [DOI: 10.1162/jocn_a_01365]
Abstract
Anthropomorphism, the attribution of distinctively human mental characteristics to nonhuman animals and objects, illustrates the human propensity for extending social cognition beyond typical social targets. Yet, its processing components remain challenging to study because they are typically all engaged simultaneously. Across one pilot study and one focal study, we tested three rare people with basolateral amygdala lesions to dissociate two specific processing components: those triggered by attention to social cues (e.g., seeing a face) and those triggered by endogenous semantic knowledge (e.g., imbuing a machine with animacy). A pilot study demonstrated that, like neurologically intact control group participants, the three amygdala-damaged participants produced anthropomorphic descriptions for highly socially salient stimuli but not for stimuli lacking clear social cues. A focal study found that the three amygdala participants could anthropomorphize animate and living entities normally, but anthropomorphized inanimate stimuli less than control participants. Our findings suggest that the amygdala contributes to how we anthropomorphize stimuli that are not explicitly social.
19
Gray KLH, Haffey A, Mihaylova HL, Chakrabarti B. Lack of Privileged Access to Awareness for Rewarding Social Scenes in Autism Spectrum Disorder. J Autism Dev Disord 2018; 48:3311-3318. [PMID: 29728947] [PMCID: PMC6153919] [DOI: 10.1007/s10803-018-3595-9]
Abstract
Reduced social motivation is hypothesised to underlie social behavioural symptoms of Autism Spectrum Disorder (ASD). The extent to which rewarding social stimuli are granted privileged access to awareness in ASD is currently unknown. We use continuous flash suppression to investigate whether individuals with and without ASD show privileged access to awareness for social over nonsocial rewarding scenes that are closely matched for stimulus features. Strong evidence for a privileged access to awareness for rewarding social over nonsocial scenes was observed in neurotypical adults. No such privileged access was seen in ASD individuals, and moderate support for the null model was noted. These results suggest that the purported deficits in social motivation in ASD may extend to early processing mechanisms.
Affiliation(s)
- Katie L H Gray
- Centre for Autism, School of Psychology and Clinical Language Sciences, University of Reading, Reading, RG6 6AL, UK
- Anthony Haffey
- Centre for Autism, School of Psychology and Clinical Language Sciences, University of Reading, Reading, RG6 6AL, UK
- Hristina L Mihaylova
- Centre for Autism, School of Psychology and Clinical Language Sciences, University of Reading, Reading, RG6 6AL, UK
- Bhismadev Chakrabarti
- Centre for Autism, School of Psychology and Clinical Language Sciences, University of Reading, Reading, RG6 6AL, UK
20
Wang S, Mamelak AN, Adolphs R, Rutishauser U. Encoding of Target Detection during Visual Search by Single Neurons in the Human Brain. Curr Biol 2018; 28:2058-2069.e4. [PMID: 29910078] [DOI: 10.1016/j.cub.2018.04.092]
Abstract
Neurons in the primate medial temporal lobe (MTL) respond selectively to visual categories such as faces, contributing to how the brain represents stimulus meaning. However, it remains unknown whether MTL neurons continue to encode stimulus meaning when it changes flexibly as a function of variable task demands imposed by goal-directed behavior. While classically associated with long-term memory, recent lesion and neuroimaging studies show that the MTL also contributes critically to the online guidance of goal-directed behaviors such as visual search. Do such tasks modulate responses of neurons in the MTL, and if so, do their responses mirror bottom-up input from visual cortices or do they reflect more abstract goal-directed properties? To answer these questions, we performed concurrent recordings of eye movements and single neurons in the MTL and medial frontal cortex (MFC) in human neurosurgical patients performing a memory-guided visual search task. We identified a distinct population of target-selective neurons in both the MTL and MFC whose response signaled whether the currently fixated stimulus was a target or distractor. This target-selective response was invariant to visual category and predicted whether a target was detected or missed behaviorally during a given fixation. The response latencies, relative to fixation onset, of MFC target-selective neurons preceded those in the MTL by ∼200 ms, suggesting a frontal origin for the target signal. The human MTL thus represents not only fixed stimulus identity, but also task-specified stimulus relevance due to top-down goal relevance.
Affiliation(s)
- Shuo Wang
- Department of Chemical and Biomedical Engineering, and Rockefeller Neuroscience Institute, West Virginia University, 1 Medical Center Dr, Morgantown, WV 26506, USA; Computation and Neural Systems, California Institute of Technology, 1200 E California Blvd, Pasadena, CA 91125, USA.
- Adam N Mamelak
- Departments of Neurosurgery and Neurology, Cedars-Sinai Medical Center, 8700 Beverly Blvd, Los Angeles, CA 90048, USA
- Ralph Adolphs
- Computation and Neural Systems, California Institute of Technology, 1200 E California Blvd, Pasadena, CA 91125, USA; Division of Biology and Biological Engineering, California Institute of Technology, 1200 E California Blvd, Pasadena, CA 91125, USA
- Ueli Rutishauser
- Departments of Neurosurgery and Neurology, Cedars-Sinai Medical Center, 8700 Beverly Blvd, Los Angeles, CA 90048, USA; Center for Neural Science and Medicine, Department of Biomedical Sciences, Cedars-Sinai Medical Center, 8700 Beverly Blvd, Los Angeles, CA 90048, USA; Division of Biology and Biological Engineering, California Institute of Technology, 1200 E California Blvd, Pasadena, CA 91125, USA
21
Casellato C, Gandolla M, Crippa A, Pedrocchi A. Robotic set-up to quantify hand-eye behavior in motor execution and learning of children with autism spectrum disorder. IEEE Int Conf Rehabil Robot 2017; 2017:953-958. [PMID: 28813944] [DOI: 10.1109/icorr.2017.8009372]
Abstract
Autism spectrum disorder (ASD) is a multifaceted neurodevelopmental disorder characterized by persistent social and communication impairments and by restricted and repetitive behaviors. Motor disorders have also been described, but they have rarely been assessed objectively. Most studies have shown inefficient eye-hand coordination and motor learning in children with ASD; in other experiments, the mechanisms for acquiring internal models of self-generated movements appeared normal in autism. In this framework, we developed a robotic protocol that records gaze and hand data during upper-limb tasks in which a haptic pen-like handle is moved along specific trajectories displayed on the screen. The protocol includes trials of reaching under a perturbing force field and of catching moving targets, with or without visual availability of the whole path. We tested 16 typically developing school-age children and one child with ASD as a case study, evaluating speed-accuracy tradeoff, motor performance, and gaze-hand spatial coordination. Compared with his typically developing peers, the child with ASD showed intact but delayed learning in the force-field sequence, with more variable gaze-hand patterns. In the catching trials, he showed less efficient movements but an intact capability to exploit the available a priori plan. The proposed protocol represents a powerful, easily tunable tool for quantitative (longitudinal) assessment and for subject-tailored training in ASD.
22
Wang S, Adolphs R. Reduced specificity in emotion judgment in people with autism spectrum disorder. Neuropsychologia 2017; 99:286-295. [PMID: 28343960] [DOI: 10.1016/j.neuropsychologia.2017.03.024]
Abstract
There is a conflicting literature on facial emotion processing in autism spectrum disorder (ASD): both typical and atypical performance have been reported, and inconsistencies in the literature may stem from different processes examined (emotion judgment, face perception, fixations) as well as differences in participant populations. Here we conducted a detailed investigation of the ability to discriminate graded emotions shown in morphs of fear-happy faces, in a well-characterized high-functioning sample of participants with ASD and matched controls. Signal detection approaches were used in the analyses, and concurrent high-resolution eye-tracking was collected. Although people with ASD had typical thresholds for categorical fear and confidence judgments, their psychometric specificity to detect emotions across the entire range of intensities was reduced. However, fixation patterns onto the stimuli were typical and could not account for the reduced specificity of emotion judgment. Together, our results argue for a subtle and specific deficit in emotion perception in ASD that, from a signal detection perspective, is best understood as a reduced specificity due to increased noise in central processing of the face stimuli.
Affiliation(s)
- Shuo Wang
- Humanities and Social Sciences, California Institute of Technology, Pasadena, CA 91125, USA; Princeton Neuroscience Institute, Princeton University, Princeton, NJ 08540, USA.
- Ralph Adolphs
- Humanities and Social Sciences, California Institute of Technology, Pasadena, CA 91125, USA
23
Chita-Tegmark M. Social attention in ASD: A review and meta-analysis of eye-tracking studies. Res Dev Disabil 2016; 48:79-93. [PMID: 26547134] [DOI: 10.1016/j.ridd.2015.10.011]
Abstract
Determining whether social attention is reduced in Autism Spectrum Disorder (ASD), and what factors influence it, is important to our theoretical understanding of developmental trajectories of ASD and to designing targeted interventions. This meta-analysis examines data from 38 articles that used eye-tracking methods to compare individuals with ASD and typically developing (TD) controls. The impact of eight factors on the effect size of the group difference in social attention is evaluated: age, non-verbal IQ matching, verbal IQ matching, motion, social content, ecological validity, audio input, and attention bids. Results show that individuals with ASD spend less time attending to social stimuli than TD controls, with a mean effect size of 0.55. Social attention in ASD was most affected when stimuli had high social content (i.e., showed more than one person). This meta-analysis provides an opportunity to survey the eye-tracking research on social attention in ASD and to outline future research directions, specifically research on social attention in the context of stimuli with high social content.
Affiliation(s)
- Meia Chita-Tegmark
- Department of Psychological and Brain Sciences, Boston University, United States.
24
Wang S, Jiang M, Duchesne XM, Laugeson EA, Kennedy DP, Adolphs R, Zhao Q. Atypical Visual Saliency in Autism Spectrum Disorder Quantified through Model-Based Eye Tracking. Neuron 2015; 88:604-16. [PMID: 26593094] [DOI: 10.1016/j.neuron.2015.09.042]
Abstract
The social difficulties that are a hallmark of autism spectrum disorder (ASD) are thought to arise, at least in part, from atypical attention toward stimuli and their features. To investigate this hypothesis comprehensively, we characterized 700 complex natural scene images with a novel three-layered saliency model that incorporated pixel-level (e.g., contrast), object-level (e.g., shape), and semantic-level attributes (e.g., faces) on 5,551 annotated objects. Compared with matched controls, people with ASD had a stronger image center bias regardless of object distribution, reduced saliency for faces and for locations indicated by social gaze, and yet a general increase in pixel-level saliency at the expense of semantic-level saliency. These results were further corroborated by direct analysis of fixation characteristics and investigation of feature interactions. Our results for the first time quantify atypical visual attention in ASD across multiple levels and categories of objects.
Affiliation(s)
- Shuo Wang
- Computation and Neural Systems, California Institute of Technology, Pasadena, CA 91125, USA; Humanities and Social Sciences, California Institute of Technology, Pasadena, CA 91125, USA
- Ming Jiang
- Department of Electrical and Computer Engineering, National University of Singapore, Singapore 117583, Singapore
- Xavier Morin Duchesne
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405, USA
- Elizabeth A Laugeson
- Department of Psychiatry and PEERS Clinic, Semel Institute for Neuroscience and Human Behavior, University of California, Los Angeles, Los Angeles, CA 90024, USA
- Daniel P Kennedy
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405, USA
- Ralph Adolphs
- Computation and Neural Systems, California Institute of Technology, Pasadena, CA 91125, USA; Humanities and Social Sciences, California Institute of Technology, Pasadena, CA 91125, USA
- Qi Zhao
- Department of Electrical and Computer Engineering, National University of Singapore, Singapore 117583, Singapore
25
Chevallier C, Parish-Morris J, McVey A, Rump KM, Sasson NJ, Herrington JD, Schultz RT. Measuring social attention and motivation in autism spectrum disorder using eye-tracking: Stimulus type matters. Autism Res 2015; 8:620-8. [PMID: 26069030] [DOI: 10.1002/aur.1479]
Abstract
Autism Spectrum Disorder (ASD) is characterized by social impairments that have been related to deficits in social attention, including diminished gaze to faces. Eye-tracking studies are commonly used to examine social attention and social motivation in ASD, but they vary in sensitivity. In this study, we hypothesized that the ecological nature of the social stimuli would affect participants' social attention, with gaze behavior during more naturalistic scenes being most predictive of ASD vs. typical development. Eighty-one children with and without ASD participated in three eye-tracking tasks that differed in the ecological relevance of the social stimuli. In the "Static Visual Exploration" task, static images of objects and people were presented; in the "Dynamic Visual Exploration" task, video clips of individual faces and objects were presented side-by-side; in the "Interactive Visual Exploration" task, video clips of children playing with objects in a naturalistic context were presented. Our analyses uncovered a three-way interaction between Task, Social vs. Object Stimuli, and Diagnosis. This interaction was driven by group differences on one task only-the Interactive task. Bayesian analyses confirmed that the other two tasks were insensitive to group membership. In addition, receiver operating characteristic analyses demonstrated that, unlike the other two tasks, the Interactive task had significant classification power. The ecological relevance of social stimuli is an important factor to consider for eye-tracking studies aiming to measure social attention and motivation in ASD.
Affiliation(s)
- Coralie Chevallier
- Center for Autism Research, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania; Laboratoire de Neurosciences Cognitives, INSERM U960, DEC, Ecole Normale Supérieure, Paris, France
- Julia Parish-Morris
- Center for Autism Research, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
- Alana McVey
- Center for Autism Research, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
- Keiran M Rump
- Center for Autism Research, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
- Noah J Sasson
- School of Behavioral and Brain Sciences, The University of Texas at Dallas, Richardson, Texas
- John D Herrington
- Center for Autism Research, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania; Department of Pediatrics, University of Pennsylvania School of Medicine, Philadelphia, Pennsylvania
- Robert T Schultz
- Center for Autism Research, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania; Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania