1.
Jertberg RM, Begeer S, Geurts HM, Chakrabarti B, Van der Burg E. Age, not autism, influences multisensory integration of speech stimuli among adults in a McGurk/MacDonald paradigm. Eur J Neurosci 2024; 59:2979-2994. PMID: 38570828. DOI: 10.1111/ejn.16319.
Abstract
Differences between autistic and non-autistic individuals in perception of the temporal relationships between sights and sounds are theorized to underlie difficulties in integrating relevant sensory information. These, in turn, are thought to contribute to problems with speech perception and higher level social behaviour. However, the literature establishing this connection often involves limited sample sizes and focuses almost entirely on children. To determine whether these differences persist into adulthood, we compared 496 autistic and 373 non-autistic adults (aged 17 to 75 years). Participants completed an online version of the McGurk/MacDonald paradigm, a multisensory illusion indicative of the ability to integrate audiovisual speech stimuli. Audiovisual asynchrony was manipulated, and participants responded both to the syllable they perceived (revealing their susceptibility to the illusion) and to whether or not the audio and video were synchronized (allowing insight into temporal processing). In contrast with prior research with smaller, younger samples, we detected no evidence of impaired temporal or multisensory processing in autistic adults. Instead, we found that in both groups, multisensory integration correlated strongly with age. This contradicts prior presumptions that differences in multisensory perception persist and even increase in magnitude over the lifespan of autistic individuals. It also suggests that the compensatory role multisensory integration may play as the individual senses decline with age is intact. These findings challenge existing theories and provide an optimistic perspective on autistic development. They also underline the importance of expanding autism research to better reflect the age range of the autistic population.
Affiliation(s)
- Robert M Jertberg
  - Department of Clinical and Developmental Psychology, Vrije Universiteit Amsterdam, and Amsterdam Public Health Research Institute, Amsterdam, The Netherlands
- Sander Begeer
  - Department of Clinical and Developmental Psychology, Vrije Universiteit Amsterdam, and Amsterdam Public Health Research Institute, Amsterdam, The Netherlands
- Hilde M Geurts
  - Dutch Autism and ADHD Research Center (d'Arc), Brain & Cognition, Department of Psychology, Universiteit van Amsterdam, Amsterdam, The Netherlands
  - Leo Kannerhuis (Youz/Parnassiagroup), Den Haag, The Netherlands
- Bhismadev Chakrabarti
  - Centre for Autism, School of Psychology and Clinical Language Sciences, University of Reading, Reading, UK
  - India Autism Center, Kolkata, India
  - Department of Psychology, Ashoka University, Sonipat, India
- Erik Van der Burg
  - Dutch Autism and ADHD Research Center (d'Arc), Brain & Cognition, Department of Psychology, Universiteit van Amsterdam, Amsterdam, The Netherlands
2.
Ross LA, Molholm S, Butler JS, Del Bene VA, Brima T, Foxe JJ. Neural correlates of audiovisual narrative speech perception in children and adults on the autism spectrum: A functional magnetic resonance imaging study. Autism Res 2024; 17:280-310. PMID: 38334251. DOI: 10.1002/aur.3104.
Abstract
Autistic individuals show substantially reduced benefit from observing visual articulations during audiovisual speech perception, a multisensory integration deficit that is particularly relevant to social communication. This has mostly been studied using simple syllabic or word-level stimuli, and it remains unclear how altered lower-level multisensory integration translates to the processing of more complex natural multisensory stimulus environments in autism. Here, functional neuroimaging was used to compare neural correlates of audiovisual gain (AV-gain) in 41 autistic individuals with those of 41 age-matched non-autistic controls when presented with a complex audiovisual narrative. Participants were presented with continuous narration of a story in auditory-alone, visual-alone, and both synchronous and asynchronous audiovisual speech conditions. We hypothesized that previously identified differences in audiovisual speech processing in autism would be characterized by activation differences in brain regions well known to be associated with audiovisual enhancement in neurotypicals. However, when comparing activation patterns between groups, our results did not provide evidence for altered processing of the auditory-alone, visual-alone, or audiovisual conditions, or of AV-gain, in regions associated with the respective task. Instead, we found that autistic individuals responded with higher activations in mostly frontal regions where the activation to the experimental conditions was below baseline (de-activations) in the control group. These frontal effects were observed in both unisensory and audiovisual conditions, suggesting that these altered activations were not specific to multisensory processing but reflective of more general mechanisms, such as an altered disengagement of Default Mode Network processes during observation of the language stimulus across conditions.
Affiliation(s)
- Lars A Ross
  - The Frederick J. and Marion A. Schindler Cognitive Neurophysiology Laboratory, The Ernest J. Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, New York, USA
  - Department of Imaging Sciences, University of Rochester Medical Center, University of Rochester School of Medicine and Dentistry, Rochester, New York, USA
  - The Cognitive Neurophysiology Laboratory, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, New York, USA
- Sophie Molholm
  - The Frederick J. and Marion A. Schindler Cognitive Neurophysiology Laboratory, The Ernest J. Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, New York, USA
  - The Cognitive Neurophysiology Laboratory, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, New York, USA
- John S Butler
  - The Cognitive Neurophysiology Laboratory, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, New York, USA
  - School of Mathematics and Statistics, Technological University Dublin, City Campus, Dublin, Ireland
- Victor A Del Bene
  - The Cognitive Neurophysiology Laboratory, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, New York, USA
  - Heersink School of Medicine, Department of Neurology, University of Alabama at Birmingham, Birmingham, Alabama, USA
- Tufikameni Brima
  - The Frederick J. and Marion A. Schindler Cognitive Neurophysiology Laboratory, The Ernest J. Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, New York, USA
- John J Foxe
  - The Frederick J. and Marion A. Schindler Cognitive Neurophysiology Laboratory, The Ernest J. Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, New York, USA
  - The Cognitive Neurophysiology Laboratory, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, New York, USA
3.
Hisaizumi M, Tantam D. Enhanced sensitivity to pitch perception and its possible relation to language acquisition in autism. Autism & Developmental Language Impairments 2024; 9:23969415241248618. PMID: 38817731. PMCID: PMC11138189. DOI: 10.1177/23969415241248618.
Abstract
Background and aims: Fascinations for or aversions to particular sounds are a familiar feature of autism, as is the ability to reproduce another person's utterances, precisely copying the other person's prosody as well as their words. Such observations seem to indicate not only that autistic people can pay close attention to what they hear, but also that they can perceive the finer details of auditory stimuli. This is consistent with the previously reported consensus that absolute pitch is more common in autistic individuals than in neurotypicals. We take this to suggest that autistic perception allows close attention to fine detail. It is important to establish whether or not this is so, as autism is often presented as a deficit rather than a difference. We therefore undertook a narrative literature review of studies of auditory perception in autistic and nonautistic individuals, focussing on any differences in processing linguistic and nonlinguistic sounds.
Main contributions: We find persuasive evidence that nonlinguistic auditory perception in autistic children differs from that of nonautistic children. This is supported by the additional finding of a higher prevalence of absolute pitch and enhanced pitch-discriminating abilities in autistic children compared to neurotypical children. Such abilities appear to stem from atypical perception that is biased toward the local-level information necessary for processing pitch and other prosodic features. Enhanced pitch-discriminating abilities tend to be found in autistic individuals with a history of language delay, suggesting possible reciprocity. Research on various aspects of language development in autism also supports the hypothesis that atypical pitch perception may account for observed differences in language development in autism.
Conclusions: The results of our review of previously published studies are consistent with the hypothesis that auditory perception, and particularly pitch perception, in autism is different from the norm but not always impaired. Detail-oriented pitch perception may be an advantage given the right environment. We speculate that unusually heightened sensitivity to pitch differences may come at the cost of the normal development of the perception of the sounds that contribute most to early language development.
Implications: The acquisition of speech and language may be a process that normally involves an enhanced perception of speech sounds at the expense of the processing of nonlinguistic sounds, but autistic children may not give speech sounds this same priority.
Affiliation(s)
- Digby Tantam
  - Middlesex University, Existential Academy, London, UK
4.
Jones SA, Noppeney U. Multisensory Integration and Causal Inference in Typical and Atypical Populations. Adv Exp Med Biol 2024; 1437:59-76. PMID: 38270853. DOI: 10.1007/978-981-99-7611-9_4.
Abstract
Multisensory perception is critical for effective interaction with the environment, but human responses to multisensory stimuli vary across the lifespan and appear changed in some atypical populations. In this review chapter, we consider multisensory integration within a normative Bayesian framework. We begin by outlining the complex computational challenges of multisensory causal inference and reliability-weighted cue integration, and discuss whether healthy young adults behave in accordance with normative Bayesian models. We then compare their behaviour with various other human populations (children, older adults, and those with neurological or neuropsychiatric disorders). In particular, we consider whether the differences seen in these groups are due only to changes in their computational parameters (such as sensory noise or perceptual priors), or whether the fundamental computational principles (such as reliability weighting) underlying multisensory perception may also be altered. We conclude by arguing that future research should aim explicitly to differentiate between these possibilities.
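The reliability-weighted cue integration reviewed in this chapter has a standard closed form under the normative Bayesian account: each cue is weighted by its reliability (inverse variance), and the fused estimate is more precise than either cue alone. A minimal sketch, assuming Gaussian cue likelihoods; the function name and example numbers are illustrative, not taken from the chapter:

```python
import numpy as np

def fuse_cues(s_a, sigma_a, s_v, sigma_v):
    """Maximum-likelihood (reliability-weighted) fusion of two cues.

    Each cue's weight is its inverse variance; the fused variance is
    smaller than either individual cue's variance.
    """
    w_a = 1.0 / sigma_a**2
    w_v = 1.0 / sigma_v**2
    s_fused = (w_a * s_a + w_v * s_v) / (w_a + w_v)
    sigma_fused = np.sqrt(1.0 / (w_a + w_v))
    return s_fused, sigma_fused

# A noisy auditory location estimate (10 deg, sigma = 4) combined with a
# precise visual estimate (2 deg, sigma = 1): the visual cue dominates.
s_fused, sigma_fused = fuse_cues(10.0, 4.0, 2.0, 1.0)
# s_fused ≈ 2.47 deg; sigma_fused ≈ 0.97, below the visual sigma of 1
```

Changes in atypical populations can then be framed as changes to the inputs of this rule (larger sensory noise, different priors) versus changes to the rule itself (non-reliability-based weighting), which is exactly the distinction the chapter argues future research should test.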
Affiliation(s)
- Samuel A Jones
  - Department of Psychology, Nottingham Trent University, Nottingham, UK
- Uta Noppeney
  - Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
5.
Khullar V, Singh HP. Vocal-friend: internet of social-things framework to aid verbal communication. Disabil Rehabil Assist Technol 2023; 18:1527-1535. PMID: 35404708. DOI: 10.1080/17483107.2022.2060349.
Abstract
PURPOSE: Deficits in social verbal communication among individuals with Social Communication Disorder (SCD) are a significant concern, and SCD is prevalent in a large population worldwide. This paper proposes an internet-connected multi-system architecture capable of supporting verbal communication in a social environment for individuals with social communication deficits. MATERIALS AND METHODS: The implementation comprised corpus collection for specific communication, deep-learning-based model training for intelligent communication, and deployment of the trained model on internet-connected electronic social communication devices. The implemented system can initiate and maintain two types of communication: communication between multiple individuals at remote locations, and communication with an individual within physical listening range. RESULTS: Evaluation of the algorithm showed training and testing accuracies of 97% to 100%, with negligible mean squared error. Under simulated audio-bot conditions, Vocal-Friend achieved over 91% accuracy, interaction rate, and fallback rate. Satisfaction analysis yielded above-average results.
CONCLUSION: In terms of technical implementation and satisfaction analysis, the results were found acceptable, with above-average scores.
IMPLICATIONS FOR REHABILITATION:
- The proposed framework is easy for caregivers to use, even with little technical knowledge.
- It supports individuals with communication deficits in learning the social verbal communication skills needed to participate in society.
- It aids parents, caregivers, and professionals in understanding the communication needs of individuals with communication deficits.
- As technology matures in the rehabilitation domain, the system could be applied in future applications such as social robots and social virtual assistants.
Affiliation(s)
- Vikas Khullar
  - Chitkara University Institute of Engineering and Technology, Chitkara University, Punjab, India
- Harjit Pal Singh
  - CT Institute of Engineering, Management and Technology, Punjab, India
6.
Hughes L, Kargas N, Wilhelm M, Meyerhoff HS, Föcker J. The Impact of Audio-Visual, Visual and Auditory Cues on Multiple Object Tracking Performance in Children with Autism. Percept Mot Skills 2023; 130:2047-2068. PMID: 37452765. PMCID: PMC10552336. DOI: 10.1177/00315125231187984.
Abstract
Previous studies have documented differences in processing multisensory information by children with autism compared to typically developing children. Furthermore, children with autism have been found to track fewer multiple objects on a screen than those without autism, suggesting reduced attentional control. In the present study, we investigated whether children with autism (n = 33) and children without autism (n = 33) were able to track four target objects moving amongst four indistinguishable distractor objects while sensory cues were presented. During tracking, we presented auditory, visual, audio-visual, or no cues while target objects bounced off the inner boundary of a centralized circle. We found that children with autism tracked fewer targets than children without autism. Furthermore, children without autism showed improved tracking performance in the presence of visual cues, whereas children with autism did not benefit from sensory cues. Whereas multiple object tracking performance improved with increasing age in children without autism, especially when audio-visual cues were presented, children with autism did not show age-related improvement in tracking. These results are in line with the hypothesis that attention and the ability to integrate sensory cues during tracking are reduced in children with autism. Our findings could contribute valuable insights for designing interventions that incorporate multisensory information.
Affiliation(s)
- Lily Hughes
  - School of Psychology, College of Social Science, University of Lincoln, Lincoln, UK
- Niko Kargas
  - School of Psychology, College of Social Science, University of Lincoln, Lincoln, UK
- Maximilian Wilhelm
  - Center for Psychotherapy Research, University Hospital Heidelberg, Heidelberg, Germany
- Julia Föcker
  - School of Psychology, College of Social Science, University of Lincoln, Lincoln, UK
7.
Choi I, Demir I, Oh S, Lee SH. Multisensory integration in the mammalian brain: diversity and flexibility in health and disease. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220338. PMID: 37545309. PMCID: PMC10404930. DOI: 10.1098/rstb.2022.0338.
Abstract
Multisensory integration (MSI) occurs in a variety of brain areas, spanning cortical and subcortical regions. Traditional studies of sensory processing have treated the sensory cortices as processing sensory information in a modality-specific manner. The sensory cortices, however, send the information to other cortical and subcortical areas, including the higher association cortices and the other sensory cortices, where the multiple modality inputs converge and integrate to generate a meaningful percept. This integration process is neither simple nor fixed, because these brain areas interact with each other via complicated circuits that can be modulated by numerous internal and external conditions. As a result, dynamic MSI makes multisensory decisions flexible and adaptive in behaving animals. Impairments in MSI occur in many psychiatric disorders, which may result in an altered perception of multisensory stimuli and an abnormal reaction to them. This review discusses the diversity and flexibility of MSI in mammals, including humans, primates and rodents, as well as the brain areas involved. It further explains how such flexibility influences perceptual experiences in behaving animals in both health and disease. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Ilsong Choi
  - Center for Synaptic Brain Dysfunctions, Institute for Basic Science (IBS), Daejeon 34141, Republic of Korea
- Ilayda Demir
  - Department of Biological Sciences, KAIST, Daejeon 34141, Republic of Korea
- Seungmi Oh
  - Department of Biological Sciences, KAIST, Daejeon 34141, Republic of Korea
- Seung-Hee Lee
  - Center for Synaptic Brain Dysfunctions, Institute for Basic Science (IBS), Daejeon 34141, Republic of Korea
  - Department of Biological Sciences, KAIST, Daejeon 34141, Republic of Korea
8.
Irwin J, Harwood V, Kleinman D, Baron A, Avery T, Turcios J, Landi N. Neural and Behavioral Differences in Speech Perception for Children With Autism Spectrum Disorders Within an Audiovisual Context. J Speech Lang Hear Res 2023; 66:2390-2403. PMID: 37390407. PMCID: PMC10468115. DOI: 10.1044/2023_jslhr-22-00661.
Abstract
PURPOSE: Reduced use of visible articulatory information on a speaker's face has been implicated as a possible contributor to language deficits in autism spectrum disorders (ASD). We employ an audiovisual (AV) phonemic restoration paradigm to measure behavioral performance (button press) and event-related potentials (ERPs) of visual speech perception in children with ASD and their neurotypical peers, to assess potential neural substrates that contribute to group differences. METHOD: Two sets of speech stimuli, /ba/-"/a/" ("/a/" was created from the /ba/ token by reducing the initial consonant) and /ba/-/pa/, were presented within an auditory oddball paradigm to children aged 6-13 years with ASD (n = 17) and typical development (TD; n = 33) under two conditions. The AV condition contained a fully visible speaking face; the pixelated (PX) condition included a face, but the mouth and jaw were pixelated, removing all articulatory information. When articulatory features were present for the /ba/-"/a/" contrast, it was expected that the influence of the visual articulators would facilitate a phonemic restoration effect in which "/a/" would be perceived as /ba/. ERPs were recorded during the experiment while children were required to press a button for the deviant sound for both sets of speech contrasts in both conditions. RESULTS: Button-press data revealed that TD children were more accurate than the ASD group in discriminating between the /ba/-"/a/" and /ba/-/pa/ contrasts in the PX condition. ERPs in response to the /ba/-/pa/ contrast within both AV and PX conditions differed between children with ASD and TD children (earlier P300 responses for children with ASD). CONCLUSION: Children with ASD differ from TD peers in the underlying neural mechanisms responsible for speech processing within an AV context.
Affiliation(s)
- Julia Irwin
  - Department of Psychology, Southern Connecticut State University, New Haven
  - Haskins Laboratories, Yale University, New Haven, CT
- Vanessa Harwood
  - Department of Communicative Disorders, University of Rhode Island, Kingston
- Alisa Baron
  - Department of Communicative Disorders, University of Rhode Island, Kingston
- Jacqueline Turcios
  - Department of Speech-Language Pathology, University of New Haven, West Haven, CT
- Nicole Landi
  - Haskins Laboratories, Yale University, New Haven, CT
  - Department of Psychological Sciences, University of Connecticut, Storrs
9.
Dunham-Carr K, Feldman JI, Simon DM, Edmunds SR, Tu A, Kuang W, Conrad JG, Santapuram P, Wallace MT, Woynaroski TG. The Processing of Audiovisual Speech Is Linked with Vocabulary in Autistic and Nonautistic Children: An ERP Study. Brain Sci 2023; 13:1043. PMID: 37508976. PMCID: PMC10377472. DOI: 10.3390/brainsci13071043.
Abstract
Explaining individual differences in vocabulary in autism is critical, as understanding and using words to communicate are key predictors of long-term outcomes for autistic individuals. Differences in audiovisual speech processing may explain variability in vocabulary in autism. The efficiency of audiovisual speech processing can be indexed via amplitude suppression, wherein the amplitude of the event-related potential (ERP) is reduced at the P2 component in response to audiovisual speech compared to auditory-only speech. This study used electroencephalography (EEG) to measure P2 amplitudes in response to auditory-only and audiovisual speech and norm-referenced, standardized assessments to measure vocabulary in 25 autistic and 25 nonautistic children to determine whether amplitude suppression (a) differs or (b) explains variability in vocabulary in autistic and nonautistic children. A series of regression analyses evaluated associations between amplitude suppression and vocabulary scores. Both groups demonstrated P2 amplitude suppression, on average, in response to audiovisual speech relative to auditory-only speech. Between-group differences in mean amplitude suppression were nonsignificant. Individual differences in amplitude suppression were positively associated with expressive vocabulary through receptive vocabulary, as evidenced by a significant indirect effect observed across groups. The results suggest that efficiency of audiovisual speech processing may explain variance in vocabulary in autism.
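The indirect effect reported above follows the standard product-of-coefficients approach to mediation: a path-a regression of the mediator on the predictor, a path-b regression of the outcome on the mediator controlling for the predictor, and their product. A minimal sketch with synthetic data (the variable names, effect sizes, and sample size are invented for illustration; the study's actual models are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Synthetic data: X = P2 amplitude suppression, M = receptive vocabulary,
# Y = expressive vocabulary, with a built-in positive indirect path.
x = rng.normal(size=n)
m = 0.6 * x + rng.normal(scale=0.5, size=n)
y = 0.7 * m + rng.normal(scale=0.5, size=n)

def ols_slope(pred, resp):
    """Slope from a simple least-squares fit with an intercept."""
    design = np.column_stack([np.ones_like(pred), pred])
    beta, *_ = np.linalg.lstsq(design, resp, rcond=None)
    return beta[1]

a = ols_slope(x, m)  # path a: X -> M

# Path b: M -> Y, controlling for X
design_xm = np.column_stack([np.ones(n), x, m])
b = np.linalg.lstsq(design_xm, y, rcond=None)[0][2]

indirect = a * b  # product-of-coefficients indirect effect
```

In practice the significance of such an indirect effect is assessed with bootstrapped confidence intervals rather than the point estimate alone.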
Affiliation(s)
- Kacie Dunham-Carr
  - Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN 37232, USA
  - Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN 37232, USA
- Jacob I Feldman
  - Frist Center for Autism and Innovation, Vanderbilt University, Nashville, TN 37232, USA
  - Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN 37232, USA
- David M Simon
  - Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN 37232, USA
- Sarah R Edmunds
  - Department of Psychology, University of Washington, Seattle, WA 98195, USA
  - Department of Psychology, University of South Carolina, Columbia, SC 29208, USA
  - Department of Educational Studies, University of South Carolina, Columbia, SC 29208, USA
- Alexander Tu
  - Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN 37232, USA
  - Department of Otolaryngology and Communication Sciences, Medical College of Wisconsin, Milwaukee, WI 53226, USA
- Wayne Kuang
  - Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN 37232, USA
  - Department of Pediatrics, Los Angeles General Medical Center, Keck School of Medicine of University of Southern California, Los Angeles, CA 90033, USA
- Julie G Conrad
  - Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN 37232, USA
  - College of Medicine, University of Illinois Hospital, Chicago, IL 60612, USA
- Pooja Santapuram
  - Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN 37232, USA
  - Department of Anesthesiology, Columbia University Irving Medical Center, New York City, NY 10032, USA
- Mark T Wallace
  - Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN 37232, USA
  - Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN 37232, USA
  - Frist Center for Autism and Innovation, Vanderbilt University, Nashville, TN 37232, USA
  - Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN 37232, USA
  - Department of Psychology, Vanderbilt University, Nashville, TN 37232, USA
  - Department of Pharmacology, Vanderbilt University, Nashville, TN 37232, USA
  - Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center, Nashville, TN 37232, USA
- Tiffany G Woynaroski
  - Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN 37232, USA
  - Frist Center for Autism and Innovation, Vanderbilt University, Nashville, TN 37232, USA
  - Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN 37232, USA
  - Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN 37232, USA
  - Department of Communication Sciences and Disorders, John A. Burns School of Medicine, University of Hawaii at Manoa, Honolulu, HI 96813, USA
10.
Feldman JI, Tu A, Conrad JG, Kuang W, Santapuram P, Woynaroski TG. The Impact of Singing on Visual and Multisensory Speech Perception in Children on the Autism Spectrum. Multisens Res 2022; 36:57-74. PMID: 36731528. PMCID: PMC9924934. DOI: 10.1163/22134808-bja10087.
Abstract
Autistic children show reduced multisensory integration of audiovisual speech stimuli in response to the McGurk illusion. Previously, it has been shown that adults can integrate sung McGurk tokens. These sung speech tokens offer more salient visual and auditory cues than spoken tokens, which may increase the identification and integration of visual speech cues in autistic children. Forty participants (20 autistic, 20 non-autistic peers) aged 7-14 completed the study. Participants were presented with speech tokens in four modalities: auditory-only, visual-only, congruent audiovisual, and incongruent audiovisual (i.e., McGurk; auditory 'ba' and visual 'ga'). Tokens were also presented in two formats: spoken and sung. Participants indicated what they perceived via a four-button response box (i.e., 'ba', 'ga', 'da', or 'tha'). Accuracies and perception of the McGurk illusion were calculated for each modality and format. Analysis of visual-only identification indicated a significant main effect of format, whereby participants were more accurate in sung versus spoken trials, but no significant main effect of group or interaction effect. Analysis of the McGurk trials indicated no significant main effect of format or group and no significant interaction effect. Sung speech tokens improved identification of visual speech cues but did not boost the integration of visual cues with heard speech in either group. Additional work is needed to determine what properties of sung speech contributed to the observed improvement in visual accuracy and to evaluate whether more prolonged exposure to sung speech may yield effects on multisensory integration.
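Scoring the McGurk trials described above reduces to counting fusion responses among the four response options. A minimal sketch (the helper name and the sample response list are hypothetical; the fusion categories 'da'/'tha' follow the response set given in the abstract):

```python
from collections import Counter

def mcgurk_fusion_rate(responses):
    """Proportion of fusion percepts ('da' or 'tha') among responses to
    incongruent trials (auditory 'ba' paired with visual 'ga')."""
    fusion = {'da', 'tha'}
    counts = Counter(responses)
    return sum(counts[r] for r in fusion) / len(responses)

# Hypothetical responses from one participant's McGurk trials:
# 4 of the 6 responses are fusion percepts.
rate = mcgurk_fusion_rate(['da', 'ba', 'da', 'tha', 'ba', 'da'])
```

Responses of 'ba' on such trials indicate auditory capture, and 'ga' visual capture, so the same tally generalizes to the other percept categories.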
Collapse
Affiliation(s)
- Jacob I. Feldman: Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN, USA; Frist Center for Autism and Innovation, Vanderbilt University, Nashville, TN, USA
- Alexander Tu: Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA; present address: Department of Otolaryngology and Communication Sciences, Medical College of Wisconsin, Milwaukee, WI, USA
- Julie G. Conrad: Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA; present address: Department of Pediatrics, University of Illinois, Chicago, IL, USA
- Wayne Kuang: Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA; present address: Department of Pediatrics, Los Angeles County and University of Southern California (LAC+USC) Medical Center, University of Southern California, Los Angeles, CA, USA
- Pooja Santapuram: Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA; present address: Department of Anesthesiology, Columbia University Irving Medical Center, New York, NY, USA
- Tiffany G. Woynaroski: Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN, USA; Frist Center for Autism and Innovation, Vanderbilt University, Nashville, TN, USA; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN, USA; Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
11
Feng S, Lu H, Wang Q, Li T, Fang J, Chen L, Yi L. Face-viewing patterns predict audiovisual speech integration in autistic children. Autism Res 2021; 14:2592-2602. [PMID: 34415113] [DOI: 10.1002/aur.2598]
Abstract
Autistic children show audiovisual speech integration deficits, though the underlying mechanisms remain unclear. The present study examined how audiovisual speech integration deficits in autistic children could be affected by their looking patterns. We measured audiovisual speech integration in 26 autistic children and 26 typically developing (TD) children (aged 4-7 years) employing the McGurk task (a videotaped speaker uttering phonemes with her eyes open or closed) and tracked their eye movements. We found that, compared with TD children, autistic children showed weaker audiovisual speech integration (i.e., the McGurk effect) in the open-eyes condition and similar audiovisual speech integration in the closed-eyes condition. Autistic children viewed the speaker's mouth less in non-McGurk trials than in McGurk trials in both conditions. Importantly, autistic children's weaker audiovisual speech integration could be predicted by their reduced mouth-looking time. The present study indicated that atypical face-viewing patterns could serve as one of the cognitive mechanisms underlying audiovisual speech integration deficits in autistic children. LAY SUMMARY: The McGurk effect occurs when the visual part of one phoneme (e.g., "ga") and the auditory part of another phoneme (e.g., "ba") uttered by a speaker are integrated into a fused percept (e.g., "da"). The present study examined how the McGurk effect in autistic children could be affected by their patterns of looking at the speaker's face. We found that less looking time at the speaker's mouth in autistic children predicted a weaker McGurk effect. As the McGurk effect reflects audiovisual speech integration, our findings imply that audiovisual speech integration in autistic children might be improved by directing them to look at the speaker's mouth in future interventions.
Affiliation(s)
- Shuyuan Feng: Institute for Applied Linguistics, School of Foreign Languages, Central South University, Changsha, Hunan, China; School of Psychological and Cognitive Sciences and Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China
- Haoyang Lu: Academy for Advanced Interdisciplinary Studies, Peking University, Beijing, China; Peking-Tsinghua Center for Life Sciences, Peking University, Beijing, China
- Qiandong Wang: Beijing Key Laboratory of Applied Experimental Psychology, National Demonstration Center for Experimental Psychology Education, Faculty of Psychology, Beijing Normal University, Beijing, China
- Tianbi Li: School of Psychological and Cognitive Sciences and Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China
- Jing Fang: Qingdao Autism Research Institute, Qingdao, China
- Lihan Chen: School of Psychological and Cognitive Sciences and Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China
- Li Yi: School of Psychological and Cognitive Sciences and Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China; IDG/McGovern Institute for Brain Research at PKU, Peking University, Beijing, China
12
Abstract
Temporal synchrony is the alignment of processes in time within or across individuals in social interaction and is observed and studied in various domains using wide-ranging paradigms. Evidence suggesting reduced temporal synchrony in autism (e.g., compared to neurotypicals) has hitherto not been reviewed. To systematically review the magnitude and generalisability of the difference across different tasks and contexts, the EBSCO, OVID, Web of Science, and Scopus databases were searched. Thirty-two studies meeting our inclusion criteria were identified in the audio-visual, audio-motor, visuo-tactile, visuo-motor, social motor, and conversational synchrony domains. Additionally, two intervention studies were included. The findings suggest that autistic participants showed reduced synchrony tendencies in every category of temporal synchrony reviewed. Implications, methodological weaknesses, and evidence gaps are discussed.
13
Ujiie Y, Takahashi K. Weaker McGurk Effect for Rubin's Vase-Type Speech in People With High Autistic Traits. Multisens Res 2021; 34:1-17. [PMID: 33873157] [DOI: 10.1163/22134808-bja10047]
Abstract
While visual information from facial speech modulates auditory speech perception, it is less influential on audiovisual speech perception among autistic individuals than among typically developing individuals. In this study, we investigated the relationship between autistic traits (Autism-Spectrum Quotient; AQ) and the influence of visual speech on the recognition of Rubin's vase-type speech stimuli with degraded facial speech information. Participants were 31 university students (13 males and 18 females; mean age: 19.2 years, SD: 1.13) who reported normal (or corrected-to-normal) hearing and vision. All participants completed three speech recognition tasks (visual, auditory, and audiovisual stimuli) and the Japanese version of the AQ. The results showed that accuracy of speech recognition for visual (i.e., lip-reading) and auditory stimuli was not significantly related to participants' AQ. In contrast, audiovisual speech perception was less influenced by facial speech information among individuals with high rather than low autistic traits. The weaker influence of visual information on audiovisual speech perception in autism spectrum disorder (ASD) was robust regardless of the clarity of the visual information, suggesting a difficulty in the process of audiovisual integration rather than in the visual processing of facial speech.
Affiliation(s)
- Yuta Ujiie: Graduate School of Psychology, Chukyo University, 101-2 Yagoto Honmachi, Showa-ku, Nagoya-shi, Aichi 466-8666, Japan; Japan Society for the Promotion of Science, Kojimachi Business Center Building, 5-3-1 Kojimachi, Chiyoda-ku, Tokyo 102-0083, Japan; Research and Development Initiative, Chuo University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112-8551, Japan
- Kohske Takahashi: School of Psychology, Chukyo University, 101-2 Yagoto Honmachi, Showa-ku, Nagoya-shi, Aichi 466-8666, Japan
14
Irwin J, Avery T, Kleinman D, Landi N. Audiovisual Speech Perception in Children with Autism Spectrum Disorders: Evidence from Visual Phonemic Restoration. J Autism Dev Disord 2021; 52:28-37. [DOI: 10.1007/s10803-021-04916-x]
15
Wada M, Ikeda H, Kumagaya S. Atypical Effects of Visual Interference on Tactile Temporal Order Judgment in Individuals With Autism Spectrum Disorder. Multisens Res 2020; 34:129-151. [PMID: 33706272] [DOI: 10.1163/22134808-bja10033]
Abstract
Visual distractors interfere with tactile temporal order judgment (TOJ) at moderately short stimulus onset asynchronies (SOAs) in typically developing participants. Presenting a rubber hand in a forward direction relative to the participant's hand enhances this effect, while presenting it in an inverted direction weakens the effect. Individuals with autism spectrum disorder (ASD) have atypical multisensory processing; however, the effects of such interference in ASD remain unclear. In this study, we examined the effects of visual interference on tactile TOJ in individuals with ASD. Two successive tactile stimuli were delivered to the index and ring fingers of a participant's right hand in an opaque box. A rubber hand was placed on the box in a forward or inverted direction. Concurrently, visual stimuli provided by light-emitting diodes on the fingers of the rubber hand were delivered in a congruent or incongruent order. Participants were required to judge the temporal order of the tactile stimuli regardless of the visual distractors. In the absence of a visual stimulus, participants with ASD tended to judge simultaneous stimuli as the ring finger being stimulated first, compared with typically developing (TD) controls, and congruent visual stimuli eliminated this bias. When incongruent visual stimuli were delivered, judgment was notably reversed in participants with ASD, regardless of the direction of the rubber hand. These findings demonstrate considerable effects of visual interference on tactile TOJ in individuals with ASD.
Affiliation(s)
- Makoto Wada: Developmental Disorders Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities, Tokorozawa, Saitama 359-8555, Japan; Faculty of Informatics, Shizuoka University, Hamamatsu, Shizuoka 432-8011, Japan
- Hanako Ikeda: Developmental Disorders Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities, Tokorozawa, Saitama 359-8555, Japan
- Shinichiro Kumagaya: Research Center for Advanced Science and Technology, The University of Tokyo, Meguro, Tokyo 153-8904, Japan
16
Dunham K, Feldman JI, Liu Y, Cassidy M, Conrad JG, Santapuram P, Suzman E, Tu A, Butera I, Simon DM, Broderick N, Wallace MT, Lewkowicz D, Woynaroski TG. Stability of Variables Derived From Measures of Multisensory Function in Children With Autism Spectrum Disorder. Am J Intellect Dev Disabil 2020; 125:287-303. [PMID: 32609807] [PMCID: PMC8903073] [DOI: 10.1352/1944-7558-125.4.287]
Abstract
Children with autism spectrum disorder (ASD) display differences in multisensory function as quantified by several different measures. This study estimated the stability of variables derived from commonly used measures of multisensory function in school-aged children with ASD. Participants completed a simultaneity judgment task for audiovisual speech, tasks designed to elicit the McGurk effect, listening-in-noise tasks, electroencephalographic recordings, and eye-tracking tasks. Results indicate that the stability of indices derived from tasks tapping multisensory processing is variable. These findings have important implications for measurement in future research: averaging scores across repeated observations will often be required to obtain acceptably stable estimates and, thus, to increase the likelihood of detecting effects of interest related to multisensory processing in children with ASD.
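One variable mentioned above, the temporal binding window derived from the simultaneity judgment task, is often summarized by fitting a curve to the proportion of "synchronous" responses across stimulus onset asynchronies (SOAs). The sketch below uses invented data and a simple Gaussian grid-search fit; it illustrates the general idea, not the authors' analysis pipeline:

```python
import numpy as np

# Hypothetical simultaneity-judgment data: SOA in ms (negative = audio leads)
# and the proportion of "synchronous" responses. Values are invented.
soas = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400])
p_sync = np.array([0.10, 0.25, 0.55, 0.85, 0.95, 0.90, 0.70, 0.40, 0.15])

# Fit p(soa) = peak * exp(-(soa - mu)^2 / (2 * sigma^2)) by grid search over
# the centre (mu) and width (sigma) of the Gaussian.
best = None
for mu in np.arange(-100, 101, 5):
    for sigma in np.arange(50, 401, 5):
        pred = p_sync.max() * np.exp(-(soas - mu) ** 2 / (2 * sigma ** 2))
        err = np.sum((pred - p_sync) ** 2)
        if best is None or err < best[0]:
            best = (err, mu, sigma)
_, mu, sigma = best

# One common summary of the binding window: the full width of the fitted
# curve at half its maximum.
fwhm = 2 * np.sqrt(2 * np.log(2)) * sigma
print(f"centre = {mu} ms, binding window (FWHM) ~ {fwhm:.0f} ms")
```

Session-to-session stability of estimates like this is exactly what the study above assessed; with noisy single-session data, the fitted width can vary considerably across repeated observations.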
Affiliation(s)
- Kacie Dunham: Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Jacob I. Feldman: Department of Hearing & Speech Sciences, Vanderbilt University, Nashville, TN, USA
- Yupeng Liu: Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA
- Margaret Cassidy: Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA
- Julie G. Conrad: Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA; present address: College of Medicine, University of Illinois, Chicago, IL, USA
- Pooja Santapuram: Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA; present address: School of Medicine, Vanderbilt University, Nashville, TN, USA
- Evan Suzman: Department of Biomedical Sciences, Vanderbilt University, Nashville, TN, USA
- Alexander Tu: Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA; present address: College of Medicine, University of Nebraska Medical Center, Omaha, NE, USA
- Iliza Butera: Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- David M. Simon: Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; present address: axialHealthcare, Nashville, TN, USA
- Neill Broderick: Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Pediatrics, Vanderbilt University Medical Center, Nashville, TN, USA
- Mark T. Wallace: Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Department of Hearing & Speech Sciences, Vanderbilt University, Nashville, TN, USA; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Psychology, Vanderbilt University, Nashville, TN, USA; Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Pharmacology, Vanderbilt University, Nashville, TN, USA
- David Lewkowicz: Department of Communication Sciences & Disorders, Northeastern University, Boston, MA, USA
- Tiffany G. Woynaroski: Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Hearing & Speech Sciences, Vanderbilt University Medical Center, Nashville, TN, USA
17
Ujiie Y, Wakabayashi A. Intact lip-reading but weaker McGurk effect in individuals with high autistic traits. Int J Dev Disabil 2019; 68:47-55. [PMID: 35173963] [PMCID: PMC8843195] [DOI: 10.1080/20473869.2019.1699350]
Abstract
A weaker McGurk effect is observed in individuals with autism spectrum disorder (ASD); this weaker integration is considered key to understanding how low-order atypical processing leads to maladaptive social behaviours. However, the mechanism underlying this weaker McGurk effect has not been fully understood. Here, we investigated (1) whether the weaker McGurk effect in individuals with high autistic traits is caused by poor lip-reading ability and (2) whether the hearing environment modifies the weaker McGurk effect in individuals with high autistic traits. To address these questions, we conducted two analogue studies among university students, based on the dimensional model of ASD. Results showed that individuals with high autistic traits have intact lip-reading ability as well as intact recognition of auditory and audiovisual congruent speech (Experiment 1). Furthermore, the weaker McGurk effect observed in individuals with high autistic traits under the no-noise condition disappeared under the high-noise condition (Experiments 1 and 2). Our findings suggest that high background noise might shift weight onto the visual cue, thereby increasing the strength of the McGurk effect among individuals with high autistic traits.
Affiliation(s)
- Yuta Ujiie: Graduate School of Psychology, Chukyo University, Nagoya-shi, Aichi, Japan; Japan Society for the Promotion of Science, Tokyo, Japan; Research and Development Initiative, Chuo University, Tokyo, Japan
18
Ostrolenk A, Bao VA, Mottron L, Collignon O, Bertone A. Reduced multisensory facilitation in adolescents and adults on the Autism Spectrum. Sci Rep 2019; 9:11965. [PMID: 31427634] [PMCID: PMC6700191] [DOI: 10.1038/s41598-019-48413-9]
Abstract
Individuals with autism are reported to integrate information from visual and auditory channels in an idiosyncratic way. Multisensory integration (MSI) of simple, non-social stimuli (i.e., flashes and beeps) was evaluated in adolescents and adults with (n = 20) and without autism (n = 19) using a reaction time (RT) paradigm with audio, visual, and audiovisual stimuli. For each participant, the race model analysis compares RTs in the audiovisual condition to a bound computed from the unimodal RTs that reflects the effect of redundancy. If the actual audiovisual RTs are significantly faster than this bound, the race model is violated, indicating evidence of MSI. Our results show that the race model violation occurred only for the typically developing (TD) group. While the TD group shows evidence of MSI, the autism group does not. These results suggest that multisensory integration of simple information, void of social content or complexity, is altered in autism. Individuals with autism may not benefit from the advantage conferred by multisensory stimulation to the same extent as TD individuals. Altered MSI for simple, non-social information may have cascading effects on more complex perceptual processes related to language and behaviour in autism.
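The race model logic described above can be sketched as follows. The bound used here is Miller's race model inequality (the sum of the unimodal cumulative RT distributions, capped at 1), and the RT samples are simulated, so this shows the shape of the analysis rather than the authors' exact implementation:

```python
import numpy as np

# Simulated reaction times (ms) for audio-only, visual-only, and audiovisual
# trials. Distributions are invented; the AV mean is faster by construction.
rng = np.random.default_rng(0)
rt_audio = rng.normal(420, 60, 200)
rt_visual = rng.normal(440, 60, 200)
rt_av = rng.normal(370, 55, 200)

def ecdf(rts, t):
    """Empirical cumulative distribution: proportion of RTs at or below t."""
    return np.mean(rts <= t)

# Miller's inequality: P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t).
# Where the audiovisual CDF exceeds this bound, the race model is violated,
# which is taken as evidence of multisensory integration.
ts = np.linspace(200, 600, 41)
violations = [ecdf(rt_av, t) - min(1.0, ecdf(rt_audio, t) + ecdf(rt_visual, t))
              for t in ts]
max_violation = max(violations)
print(f"max race-model violation: {max_violation:.3f}")
```

In the study above, the typically developing group showed such violations while the autism group did not.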
Affiliation(s)
- Alexia Ostrolenk: Perceptual Neuroscience Lab for Autism and Development (PNLab), McGill University, Montreal, Canada; University of Montreal Center of Excellence for Pervasive Developmental Disorders (CETEDUM), CIUSSS du Nord-de-l'Île de Montréal, Montreal, Canada
- Vanessa A. Bao: Perceptual Neuroscience Lab for Autism and Development (PNLab), McGill University, Montreal, Canada; School/Applied Child Psychology, Department of Education and Counselling Psychology, McGill University, Montreal, Canada
- Laurent Mottron: University of Montreal Center of Excellence for Pervasive Developmental Disorders (CETEDUM), CIUSSS du Nord-de-l'Île de Montréal, Montreal, Canada
- Olivier Collignon: Centre for Mind/Brain Science (CIMeC), University of Trento, Trento, Italy; Institut de recherche en Psychologie (IPSY) et en Neuroscience (IoNS), Université de Louvain-la-Neuve, Ottignies-Louvain-la-Neuve, Belgium
- Armando Bertone: Perceptual Neuroscience Lab for Autism and Development (PNLab), McGill University, Montreal, Canada; School/Applied Child Psychology, Department of Education and Counselling Psychology, McGill University, Montreal, Canada; University of Montreal Center of Excellence for Pervasive Developmental Disorders (CETEDUM), CIUSSS du Nord-de-l'Île de Montréal, Montreal, Canada
19
van Laarhoven T, Stekelenburg JJ, Vroomen J. Increased sub-clinical levels of autistic traits are associated with reduced multisensory integration of audiovisual speech. Sci Rep 2019; 9:9535. [PMID: 31267024] [PMCID: PMC6606565] [DOI: 10.1038/s41598-019-46084-0]
Abstract
Recent studies suggest that sub-clinical levels of autistic symptoms may be related to reduced processing of artificial audiovisual stimuli. It is unclear whether these findings extend to more natural stimuli such as audiovisual speech. The current study examined the relationship between autistic traits, measured by the Autism-Spectrum Quotient, and audiovisual speech processing in a large non-clinical population using a battery of experimental tasks assessing audiovisual perceptual binding, visual enhancement of speech embedded in noise, and audiovisual temporal processing. Several associations were found between autistic traits and audiovisual speech processing. Increased autistic-like imagination was related to reduced perceptual binding measured by the McGurk illusion. Increased overall autistic symptomatology was associated with reduced visual enhancement of speech intelligibility in noise. Participants reporting increased levels of rigid and restricted behaviour were more likely to bind audiovisual speech stimuli over longer temporal intervals, while an increased tendency to focus on local aspects of sensory inputs was related to a narrower temporal binding window. These findings demonstrate that increased levels of autistic traits may be related to alterations in audiovisual speech processing, and are consistent with the notion of a spectrum of autistic traits that extends to the general population.
Affiliation(s)
- Thijs van Laarhoven: Department of Cognitive Neuropsychology, Tilburg University, P.O. Box 90153, 5000 LE Tilburg, The Netherlands
- Jeroen J. Stekelenburg: Department of Cognitive Neuropsychology, Tilburg University, P.O. Box 90153, 5000 LE Tilburg, The Netherlands
- Jean Vroomen: Department of Cognitive Neuropsychology, Tilburg University, P.O. Box 90153, 5000 LE Tilburg, The Netherlands
20
McGurk Effect by Individuals with Autism Spectrum Disorder and Typically Developing Controls: A Systematic Review and Meta-analysis. J Autism Dev Disord 2019; 49:34-43. [PMID: 30019277] [DOI: 10.1007/s10803-018-3680-0]
Abstract
By synthesizing existing behavioural studies through a meta-analytic approach, the current study compared the performance of autism spectrum disorder (ASD) and typically developing groups in audiovisual speech integration and investigated potential moderators that might contribute to the heterogeneity of the existing findings. In total, nine studies were included, and the pooled overall difference between the two groups was significant, g = -0.835 (p < 0.001; 95% CI -1.155 to -0.516). Age and task scoring method were found to be associated with the inconsistencies in findings reported by previous studies. These results indicate that individuals with ASD show a weaker McGurk effect than typically developing controls.
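Pooled effects like the g reported above are typically obtained by inverse-variance weighting of per-study effect sizes. The sketch below is a minimal fixed-effect version with invented study-level values; the review's own model and data are not reproduced here:

```python
import math

# Invented per-study Hedges' g values and variances, for illustration only.
studies = [(-0.9, 0.04), (-0.5, 0.09), (-1.1, 0.06), (-0.7, 0.05)]

# Fixed-effect inverse-variance pooling: weight each study by 1/variance.
weights = [1.0 / var for _, var in studies]
g_pooled = sum(w * g for (g, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))  # SE of the pooled estimate
ci = (g_pooled - 1.96 * se_pooled, g_pooled + 1.96 * se_pooled)
print(f"pooled g = {g_pooled:.3f}, 95% CI [{ci[0]:.3f}, {ci[1]:.3f}]")
```

A negative pooled g with a confidence interval excluding zero, as in the review above, indicates a reliably weaker McGurk effect in the ASD groups.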
21
Feldman JI, Kuang W, Conrad JG, Tu A, Santapuram P, Simon DM, Foss-Feig JH, Kwakye LD, Stevenson RA, Wallace MT, Woynaroski TG. Brief Report: Differences in Multisensory Integration Covary with Sensory Responsiveness in Children with and without Autism Spectrum Disorder. J Autism Dev Disord 2019; 49:397-403. [PMID: 30043353] [DOI: 10.1007/s10803-018-3667-x]
Abstract
Research shows that children with autism spectrum disorder (ASD) differ in their behavioral patterns of responding to sensory stimuli (i.e., sensory responsiveness) and in various other aspects of sensory functioning relative to typical peers. This study explored relations between measures of sensory responsiveness and multisensory speech perception and integration in children with and without ASD. Participants were 8- to 17-year-old children: 18 with ASD and 18 matched typically developing controls. Participants completed a psychophysical speech perception task, and parents reported on children's sensory responsiveness. Psychophysical measures (e.g., audiovisual accuracy, temporal binding window) were associated with patterns of sensory responsiveness (e.g., hyporesponsiveness, sensory seeking). Results indicate that differences in multisensory speech perception and integration covary with atypical patterns of sensory responsiveness.
Affiliation(s)
- Jacob I. Feldman: Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, USA
- Wayne Kuang: Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA
- Julie G. Conrad: Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA
- Alexander Tu: Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA
- Pooja Santapuram: Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA
- David M. Simon: Neuroscience Graduate Program, Vanderbilt University, Nashville, TN, USA; Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Jennifer H. Foss-Feig: Department of Psychiatry, Seaver Autism Center for Research and Treatment at the Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Leslie D. Kwakye: Department of Neuroscience, Oberlin College, Oberlin, OH, USA
- Ryan A. Stevenson: Department of Psychology, The University of Western Ontario, London, ON, Canada; Brain and Mind Institute, The University of Western Ontario, London, ON, Canada; Department of Psychiatry, The Schulich School of Medicine and Dentistry, The University of Western Ontario, London, ON, Canada; Program in Neuroscience, The Schulich School of Medicine and Dentistry, The University of Western Ontario, London, ON, Canada; York University Centre for Vision Research, York University, Toronto, ON, Canada
- Mark T. Wallace: Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, MCE 8310 South Tower, 1215 21st Avenue South, Nashville, TN 37232, USA; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Psychology, Vanderbilt University, Nashville, TN, USA; Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Pharmacology, Vanderbilt University, Nashville, TN, USA
- Tiffany G. Woynaroski: Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, MCE 8310 South Tower, 1215 21st Avenue South, Nashville, TN 37232, USA; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN, USA
22
Dellapiazza F, Vernhet C, Blanc N, Miot S, Schmidt R, Baghdadli A. Links between sensory processing, adaptive behaviours, and attention in children with autism spectrum disorder: A systematic review. Psychiatry Res 2018; 270:78-88. [PMID: 30245380] [DOI: 10.1016/j.psychres.2018.09.023]
Abstract
Atypical sensory processing has been described in autism spectrum disorder (ASD). The goal of this systematic review is to investigate the links between sensory processing, adaptive behaviours, and attention skills in children with ASD. The PRISMA guidelines were followed, and a search was conducted using the electronic databases Medline, PsycINFO, and ERIC. Among the 11 selected studies on sensory processing, 7 investigated the association with adaptive behaviours and 5 with attention. Atypical sensory processing was reported in 82% to 97% of participants with ASD, depending on the study. This review found a significant impact of sensory abnormalities on adaptive behaviour. In addition, we found interrelations between sensory processing and attention skills. However, the current literature is too limited to draw definitive conclusions about the direction of these interactions, and theories concerning perceptive functioning remain conflicting.
Affiliation(s)
- Florine Dellapiazza: Centre Ressources Autisme, CHU, F-34000 Montpellier, France; Univ Paul Valéry Montpellier 3, Univ. Montpellier, EPSYLON EA 4556, 34000 Montpellier, France; Centre de Recherche en Épidémiologie et Santé des Populations, UMR1178, INSERM, Paris, France
- Christelle Vernhet: Centre Ressources Autisme, CHU, F-34000 Montpellier, France; Univ Paul Valéry Montpellier 3, Univ. Montpellier, EPSYLON EA 4556, 34000 Montpellier, France; Centre de Recherche en Épidémiologie et Santé des Populations, UMR1178, INSERM, Paris, France
- Nathalie Blanc: Univ Paul Valéry Montpellier 3, Univ. Montpellier, EPSYLON EA 4556, 34000 Montpellier, France
- Stéphanie Miot: Centre Ressources Autisme, CHU, F-34000 Montpellier, France; Centre de Recherche en Épidémiologie et Santé des Populations, UMR1178, INSERM, Paris, France
- Richard Schmidt: Department of Psychology, College of the Holy Cross, Worcester, MA, USA
- Amaria Baghdadli: Centre Ressources Autisme, CHU, F-34000 Montpellier, France; Centre de Recherche en Épidémiologie et Santé des Populations, UMR1178, INSERM, Paris, France; Université de Médecine Montpellier, France
23
Feldman JI, Dunham K, Cassidy M, Wallace MT, Liu Y, Woynaroski TG. Audiovisual multisensory integration in individuals with autism spectrum disorder: A systematic review and meta-analysis. Neurosci Biobehav Rev 2018; 95:220-234. [PMID: 30287245] [PMCID: PMC6291229] [DOI: 10.1016/j.neubiorev.2018.09.020]
Abstract
An ever-growing literature has aimed to determine how individuals with autism spectrum disorder (ASD) differ from their typically developing (TD) peers on measures of multisensory integration (MSI) and to ascertain the degree to which differences in MSI are associated with the broad range of symptoms associated with ASD. Findings, however, have been highly variable across the studies carried out to date. The present work systematically reviews and quantitatively synthesizes the large literature on audiovisual MSI in individuals with ASD to evaluate the cumulative evidence for (a) group differences between individuals with ASD and TD peers, (b) correlations between MSI and autism symptoms in individuals with ASD and (c) study level factors that may moderate findings (i.e., explain differential effects) observed across studies. To identify eligible studies, a comprehensive search strategy was employed using the ProQuest search engine, PubMed database, forwards and backwards citation searches, direct author contact, and hand-searching of select conference proceedings. A significant between-group difference in MSI was evident in the literature, with individuals with ASD demonstrating worse audiovisual integration on average across studies compared to TD controls. This effect was moderated by mean participant age, such that between-group differences were more pronounced in younger samples. The mean correlation between MSI and autism and related symptomatology was also significant, indicating that increased audiovisual integration in individuals with ASD is associated with better language/communication abilities and/or reduced autism symptom severity in the extant literature. This effect was moderated by whether the stimuli were linguistic versus non-linguistic in nature, such that correlation magnitudes tended to be significantly greater when linguistic stimuli were utilized in the measure of MSI. Limitations and future directions for primary and meta-analytic research are discussed.
Affiliation(s)
- Jacob I Feldman
- Department of Hearing and Speech Sciences, Vanderbilt University, 1215 21st Ave S, MCE South Tower 8310, Nashville, TN, 37232, USA.
- Kacie Dunham
- Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA
- Margaret Cassidy
- Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA
- Mark T Wallace
- Department of Psychology, Vanderbilt University, Nashville, TN, USA; Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Pharmacology, Vanderbilt University, Nashville, TN, USA; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, 110 Magnolia Cir, Nashville, TN, 37203, USA; Vanderbilt Brain Institute, Vanderbilt University, 465 21st Avenue South, Nashville, TN, 37232, USA; Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, 1215 21st Ave S, MCE South Tower 8310, Nashville, TN, 37232, USA.
- Yupeng Liu
- Neuroscience Undergraduate Program, Vanderbilt University, Nashville, TN, USA
- Tiffany G Woynaroski
- Vanderbilt Kennedy Center, Vanderbilt University Medical Center, 110 Magnolia Cir, Nashville, TN, 37203, USA; Vanderbilt Brain Institute, Vanderbilt University, 465 21st Avenue South, Nashville, TN, 37232, USA; Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, 1215 21st Ave S, MCE South Tower 8310, Nashville, TN, 37232, USA.
24
Individual differences and the effect of face configuration information in the McGurk effect. Exp Brain Res 2018; 236:973-984. [PMID: 29383400] [DOI: 10.1007/s00221-018-5188-4]
Abstract
The McGurk effect, which denotes the influence of visual information on audiovisual speech perception, is less frequently observed in individuals with autism spectrum disorder (ASD) than in those without it; the reason for this remains unclear. Several studies have suggested that facial configuration context might play a role in this difference. More specifically, people with ASD show a local processing bias for faces; that is, they process global face information to a lesser extent. This study examined the role of facial configuration context in the McGurk effect in 46 healthy students. Adopting an analogue approach using the Autism-Spectrum Quotient (AQ), we sought to determine whether this facial configuration context is crucial to previously observed reductions in the McGurk effect in people with ASD. Lip-reading and audiovisual syllable identification tasks were assessed via presentation of upright normal, inverted normal, upright Thatcher-type, and inverted Thatcher-type faces. When the Thatcher-type face was presented, perceivers were sensitive to the misoriented facial characteristics, causing them to perceive a weaker McGurk effect than when the normal face was presented (this is known as the McThatcher effect). Additionally, the McGurk effect was weaker in individuals with high AQ scores than in those with low AQ scores in the incongruent audiovisual condition, regardless of their ability to read lips or process facial configuration contexts. Our findings, therefore, do not support the assumption that individuals with ASD show a weaker McGurk effect due to difficulty in processing facial configuration context.
25
Irwin J, Avery T, Brancazio L, Turcios J, Ryherd K, Landi N. Electrophysiological Indices of Audiovisual Speech Perception: Beyond the McGurk Effect and Speech in Noise. Multisens Res 2018; 31:39-56. [PMID: 31264595] [DOI: 10.1163/22134808-00002580]
Abstract
Visual information on a talker's face can influence what a listener hears. Commonly used approaches to study this include mismatched audiovisual stimuli (e.g., McGurk-type stimuli) or visual speech in auditory noise. In this paper we discuss potential limitations of these approaches and introduce a novel visual phonemic restoration method. This method always presents the same visual stimulus (e.g., /ba/) dubbed with either a matched auditory stimulus (/ba/) or one in which the consonantal information has been weakened so that it sounds more /a/-like. When this reduced auditory stimulus (/a/) is dubbed with the visual /ba/, a visual influence will effectively 'restore' the weakened auditory cues so that the stimulus is perceived as /ba/. We used an oddball design in which participants detected the /a/ among a stream of more frequently occurring /ba/s while viewing either a speaking face or a face with no visual speech. In addition, the same paradigm was presented for a second contrast, in which participants detected /pa/ among /ba/s, a contrast that should be unaltered by the presence of visual speech. Behavioral and some ERP findings reflect the expected phonemic restoration for the /ba/ vs. /a/ contrast; specifically, we observed reduced accuracy and P300 response in the presence of visual speech. Further, we report an unexpected finding of reduced accuracy and P300 response for both speech contrasts in the presence of visual speech, suggesting overall modulation of the auditory signal in the presence of visual speech. Consistent with this, we observed a mismatch negativity (MMN) effect for the /ba/ vs. /pa/ contrast only, which was larger in the absence of visual speech. We discuss the potential utility of this paradigm for listeners who cannot respond actively, such as infants and individuals with developmental disabilities.
Affiliation(s)
- Julia Irwin
- Haskins Laboratories, New Haven, CT, USA; Southern Connecticut State University, New Haven, CT, USA
- Trey Avery
- Haskins Laboratories, New Haven, CT, USA
- Lawrence Brancazio
- Haskins Laboratories, New Haven, CT, USA; Southern Connecticut State University, New Haven, CT, USA
- Jacqueline Turcios
- Haskins Laboratories, New Haven, CT, USA; Southern Connecticut State University, New Haven, CT, USA
- Kayleigh Ryherd
- Haskins Laboratories, New Haven, CT, USA; University of Connecticut, Storrs, CT, USA
- Nicole Landi
- Haskins Laboratories, New Haven, CT, USA; University of Connecticut, Storrs, CT, USA
26
Beker S, Foxe JJ, Molholm S. Ripe for solution: Delayed development of multisensory processing in autism and its remediation. Neurosci Biobehav Rev 2018; 84:182-192. [PMID: 29162518] [PMCID: PMC6389331] [DOI: 10.1016/j.neubiorev.2017.11.008]
Abstract
Difficulty integrating inputs from different sensory sources is commonly reported in individuals with Autism Spectrum Disorder (ASD). Accumulating evidence consistently points to altered patterns of behavioral reactions and neural activity when individuals with ASD observe or act upon information arriving through multiple sensory systems. For example, impairments in the integration of seen and heard speech appear to be particularly acute, with obvious implications for interpersonal communication. Here, we explore the literature on multisensory processing in autism with a focus on developmental trajectories. While much remains to be understood, some consistent observations emerge. Broadly, sensory integration deficits are found in children with an ASD whereas these appear to be much ameliorated, or even fully recovered, in older teenagers and adults on the spectrum. This protracted delay in the development of multisensory processing raises the possibility of applying early intervention strategies focused on multisensory integration, to accelerate resolution of these functions. We also consider how dysfunctional cross-sensory oscillatory neural communication may be one key pathway to impaired multisensory processing in ASD.
Affiliation(s)
- Shlomit Beker
- The Sheryl and Daniel R. Tishman Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine, Bronx, NY, United States; Rose F. Kennedy Intellectual and Developmental Disabilities Research Center (IDDRC), Department of Neuroscience, Albert Einstein College of Medicine, Bronx, NY, United States
- John J Foxe
- The Sheryl and Daniel R. Tishman Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine, Bronx, NY, United States; Rose F. Kennedy Intellectual and Developmental Disabilities Research Center (IDDRC), Department of Neuroscience, Albert Einstein College of Medicine, Bronx, NY, United States; The Ernest J. Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester Medical Center, Rochester, NY, United States
- Sophie Molholm
- The Sheryl and Daniel R. Tishman Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine, Bronx, NY, United States; Rose F. Kennedy Intellectual and Developmental Disabilities Research Center (IDDRC), Department of Neuroscience, Albert Einstein College of Medicine, Bronx, NY, United States; The Ernest J. Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester Medical Center, Rochester, NY, United States.
27
Thye MD, Bednarz HM, Herringshaw AJ, Sartin EB, Kana RK. The impact of atypical sensory processing on social impairments in autism spectrum disorder. Dev Cogn Neurosci 2018; 29:151-167. [PMID: 28545994] [PMCID: PMC6987885] [DOI: 10.1016/j.dcn.2017.04.010]
Abstract
Altered sensory processing has been an important feature of the clinical descriptions of autism spectrum disorder (ASD). There is evidence that sensory dysregulation arises early in the progression of ASD and impacts social functioning. This paper reviews behavioral and neurobiological evidence that describes how sensory deficits across multiple modalities (vision, hearing, touch, olfaction, gustation, and multisensory integration) could impact social functions in ASD. Theoretical models of ASD and their implications for the relationship between sensory and social functioning are discussed. Furthermore, neural differences in anatomy, function, and connectivity of different regions underlying sensory and social processing are also discussed. We conclude that there are multiple mechanisms through which early sensory dysregulation in ASD could cascade into social deficits across development. Future research is needed to clarify these mechanisms, and specific focus should be given to distinguish between deficits in primary sensory processing and altered top-down attentional and cognitive processes.
Affiliation(s)
- Melissa D Thye
- Department of Psychology, University of Alabama at Birmingham, Birmingham, AL 35233, United States
- Haley M Bednarz
- Department of Psychology, University of Alabama at Birmingham, Birmingham, AL 35233, United States
- Abbey J Herringshaw
- Department of Psychology, University of Alabama at Birmingham, Birmingham, AL 35233, United States
- Emma B Sartin
- Department of Psychology, University of Alabama at Birmingham, Birmingham, AL 35233, United States
- Rajesh K Kana
- Department of Psychology, University of Alabama at Birmingham, Birmingham, AL 35233, United States.
28
Irwin J, Avery T, Turcios J, Brancazio L, Cook B, Landi N. Electrophysiological Indices of Audiovisual Speech Perception in the Broader Autism Phenotype. Brain Sci 2017; 7:E60. [PMID: 28574442] [PMCID: PMC5483633] [DOI: 10.3390/brainsci7060060]
Abstract
When a speaker talks, the consequences of this can both be heard (audio) and seen (visual). A novel visual phonemic restoration task was used to assess behavioral discrimination and neural signatures (event-related potentials, or ERP) of audiovisual processing in typically developing children with a range of social and communicative skills assessed using the social responsiveness scale, a measure of traits associated with autism. An auditory oddball design presented two types of stimuli to the listener, a clear exemplar of an auditory consonant-vowel syllable /ba/ (the more frequently occurring standard stimulus), and a syllable in which the auditory cues for the consonant were substantially weakened, creating a stimulus which is more like /a/ (the infrequently presented deviant stimulus). All speech tokens were paired with a face producing /ba/ or a face with a pixelated mouth containing motion but no visual speech. In this paradigm, the visual /ba/ should cause the auditory /a/ to be perceived as /ba/, creating an attenuated oddball response; in contrast, a pixelated video (without articulatory information) should not have this effect. Behaviorally, participants showed visual phonemic restoration (reduced accuracy in detecting deviant /a/) in the presence of a speaking face. In addition, ERPs were observed in both an early time window (N100) and a later time window (P300) that were sensitive to speech context (/ba/ or /a/) and modulated by face context (speaking face with visible articulation or with pixelated mouth). Specifically, the oddball responses for the N100 and P300 were attenuated in the presence of a face producing /ba/ relative to a pixelated face, representing a possible neural correlate of the phonemic restoration effect. Notably, those individuals with more traits associated with autism (yet still in the non-clinical range) had smaller P300 responses overall, regardless of face context, suggesting generally reduced phonemic discrimination.
Affiliation(s)
- Julia Irwin
- Haskins Laboratories, New Haven, CT 06511, USA.
- Department of Psychology, Southern Connecticut State University, New Haven, CT 06515, USA.
- Trey Avery
- Haskins Laboratories, New Haven, CT 06511, USA.
- Jacqueline Turcios
- Haskins Laboratories, New Haven, CT 06511, USA.
- Department of Communication Disorders, Southern Connecticut State University, New Haven, CT 06515, USA.
- Lawrence Brancazio
- Haskins Laboratories, New Haven, CT 06511, USA.
- Department of Psychology, Southern Connecticut State University, New Haven, CT 06515, USA.
- Barbara Cook
- Department of Communication Disorders, Southern Connecticut State University, New Haven, CT 06515, USA.
- Nicole Landi
- Haskins Laboratories, New Haven, CT 06511, USA.
- Psychological Sciences, University of Connecticut, Storrs, CT 06269, USA.
29
Multisensory Integration of Low-level Information in Autism Spectrum Disorder: Measuring Susceptibility to the Flash-Beep Illusion. J Autism Dev Disord 2017; 47:2535-2543. [DOI: 10.1007/s10803-017-3172-7]
30
Stevenson RA, Segers M, Ncube BL, Black KR, Bebko JM, Ferber S, Barense MD. The cascading influence of multisensory processing on speech perception in autism. Autism 2017; 22:609-624. [PMID: 28506185] [DOI: 10.1177/1362361317704413]
Abstract
It has been recently theorized that atypical sensory processing in autism relates to difficulties in social communication. Through a series of tasks concurrently assessing multisensory temporal processes, multisensory integration and speech perception in 76 children with and without autism, we provide the first behavioral evidence of such a link. Temporal processing abilities in children with autism contributed to impairments in speech perception. This relationship was significantly mediated by their abilities to integrate social information across auditory and visual modalities. These data describe the cascading impact of sensory abilities in autism, whereby temporal processing impacts multisensory integration of social information, which, in turn, contributes to deficits in speech perception. These relationships were found to be specific to autism, specific to multisensory but not unisensory integration, and specific to the processing of social information.
Affiliation(s)
- Susanne Ferber
- University of Toronto, Canada; Rotman Research Institute at Baycrest, Canada
- Morgan D Barense
- University of Toronto, Canada; Rotman Research Institute at Baycrest, Canada
31
Tillmann J, Olguin A, Tuomainen J, Swettenham J. The effect of visual perceptual load on auditory awareness in autism spectrum disorder. J Autism Dev Disord 2016; 45:3297-307. [PMID: 26043848] [DOI: 10.1007/s10803-015-2491-9]
Abstract
Recent work on visual selective attention has shown that individuals with Autism Spectrum Disorder (ASD) demonstrate an increased perceptual capacity. The current study examined whether increasing visual perceptual load also has less of an effect on auditory awareness in children with ASD. Participants performed either a high- or low load version of a line discrimination task. On a critical trial, an unexpected, task-irrelevant auditory stimulus was played concurrently with the visual stimulus. In contrast to typically developing (TD) children, children with ASD demonstrated similar detection rates across perceptual load conditions, and reported greater awareness than TD children in the high perceptual load condition. These findings suggest an increased perceptual capacity in children with ASD that operates across sensory modalities.
Affiliation(s)
- Julian Tillmann
- Department of Developmental Science, University College London, 2 Wakefield Street, WC1N 1PF, London, UK.
- Andrea Olguin
- Department of Developmental Science, University College London, 2 Wakefield Street, WC1N 1PF, London, UK
- Jyrki Tuomainen
- Speech, Hearing & Phonetic Sciences, University College London, 2 Wakefield Street, WC1N 1PF, London, UK
- John Swettenham
- Department of Developmental Science, University College London, 2 Wakefield Street, WC1N 1PF, London, UK
32
Abstract
It has been suggested that the sensory symptoms which affect many people with autism spectrum conditions (ASC) may be related to alterations in multisensory processing. Typically, the likelihood of interactions between the senses increases when information is temporally and spatially coincident. We explored visual-tactile interactions in adults with ASC for the first time in two experiments using low-level stimuli. Both participants with ASC and matched neurotypical controls only produced crossmodal interactions to near simultaneous stimuli, suggesting that temporal modulation is unaffected in the adult population. We also provide preliminary evidence that visual-tactile interactions may occur over greater spatial distances in participants with ASC, which merits further exploration.
33
Hames EC, Murphy B, Rajmohan R, Anderson RC, Baker M, Zupancic S, O'Boyle M, Richman D. Visual, Auditory, and Cross Modal Sensory Processing in Adults with Autism: An EEG Power and BOLD fMRI Investigation. Front Hum Neurosci 2016; 10:167. [PMID: 27148020] [PMCID: PMC4835455] [DOI: 10.3389/fnhum.2016.00167]
Abstract
Electroencephalography (EEG) and blood oxygen level dependent functional magnetic resonance imaging (BOLD fMRI) assessed the neurocorrelates of sensory processing of visual and auditory stimuli in 11 adults with autism (ASD) and 10 neurotypical (NT) controls between the ages of 20-28. We hypothesized that ASD performance on combined audiovisual trials would be less accurate, with observable decreased EEG power across frontal, temporal, and occipital channels and decreased BOLD fMRI activity in these same regions, reflecting deficits in key sensory processing areas. Analysis focused on EEG power, BOLD fMRI, and accuracy. Lower EEG beta power and lower left auditory cortex fMRI activity were seen in ASD compared to NT when they were presented with auditory stimuli, as demonstrated by contrasting the activity from the second presentation of an auditory stimulus in an all-auditory block vs. the second presentation of a visual stimulus in an all-visual block (AA2-VV2). We conclude that in ASD, combined audiovisual processing is more similar than unimodal processing to NTs.
Affiliation(s)
- Elizabeth C. Hames
- Department of Electrical and Computer Engineering, Texas Tech University, Lubbock, TX, USA
- Brandi Murphy
- Department of Audiology, Texas Tech University Health Sciences Center, Lubbock, TX, USA
- Ravi Rajmohan
- Department of Pharmacology and Neuroscience, Texas Tech University Health Sciences Center, Lubbock, TX, USA
- Ronald C. Anderson
- Department of Electrical and Computer Engineering, Texas Tech University, Lubbock, TX, USA
- Mary Baker
- Department of Electrical and Computer Engineering, Texas Tech University, Lubbock, TX, USA
- Stephen Zupancic
- Department of Audiology, Texas Tech University Health Sciences Center, Lubbock, TX, USA
- Michael O'Boyle
- College of Human Sciences, Texas Tech University, Lubbock, TX, USA
- David Richman
- Burkhart Center for Autism Education and Research, Texas Tech University, Lubbock, TX, USA
34
Seeing the Forest and the Trees: Default Local Processing in Individuals with High Autistic Traits Does Not Come at the Expense of Global Attention. J Autism Dev Disord 2016; 48:1382-1396. [DOI: 10.1007/s10803-016-2711-y]
35
Baum SH, Stevenson RA, Wallace MT. Behavioral, perceptual, and neural alterations in sensory and multisensory function in autism spectrum disorder. Prog Neurobiol 2015; 134:140-60. [PMID: 26455789] [PMCID: PMC4730891] [DOI: 10.1016/j.pneurobio.2015.09.007]
Abstract
Although sensory processing challenges have been noted since the first clinical descriptions of autism, it has taken until the release of the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) in 2013 for sensory problems to be included as part of the core symptoms of autism spectrum disorder (ASD) in the diagnostic profile. Because sensory information forms the building blocks for higher-order social and cognitive functions, we argue that sensory processing is not only an additional piece of the puzzle, but rather a critical cornerstone for characterizing and understanding ASD. In this review we discuss what is currently known about sensory processing in ASD, how sensory function fits within contemporary models of ASD, and what is understood about the differences in the underlying neural processing of sensory and social communication observed between individuals with and without ASD. In addition to highlighting the sensory features associated with ASD, we also emphasize the importance of multisensory processing in building perceptual and cognitive representations, and how deficits in multisensory integration may also be a core characteristic of ASD.
Affiliation(s)
- Sarah H Baum
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Ryan A Stevenson
- Department of Psychology, University of Toronto, Toronto, ON, Canada
- Mark T Wallace
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, USA; Department of Psychology, Vanderbilt University, Nashville, TN, USA; Department of Psychiatry, Vanderbilt University, Nashville, TN, USA.
36
Stevenson RA, Segers M, Ferber S, Barense MD, Camarata S, Wallace MT. Keeping time in the brain: Autism spectrum disorder and audiovisual temporal processing. Autism Res 2015; 9:720-38. [PMID: 26402725] [DOI: 10.1002/aur.1566]
Abstract
A growing area of interest and relevance in the study of autism spectrum disorder (ASD) focuses on the relationship between multisensory temporal function and the behavioral, perceptual, and cognitive impairments observed in ASD. Atypical sensory processing is becoming increasingly recognized as a core component of autism, with evidence of atypical processing across a number of sensory modalities. These deviations from typical processing underscore the value of interpreting ASD within a multisensory framework. Furthermore, converging evidence illustrates that these differences in audiovisual processing may be specifically related to temporal processing. This review seeks to bridge the connection between temporal processing and audiovisual perception, and to elaborate on emerging data showing differences in audiovisual temporal function in autism. We also discuss the consequence of such changes, the specific impact on the processing of different classes of audiovisual stimuli (e.g. speech vs. nonspeech, etc.), and the presumptive brain processes and networks underlying audiovisual temporal integration. Finally, possible downstream behavioral implications, and possible remediation strategies are outlined.
Affiliation(s)
- Ryan A Stevenson
- Department of Psychology, University of Toronto, Toronto, Ontario, Canada
- Magali Segers
- Department of Psychology, York University, Toronto, Ontario, Canada
- Susanne Ferber
- Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Rotman Research Institute, Toronto, Ontario, Canada
- Morgan D Barense
- Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Rotman Research Institute, Toronto, Ontario, Canada
- Stephen Camarata
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, Tennessee
- Mark T Wallace
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Brain Institute, Vanderbilt University Medical Center, Nashville, Tennessee; Department of Psychology, Vanderbilt University, Nashville, Tennessee; Department of Psychiatry, Vanderbilt University Medical Center, Nashville, Tennessee
37
Ujiie Y, Asai T, Wakabayashi A. The relationship between level of autistic traits and local bias in the context of the McGurk effect. Front Psychol 2015; 6:891. [PMID: 26175705] [PMCID: PMC4484977] [DOI: 10.3389/fpsyg.2015.00891]
Abstract
The McGurk effect is a well-known illustration that demonstrates the influence of visual information on hearing in the context of speech perception. Some studies have reported that individuals with autism spectrum disorder (ASD) display abnormal processing of audio-visual speech integration, while other studies showed contradictory results. Based on the dimensional model of ASD, we administered two analog studies to examine the link between level of autistic traits, as assessed by the Autism Spectrum Quotient (AQ), and the McGurk effect among a sample of university students. In the first experiment, we found that autistic traits correlated negatively with fused (McGurk) responses. Then, we manipulated presentation types of visual stimuli to examine whether the local bias toward visual speech cues modulated individual differences in the McGurk effect. The presentation included four types of visual images, comprising no image, mouth only, mouth and eyes, and full face. The results revealed that global facial information facilitates the influence of visual speech cues on McGurk stimuli. Moreover, individual differences between groups with low and high levels of autistic traits appeared when the full-face visual speech cue with an incongruent voice condition was presented. These results suggest that individual differences in the McGurk effect might be due to a weak ability to process global facial information in individuals with high levels of autistic traits.
Affiliation(s)
- Yuta Ujiie
- Information Processing and Computer Sciences, Graduate School of Advanced Integration Science, Chiba University, Chiba, Japan; Japan Society for the Promotion of Science, Tokyo, Japan
- Tomohisa Asai
- NTT Communication Science Laboratories, NTT Corporation, Kanagawa, Japan
38
Stevenson RA, Siemann JK, Woynaroski TG, Schneider BC, Eberly HE, Camarata SM, Wallace MT. Evidence for diminished multisensory integration in autism spectrum disorders. J Autism Dev Disord 2015; 44:3161-7. [PMID: 25022248] [DOI: 10.1007/s10803-014-2179-6]
Abstract
Individuals with autism spectrum disorders (ASD) exhibit alterations in sensory processing, including changes in the integration of information across the different sensory modalities. In the current study, we used the sound-induced flash illusion to assess multisensory integration in children with ASD and typically-developing (TD) controls. Thirty-one children with ASD and 31 age and IQ matched TD children (average age = 12 years) were presented with simple visual (i.e., flash) and auditory (i.e., beep) stimuli of varying number. In illusory conditions, a single flash was presented with 2-4 beeps. In TD children, these conditions generally result in the perception of multiple flashes, implying a perceptual fusion across vision and audition. In the present study, children with ASD were significantly less likely to perceive the illusion relative to TD controls, suggesting that multisensory integration and cross-modal binding may be weaker in some children with ASD. These results are discussed in the context of previous findings for multisensory integration in ASD and future directions for research.
Affiliation(s)
- Ryan A Stevenson
- Department of Hearing and Speech Sciences, Vanderbilt Kennedy Center, Vanderbilt Brain Institute, Vanderbilt University Medical Center, Nashville, TN, USA

39
Grossman RB, Steinhart E, Mitchell T, McIlvane W. "Look who's talking!" Gaze Patterns for Implicit and Explicit Audio-Visual Speech Synchrony Detection in Children With High-Functioning Autism. Autism Res 2015; 8:307-16. [PMID: 25620208 DOI: 10.1002/aur.1447] [Citation(s) in RCA: 34] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2013] [Accepted: 11/25/2014] [Indexed: 11/11/2022]
Abstract
Conversation requires integration of information from faces and voices to fully understand the speaker's message. To detect auditory-visual asynchrony of speech, listeners must integrate visual movements of the face, particularly the mouth, with auditory speech information. Individuals with autism spectrum disorder may be less successful at such multisensory integration, despite their demonstrated preference for looking at the mouth region of a speaker. We showed participants (individuals with and without high-functioning autism (HFA) aged 8-19) a split-screen video of two identical individuals speaking side by side. Only one of the speakers was in synchrony with the corresponding audio track and synchrony switched between the two speakers every few seconds. Participants were asked to watch the video without further instructions (implicit condition) or to specifically watch the in-synch speaker (explicit condition). We recorded which part of the screen and face their eyes targeted. Both groups looked at the in-synch video significantly more with explicit instructions. However, participants with HFA looked at the in-synch video less than typically developing (TD) peers and did not increase their gaze time as much as TD participants in the explicit task. Importantly, the HFA group looked significantly less at the mouth than their TD peers, and significantly more at non-face regions of the image. There were no between-group differences for eye-directed gaze. Overall, individuals with HFA spend less time looking at the crucially important mouth region of the face during auditory-visual speech integration, which is maladaptive gaze behavior for this type of task.
Affiliation(s)
- Ruth B Grossman
- Emerson College, Department of Communication Sciences and Disorders, 120 Boylston Street, Boston, Massachusetts; University of Massachusetts Medical School Shriver Center, 200 Trapelo Rd, Waltham, Massachusetts
- Erin Steinhart
- University of Massachusetts Medical School Shriver Center, 200 Trapelo Rd, Waltham, Massachusetts
- Teresa Mitchell
- University of Massachusetts Medical School Shriver Center, 200 Trapelo Rd, Waltham, Massachusetts
- William McIlvane
- University of Massachusetts Medical School Shriver Center, 200 Trapelo Rd, Waltham, Massachusetts

40
Visual abilities are important for auditory-only speech recognition: Evidence from autism spectrum disorder. Neuropsychologia 2014; 65:1-11. [DOI: 10.1016/j.neuropsychologia.2014.09.031] [Citation(s) in RCA: 24] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/11/2014] [Revised: 08/25/2014] [Accepted: 09/18/2014] [Indexed: 11/22/2022]
41
Wallace MT, Stevenson RA. The construct of the multisensory temporal binding window and its dysregulation in developmental disabilities. Neuropsychologia 2014; 64:105-23. [PMID: 25128432 PMCID: PMC4326640 DOI: 10.1016/j.neuropsychologia.2014.08.005] [Citation(s) in RCA: 195] [Impact Index Per Article: 19.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/26/2014] [Revised: 08/04/2014] [Accepted: 08/05/2014] [Indexed: 01/18/2023]
Abstract
Behavior, perception and cognition are strongly shaped by the synthesis of information across the different sensory modalities. Such multisensory integration often results in performance and perceptual benefits that reflect the additional information conferred by having cues from multiple senses providing redundant or complementary information. The spatial and temporal relationships of these cues provide powerful statistical information about how they should be integrated or "bound" in order to create a unified perceptual representation. Much recent work has examined the temporal factors that are integral to multisensory processing, much of it focused on the construct of the multisensory temporal binding window - the epoch of time within which stimuli from different modalities are likely to be integrated and perceptually bound. Emerging evidence suggests that this temporal window is altered in a series of neurodevelopmental disorders, including autism, dyslexia and schizophrenia. Beyond their role in sensory processing, these deficits in multisensory temporal function may play an important role in the perceptual and cognitive weaknesses that characterize these clinical disorders. Within this context, a focus on improving the acuity of multisensory temporal function may have important implications for the amelioration of the "higher-order" deficits that serve as the defining features of these disorders.
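Operationally, a temporal binding window is often estimated from simultaneity judgments across audiovisual stimulus-onset asynchronies (SOAs). The sketch below uses invented response proportions and a deliberately simple definition (the SOA range where "synchronous" reports exceed 50%, via linear interpolation); published work typically fits Gaussian or sigmoid functions instead:

```python
# Invented synchrony-judgment data: SOA in ms (negative = audio leads)
# and the proportion of trials judged "synchronous" at that SOA.
soas = [-400, -300, -200, -100, 0, 100, 200, 300, 400]
p_sync = [0.05, 0.20, 0.55, 0.85, 0.95, 0.90, 0.70, 0.35, 0.10]

def tbw_width(soas, p_sync, threshold=0.5):
    """Width (ms) of the SOA range where p_sync exceeds the threshold,
    found by linear interpolation between sampled SOAs."""
    crossings = []
    for (x0, y0), (x1, y1) in zip(zip(soas, p_sync), zip(soas[1:], p_sync[1:])):
        if (y0 - threshold) * (y1 - threshold) < 0:  # threshold crossed here
            crossings.append(x0 + (threshold - y0) * (x1 - x0) / (y1 - y0))
    return crossings[-1] - crossings[0] if len(crossings) >= 2 else float("nan")

print(f"Estimated TBW width: {tbw_width(soas, p_sync):.0f} ms")
```

A wider estimated window under this kind of measure is what the literature reviewed here describes as "dysregulated" temporal binding.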
Affiliation(s)
- Mark T Wallace
- Vanderbilt Brain Institute, Vanderbilt University, 465 21st Avenue South, Nashville, TN 37232, USA; Department of Hearing & Speech Sciences, Vanderbilt University, Nashville, TN, USA; Department of Psychology, Vanderbilt University, Nashville, TN, USA; Department of Psychiatry, Vanderbilt University, Nashville, TN, USA
- Ryan A Stevenson
- Department of Psychology, University of Toronto, Toronto, ON, Canada

42
Woynaroski TG, Kwakye LD, Foss-Feig JH, Stevenson RA, Stone WL, Wallace MT. Multisensory speech perception in children with autism spectrum disorders. J Autism Dev Disord 2013; 43:2891-902. [PMID: 23624833 DOI: 10.1007/s10803-013-1836-5] [Citation(s) in RCA: 108] [Impact Index Per Article: 10.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/19/2022]
Abstract
This study examined unisensory and multisensory speech perception in 8-17 year old children with autism spectrum disorders (ASD) and typically developing controls matched on chronological age, sex, and IQ. Consonant-vowel syllables were presented in visual only, auditory only, matched audiovisual, and mismatched audiovisual ("McGurk") conditions. Participants with ASD displayed deficits in visual only and matched audiovisual speech perception. Additionally, children with ASD reported a visual influence on heard speech in response to mismatched audiovisual syllables over a wider window of time relative to controls. Correlational analyses revealed associations between multisensory speech perception, communicative characteristics, and responses to sensory stimuli in ASD. Results suggest atypical speech perception is linked to broader behavioral characteristics of ASD.
Affiliation(s)
- Tiffany G Woynaroski
- Department of Hearing and Speech Sciences, Vanderbilt University, 1211 Medical Center Drive, Nashville, TN 37232, USA

43
Stevenson RA, Siemann JK, Woynaroski TG, Schneider BC, Eberly HE, Camarata SM, Wallace MT. Brief report: Arrested development of audiovisual speech perception in autism spectrum disorders. J Autism Dev Disord 2014; 44:1470-7. [PMID: 24218241 PMCID: PMC4018423 DOI: 10.1007/s10803-013-1992-7] [Citation(s) in RCA: 52] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/26/2022]
Abstract
Atypical communicative abilities are a core marker of Autism Spectrum Disorders (ASD). A number of studies have shown that, in addition to auditory comprehension differences, individuals with autism frequently show atypical responses to audiovisual speech, suggesting a multisensory contribution to these communicative differences from their typically developing peers. To shed light on possible differences in the maturation of audiovisual speech integration, we tested younger (ages 6-12) and older (ages 13-18) children with and without ASD on a task indexing such multisensory integration. To do this, we used the McGurk effect, in which the pairing of incongruent auditory and visual speech tokens typically results in the perception of a fused percept distinct from the auditory and visual signals, indicative of active integration of the two channels conveying speech information. Whereas little difference was seen in audiovisual speech processing (i.e., reports of McGurk fusion) between the younger ASD and TD groups, there was a significant difference at the older ages. While TD controls exhibited an increased rate of fusion (i.e., integration) with age, children with ASD failed to show this increase. These data suggest arrested development of audiovisual speech integration in ASD. The results are discussed in light of the extant literature and necessary next steps in research.
Affiliation(s)
- Ryan A Stevenson
- Department of Hearing and Speech Sciences, Vanderbilt Kennedy Center, Vanderbilt Brain Institute, Vanderbilt University Medical Center, 7110 MRB III BioSci Bldg, 465 21st Ave South, Nashville, TN 37232, USA

44
Irwin JR, Brancazio L. Seeing to hear? Patterns of gaze to speaking faces in children with autism spectrum disorders. Front Psychol 2014; 5:397. [PMID: 24847297 PMCID: PMC4021198 DOI: 10.3389/fpsyg.2014.00397] [Citation(s) in RCA: 24] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2014] [Accepted: 04/15/2014] [Indexed: 11/13/2022] Open
Abstract
Using eye-tracking methodology, gaze to a speaking face was compared in a group of children with autism spectrum disorders (ASD) and a group with typical development (TD). Patterns of gaze were observed under three conditions: audiovisual (AV) speech in auditory noise, visual only speech and an AV non-face, non-speech control. Children with ASD looked less to the face of the speaker and fixated less on the speakers’ mouth than TD controls. No differences in gaze were reported for the non-face, non-speech control task. Since the mouth holds much of the articulatory information available on the face, these findings suggest that children with ASD may have reduced access to critical linguistic information. This reduced access to visible articulatory information could be a contributor to the communication and language problems exhibited by children with ASD.
Affiliation(s)
- Julia R Irwin
- Haskins Laboratories, New Haven, CT, USA; Department of Psychology, Southern Connecticut State University, New Haven, CT, USA
- Lawrence Brancazio
- Haskins Laboratories, New Haven, CT, USA; Department of Psychology, Southern Connecticut State University, New Haven, CT, USA

46
Bebko JM, Schroeder JH, Weiss JA. The McGurk effect in children with autism and Asperger syndrome. Autism Res 2014; 7:50-9. [PMID: 24136870 DOI: 10.1002/aur.1343] [Citation(s) in RCA: 44] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2013] [Accepted: 09/03/2013] [Indexed: 11/09/2022]
Abstract
Children with autism may have difficulties in audiovisual speech perception, which has been linked to speech perception and language development. However, little has been done to examine children with Asperger syndrome as a group on tasks assessing audiovisual speech perception, despite this group's often greater language skills. Samples of children with autism, Asperger syndrome, and Down syndrome, as well as a typically developing sample, were presented with an auditory-only condition, a speech-reading condition, and an audiovisual condition designed to elicit the McGurk effect. Children with autism demonstrated unimodal performance at the same level as the other groups, yet showed a lower rate of the McGurk effect compared with the Asperger, Down and typical samples. These results suggest that children with autism may have unique intermodal speech perception difficulties linked to their representations of speech sounds.
47
Foxe JJ, Molholm S, Del Bene VA, Frey HP, Russo NN, Blanco D, Saint-Amour D, Ross LA. Severe multisensory speech integration deficits in high-functioning school-aged children with Autism Spectrum Disorder (ASD) and their resolution during early adolescence. Cereb Cortex 2015; 25:298-312. [PMID: 23985136 DOI: 10.1093/cercor/bht213] [Citation(s) in RCA: 140] [Impact Index Per Article: 12.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
Abstract
Under noisy listening conditions, visualizing a speaker's articulations substantially improves speech intelligibility. This multisensory speech integration ability is crucial to effective communication, and the appropriate development of this capacity greatly impacts a child's ability to successfully navigate educational and social settings. Research shows that multisensory integration abilities continue developing late into childhood. The primary aim here was to track the development of these abilities in children with autism, since multisensory deficits are increasingly recognized as a component of the autism spectrum disorder (ASD) phenotype. The abilities of high-functioning ASD children (n = 84) to integrate seen and heard speech were assessed cross-sectionally, while environmental noise levels were systematically manipulated, comparing them with age-matched neurotypical children (n = 142). Severe integration deficits were uncovered in ASD, which were increasingly pronounced as background noise increased. These deficits were evident in school-aged ASD children (5-12 year olds), but were fully ameliorated in ASD children entering adolescence (13-15 year olds). The severity of multisensory deficits uncovered has important implications for educators and clinicians working in ASD. We consider the observation that the multisensory speech system recovers substantially in adolescence as an indication that it is likely amenable to intervention during earlier childhood, with potentially profound implications for the development of social communication abilities in ASD children.
Affiliation(s)
- John J Foxe
- Department of Pediatrics, Department of Neuroscience, The Sheryl and Daniel R. Tishman Cognitive Neurophysiology Laboratory, Children's Evaluation and Rehabilitation Center (CERC); Department of Psychology, The Cognitive Neurophysiology Laboratory, Program in Cognitive Neuroscience, City College of the City University of New York, New York, NY 10031, USA; Department of Biology, The Cognitive Neurophysiology Laboratory, Program in Cognitive Neuroscience, City College of the City University of New York, New York, NY 10031, USA
- Sophie Molholm
- Department of Pediatrics, Department of Neuroscience, The Sheryl and Daniel R. Tishman Cognitive Neurophysiology Laboratory, Children's Evaluation and Rehabilitation Center (CERC); Department of Psychology, The Cognitive Neurophysiology Laboratory, Program in Cognitive Neuroscience, City College of the City University of New York, New York, NY 10031, USA; Department of Biology, The Cognitive Neurophysiology Laboratory, Program in Cognitive Neuroscience, City College of the City University of New York, New York, NY 10031, USA
- Victor A Del Bene
- Department of Pediatrics, Department of Neuroscience, The Sheryl and Daniel R. Tishman Cognitive Neurophysiology Laboratory, Children's Evaluation and Rehabilitation Center (CERC); Ferkauf Graduate School of Psychology, Albert Einstein College of Medicine, Bronx, NY 10461, USA
- Hans-Peter Frey
- Department of Pediatrics, Department of Neuroscience, The Sheryl and Daniel R. Tishman Cognitive Neurophysiology Laboratory, Children's Evaluation and Rehabilitation Center (CERC)
- Natalie N Russo
- Department of Pediatrics, Department of Neuroscience, The Sheryl and Daniel R. Tishman Cognitive Neurophysiology Laboratory, Children's Evaluation and Rehabilitation Center (CERC); Department of Psychology, Syracuse University, Syracuse, NY 13244, USA
- Daniella Blanco
- Department of Pediatrics, Department of Neuroscience, The Sheryl and Daniel R. Tishman Cognitive Neurophysiology Laboratory, Children's Evaluation and Rehabilitation Center (CERC); Department of Psychology, The Cognitive Neurophysiology Laboratory, Program in Cognitive Neuroscience, City College of the City University of New York, New York, NY 10031, USA; Department of Biology, The Cognitive Neurophysiology Laboratory, Program in Cognitive Neuroscience, City College of the City University of New York, New York, NY 10031, USA
- Dave Saint-Amour
- Centre de Recherche, CHU Sainte-Justine, 3175 Côte-Sainte-Catherine, Montréal, QC, Canada H3T 1C5; Département de Psychologie, Université du Québec à Montréal (UQAM), Montréal, QC, Canada H3C 3P8
- Lars A Ross
- Department of Pediatrics, Department of Neuroscience, The Sheryl and Daniel R. Tishman Cognitive Neurophysiology Laboratory, Children's Evaluation and Rehabilitation Center (CERC); The Gordon F. Derner Institute of Advanced Psychological Studies, Adelphi University, Garden City, NY 11530, USA

48
Brandwein AB, Foxe JJ, Butler JS, Russo NN, Altschuler TS, Gomes H, Molholm S. The development of multisensory integration in high-functioning autism: high-density electrical mapping and psychophysical measures reveal impairments in the processing of audiovisual inputs. Cereb Cortex 2013; 23:1329-41. [PMID: 22628458 PMCID: PMC3643715 DOI: 10.1093/cercor/bhs109] [Citation(s) in RCA: 148] [Impact Index Per Article: 13.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/01/2023] Open
Abstract
Successful integration of auditory and visual inputs is crucial for both basic perceptual functions and for higher-order processes related to social cognition. Autism spectrum disorders (ASD) are characterized by impairments in social cognition and are associated with abnormalities in sensory and perceptual processes. Several groups have reported that individuals with ASD are impaired in their ability to integrate socially relevant audiovisual (AV) information, and it has been suggested that this contributes to the higher-order social and cognitive deficits observed in ASD. However, successful integration of auditory and visual inputs also influences detection and perception of nonsocial stimuli, and integration deficits may impair earlier stages of information processing, with cascading downstream effects. To assess the integrity of basic AV integration, we recorded high-density electrophysiology from a cohort of high-functioning children with ASD (7-16 years) while they performed a simple AV reaction time task. Children with ASD showed considerably less behavioral facilitation to multisensory inputs, deficits that were paralleled by less effective neural integration. Evidence for processing differences relative to typically developing children was seen as early as 100 ms poststimulation, and topographic analysis suggested that children with ASD relied on different cortical networks during this early multisensory processing stage.
Affiliation(s)
- Alice B Brandwein
- Department of Pediatrics, The Sheryl and Daniel R. Tishman Cognitive Neurophysiology Laboratory, Children's Evaluation and Rehabilitation Center, Albert Einstein College of Medicine, Van Etten Building-Wing 1C, 1225 Morris Park Avenue, Bronx, NY 10461, USA

49
DePape AMR, Hall GBC, Tillmann B, Trainor LJ. Auditory processing in high-functioning adolescents with Autism Spectrum Disorder. PLoS One 2012; 7:e44084. [PMID: 22984462 PMCID: PMC3440400 DOI: 10.1371/journal.pone.0044084] [Citation(s) in RCA: 70] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/12/2012] [Accepted: 07/31/2012] [Indexed: 11/18/2022] Open
Abstract
Autism Spectrum Disorder (ASD) is a pervasive developmental disorder that includes abnormalities in perceptual processing. We measured perception in a battery of tests across speech (filtering, phoneme categorization, multisensory integration) and music (pitch memory, meter categorization, harmonic priming). We found that, compared to controls, the ASD group showed poorer filtering, less audio-visual integration, less specialization for native phonemic and metrical categories, and a higher incidence of absolute pitch. No group differences were found in harmonic priming. Our results are discussed in a developmental framework in which culture-specific knowledge acquired early in development is more impaired than knowledge acquired later, perhaps because of early-accelerated brain growth in ASD. These results suggest that early auditory remediation is needed for good communication and social functioning.
Affiliation(s)
- Anne-Marie R. DePape
- Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, Ontario, Canada
- Geoffrey B. C. Hall
- Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, Ontario, Canada; Psychiatry and Behavioural Neurosciences, McMaster University, Hamilton, Ontario, Canada
- Barbara Tillmann
- Lyon Neuroscience Research Center, Auditory Cognition and Psychoacoustics Team, CNRS-UMR 5292, INSERM U1028, Université Lyon 1, Lyon, Rhône-Alpes, France
- Laurel J. Trainor
- Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, Ontario, Canada; Rotman Research Institute, Baycrest Hospital, Toronto, Ontario, Canada

50
Megnin O, Flitton A, Jones CRG, de Haan M, Baldeweg T, Charman T. Audiovisual speech integration in autism spectrum disorders: ERP evidence for atypicalities in lexical-semantic processing. Autism Res 2012; 5:39-48. [PMID: 22162387 PMCID: PMC3586407 DOI: 10.1002/aur.231] [Citation(s) in RCA: 25] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/12/2010] [Accepted: 08/22/2011] [Indexed: 02/05/2023]
Abstract
In typically developing (TD) individuals, behavioral and event-related potential (ERP) studies suggest that audiovisual (AV) integration enables faster and more efficient processing of speech. However, little is known about AV speech processing in individuals with autism spectrum disorders (ASD). This study examined ERP responses to spoken words to elucidate the effects of visual speech (the lip movements accompanying a spoken word) on the range of auditory speech processing stages from sound onset detection to semantic integration. The study also included an AV condition, which paired spoken words with a dynamic scrambled face in order to highlight AV effects specific to visual speech. Fourteen adolescent boys with ASD (15-17 years old) and 14 age- and verbal IQ-matched TD boys participated. The ERP of the TD group showed a pattern and topography of AV interaction effects consistent with activity within the superior temporal plane, with two dissociable effects over frontocentral and centroparietal regions. The posterior effect (200-300 ms interval) was specifically sensitive to lip movements in TD boys, and no AV modulation was observed in this region for the ASD group. Moreover, the magnitude of the posterior AV effect to visual speech correlated inversely with ASD symptomatology. In addition, the ASD boys showed an unexpected effect (P2 time window) over the frontocentral region (pooled electrodes F3, Fz, F4, FC1, FC2, FC3, FC4), which was sensitive to scrambled face stimuli. These results suggest that the neural networks facilitating processing of spoken words by visual speech are altered in individuals with ASD.
Affiliation(s)
- Odette Megnin
- Behavioural and Brain Sciences Unit, UCL Institute of Child Health, London, United Kingdom