1. Bao X, Lomber SG. Visual modulation of auditory evoked potentials in the cat. Sci Rep 2024; 14:7177. PMID: 38531940. DOI: 10.1038/s41598-024-57075-1.
Abstract
Visual modulation of the auditory system is not only a neural substrate for multisensory processing, but also serves as a backup input underlying cross-modal plasticity in deaf individuals. Event-related potential (ERP) studies in humans have provided evidence of multiple-stage audiovisual interactions, ranging from tens to hundreds of milliseconds after stimulus presentation. However, it is still unknown whether the temporal course of visual modulation of auditory ERPs can be characterized in animal models. EEG signals were recorded from subdermal needle electrodes in sedated cats. The auditory stimuli (clicks) and visual stimuli (flashes) were timed by two independent Poisson processes and were presented either simultaneously or alone. The visual-only ERPs were subtracted from the audiovisual ERPs before being compared with the auditory-only ERPs. N1 amplitude showed a trend of transitioning from suppression to facilitation, with a disruption at a ~100-ms flash-to-click delay. We conclude that visual modulation as a function of stimulus onset asynchrony (SOA) over an extended range is more complex than previously characterized with short SOAs, and that its periodic pattern can be interpreted under the "phase resetting" hypothesis.
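The stimulus-timing scheme described in this abstract, with clicks and flashes driven by two independent Poisson processes, can be sketched as follows. The rate and duration values here are hypothetical, not taken from the paper.

```python
import random

def poisson_train(rate_hz, duration_s, rng):
    """Event times of a homogeneous Poisson process (exponential ISIs)."""
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_hz)
        if t >= duration_s:
            return events
        events.append(t)

def flash_to_click_delays(flashes, clicks):
    """For each click, the delay (s) since the most recent preceding flash."""
    delays = []
    for c in clicks:
        prior = [f for f in flashes if f <= c]
        if prior:
            delays.append(c - prior[-1])
    return delays

rng = random.Random(0)
clicks = poisson_train(0.5, 600.0, rng)    # assumed: 0.5 Hz clicks for 10 min
flashes = poisson_train(0.5, 600.0, rng)   # assumed: independent 0.5 Hz flashes
delays = flash_to_click_delays(flashes, clicks)
```

Because the two processes are independent, the flash-to-click delays sweep a continuous range of SOAs, which is what allows the modulation to be characterized as a function of delay.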
Affiliation(s)
- Xiaohan Bao
- Integrated Program in Neuroscience, McGill University, Montreal, QC, H3G 1Y6, Canada
- Stephen G Lomber
- Department of Physiology, McGill University, McIntyre Medical Sciences Building, Rm 1223, 3655 Promenade Sir William Osler, Montreal, QC, H3G 1Y6, Canada
2. Yu L, Xu J. The Development of Multisensory Integration at the Neuronal Level. Adv Exp Med Biol 2024; 1437:153-172. PMID: 38270859. DOI: 10.1007/978-981-99-7611-9_10.
Abstract
Multisensory integration is a fundamental function of the brain. In the typical adult, many brain regions contain multisensory neurons whose response to paired multisensory (e.g., audiovisual) cues is significantly more robust than the corresponding best unisensory response. Synthesizing sensory signals from multiple modalities can speed up sensory processing and improve the salience of outside events or objects. Despite its significance, multisensory integration is not an innate, neonatal feature of the brain. Neurons' ability to effectively combine multisensory information does not appear at birth but develops gradually during early postnatal life (in cats, roughly 4-12 weeks are required). Multisensory experience is critical for this developmental process. If animals are prevented from experiencing normal visual scenes or sounds (i.e., deprived of the relevant multisensory experience), development of the corresponding integrative ability is blocked until the appropriate multisensory experience is obtained. This section summarizes the extant literature on the development of multisensory integration (mainly using the cat superior colliculus as a model), sensory-deprivation-induced cross-modal plasticity, and how sensory experience (sensory exposure and perceptual learning) leads to plastic change and modification of neural circuits in cortical and subcortical areas.
Affiliation(s)
- Liping Yu
- Key Laboratory of Brain Functional Genomics (Ministry of Education and Shanghai), School of Life Sciences, East China Normal University, Shanghai, China
- Jinghong Xu
- Key Laboratory of Brain Functional Genomics (Ministry of Education and Shanghai), School of Life Sciences, East China Normal University, Shanghai, China
3. Ghaneirad E, Borgolte A, Sinke C, Čuš A, Bleich S, Szycik GR. The effect of multisensory semantic congruency on unisensory object recognition in schizophrenia. Front Psychiatry 2023; 14:1246879. PMID: 38025441. PMCID: PMC10646423. DOI: 10.3389/fpsyt.2023.1246879.
Abstract
Multisensory, as opposed to unisensory, processing of stimuli has been found to enhance performance (e.g., reaction time, accuracy, and discrimination) in healthy individuals across various tasks. However, this enhancement is less pronounced in patients with schizophrenia (SZ), indicating impaired multisensory integration (MSI) in these individuals. To the best of our knowledge, no study has yet investigated the impact of MSI deficits in the context of working memory, a domain highly reliant on multisensory processing and substantially impaired in schizophrenia. To address this research gap, we employed two adapted versions of the continuous object recognition task to investigate the effect of single-trial multisensory encoding on subsequent object recognition in 21 schizophrenia patients and 21 healthy controls (HC). Participants were tasked with discriminating between initial and repeated presentations. For the initial presentations, half of the stimuli were audiovisual pairings, while the other half were presented unimodally. The task-relevant stimuli were then presented a second time in a unisensory manner (auditory stimuli in the auditory task, visual stimuli in the visual task). To explore the impact of semantic context on multisensory encoding, half of the audiovisual pairings were semantically congruent, while the remaining pairs were not semantically related to each other. Consistent with prior studies, our findings demonstrated that the impact of single-trial multisensory presentation during encoding remains discernible during subsequent object recognition. This influence could be distinguished based on the semantic congruity between the auditory and visual stimuli presented during encoding, and was more robust in the auditory task. In the auditory task, when congruent multisensory pairings were encoded, both participant groups demonstrated a multisensory facilitation effect, with improved accuracy and reaction times. For incongruent audiovisual encoding, as expected, HC did not demonstrate an evident multisensory facilitation effect on memory performance. In contrast, SZ patients exhibited atypically accelerated reaction times during subsequent auditory object recognition. Based on the predictive coding model, we propose that these deviations indicate a reduced semantic modulatory effect and anomalous prediction-error signaling in SZ, particularly in the context of conflicting cross-modal sensory inputs.
Affiliation(s)
- Erfan Ghaneirad
- Department of Psychiatry, Social Psychiatry and Psychotherapy, Hannover Medical School, Hanover, Germany
- Anna Borgolte
- Department of Psychiatry, Social Psychiatry and Psychotherapy, Hannover Medical School, Hanover, Germany
- Christopher Sinke
- Department of Psychiatry, Social Psychiatry and Psychotherapy, Division of Clinical Psychology and Sexual Medicine, Hannover Medical School, Hanover, Germany
- Anja Čuš
- Department of Psychiatry, Social Psychiatry and Psychotherapy, Hannover Medical School, Hanover, Germany
- Stefan Bleich
- Department of Psychiatry, Social Psychiatry and Psychotherapy, Hannover Medical School, Hanover, Germany
- Center for Systems Neuroscience, University of Veterinary Medicine, Hanover, Germany
- Gregor R. Szycik
- Department of Psychiatry, Social Psychiatry and Psychotherapy, Hannover Medical School, Hanover, Germany
4. Al-youzbaki MU, Schormans AL, Allman BL. Past and present experience shifts audiovisual temporal perception in rats. Front Behav Neurosci 2023; 17:1287587. PMID: 37908200. PMCID: PMC10613659. DOI: 10.3389/fnbeh.2023.1287587.
Abstract
Our brains have a propensity to integrate closely timed auditory and visual stimuli into a unified percept, a phenomenon that is highly malleable based on prior sensory experiences and is known to be altered in clinical populations. While the neural correlates of audiovisual temporal perception have been investigated using neuroimaging and electroencephalography techniques in humans, animal research will be required to uncover the underlying cellular and molecular mechanisms. Prior to conducting such mechanistic studies, it is important to first confirm the translational potential of any prospective animal model. Thus, in the present study, we conducted a series of experiments to determine if rats show the hallmarks of audiovisual temporal perception observed in neurotypical humans, and whether the rat behavioral paradigms could reveal when they experienced perceptual disruptions akin to those observed in neurodevelopmental disorders. After training rats to perform a temporal order judgment (TOJ) or synchrony judgment (SJ) task, we found that the rats' perception was malleable based on their past and present sensory experiences. More specifically, passive exposure to asynchronous audiovisual stimulation in the minutes prior to behavioral testing caused the rats' perception to predictably shift in the direction of the leading stimulus; findings that represent the first time this form of audiovisual perceptual malleability has been reported in non-human subjects. Furthermore, rats performing the TOJ task also showed evidence of rapid recalibration, in which their audiovisual temporal perception on the current trial was predictably influenced by the timing lag between the auditory and visual stimuli in the preceding trial.
Finally, by either manipulating experimental testing parameters or altering the rats' neurochemistry with a systemic injection of MK-801, we showed that the TOJ and SJ tasks could identify when the rats had difficulty judging the timing of audiovisual stimuli. These findings confirm that the behavioral paradigms are indeed suitable for future testing of rats with perceptual disruptions in audiovisual processing. Overall, our collective results highlight that rats represent an excellent animal model to study the cellular and molecular mechanisms underlying the acuity and malleability of audiovisual temporal perception, as they showcase the perceptual hallmarks commonly observed in humans.
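Rapid recalibration of the kind described above is conventionally quantified by fitting a psychometric function to the temporal order judgments and comparing the point of subjective simultaneity (PSS) between trials grouped by the preceding trial's lead modality. A minimal sketch with hypothetical data and a simple grid-search logistic fit, not the authors' analysis code:

```python
import math

def logistic(soa, pss, slope):
    """P('visual first') as a function of SOA (ms); audio-lead is negative."""
    return 1.0 / (1.0 + math.exp(-(soa - pss) / slope))

def fit_pss(soas, p_vfirst):
    """Coarse grid-search fit; returns the PSS (SOA giving 50% 'visual first')."""
    best_pss, best_err = 0.0, float("inf")
    for pss10 in range(-1000, 1001):          # PSS candidates: -100..100 ms
        pss = pss10 / 10.0
        for slope in (10.0, 20.0, 30.0, 50.0, 80.0):
            err = sum((logistic(x, pss, slope) - p) ** 2
                      for x, p in zip(soas, p_vfirst))
            if err < best_err:
                best_pss, best_err = pss, err
    return best_pss

# Hypothetical TOJ data: proportion of "visual first" responses per SOA (ms)
soas = [-200, -100, -50, 0, 50, 100, 200]
p_vf = [0.02, 0.10, 0.30, 0.55, 0.80, 0.93, 0.99]
pss = fit_pss(soas, p_vf)
```

Rapid recalibration would then be the difference between the PSS fitted on trials preceded by an audio-lead SOA and the PSS fitted on trials preceded by a visual-lead SOA.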
5. Takeshima Y. Change of rapid temporal recalibration magnitude for audiovisual asynchrony with modulation of temporal binding window width: A preliminary investigation. Iperception 2023; 14:20416695231193280. PMID: 37600069. PMCID: PMC10439762. DOI: 10.1177/20416695231193280.
Abstract
Subjective synchrony perception for audiovisual stimuli is affected by preceding temporal information: the point of subjective simultaneity is shifted in the direction of the audiovisual asynchrony presented on the previous trial, a phenomenon called "rapid temporal recalibration." The factors that modulate the magnitude of rapid temporal recalibration have not been fully investigated. Previously, a positive correlation was found between the magnitude of rapid temporal recalibration and the width of the temporal binding window (TBW). This preliminary study examined the causal relationship between TBW width and rapid recalibration magnitude using a single experimental group, before-after comparison design. The magnitude of rapid recalibration was compared before and after perceptual training that narrowed the TBW. The results indicated that the magnitude of rapid recalibration was reduced by perceptual training, suggesting that TBW width determines the magnitude of rapid recalibration. This causal relationship helps elucidate the mechanisms of adaptation to temporal lags between visual and auditory sensations.
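The TBW measure central to this study is commonly obtained by fitting a Gaussian to the proportion of "synchronous" responses across SOAs and reading off the width of the fitted curve. A minimal sketch under that standard assumption, with hypothetical data and a coarse grid search rather than the author's actual fitting procedure:

```python
import math

def gaussian(x, mu, sigma, amp):
    return amp * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

def fit_sj_curve(soas, p_sync):
    """Grid-search Gaussian fit to 'synchronous' response rates.
    Returns (center, TBW), taking the TBW as the full width at half
    maximum of the fitted curve: 2 * sigma * sqrt(2 * ln 2)."""
    best = (0.0, 100.0, 1.0, float("inf"))
    for mu in range(-100, 101, 5):
        for sigma in range(20, 301, 5):
            for amp in (0.8, 0.9, 1.0):
                err = sum((gaussian(x, mu, sigma, amp) - p) ** 2
                          for x, p in zip(soas, p_sync))
                if err < best[3]:
                    best = (float(mu), float(sigma), amp, err)
    mu, sigma, _, _ = best
    return mu, 2.0 * sigma * math.sqrt(2.0 * math.log(2.0))

# Hypothetical SJ data: proportion of "synchronous" responses per SOA (ms)
soas = [-300, -200, -100, 0, 100, 200, 300]
p_sync = [0.05, 0.25, 0.70, 0.95, 0.80, 0.40, 0.10]
center, tbw = fit_sj_curve(soas, p_sync)
```

A narrowing of the TBW after perceptual training would show up as a smaller fitted width on post-training data than on pre-training data.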
6. Townsend B, Legere JK, von Mohrenschildt M, Shedden JM. Stimulus Onset Asynchrony Affects Weighting-related Event-related Spectral Power in Self-motion Perception. J Cogn Neurosci 2023; 35:1092-1107. PMID: 37043240. DOI: 10.1162/jocn_a_01994.
Abstract
Self-motion perception relies primarily on the integration of the visual, vestibular, proprioceptive, and somatosensory systems. There is a gap in understanding how a temporal lag between visual and vestibular motion cues affects visual-vestibular weighting during self-motion perception. The beta band is an index of visual-vestibular weighting, in that robust beta event-related synchronization (ERS) is associated with visual weighting bias, and robust beta event-related desynchronization is associated with vestibular weighting bias. The present study examined modulation of event-related spectral power during a heading judgment task in which participants attended to either visual (optic flow) or physical (inertial cues stimulating the vestibular, proprioceptive and somatosensory systems) motion cues from a motion simulator mounted on a MOOG Stewart Platform. The temporal lag between the onset of visual and physical motion cues was manipulated to produce three lag conditions: simultaneous onset, visual before physical motion onset, and physical before visual motion onset. There were two main findings. First, we demonstrated that when the attended motion cue was presented before an ignored cue, the power of beta associated with the attended modality was greater than when visual-vestibular cues were presented simultaneously or when the ignored cue was presented first. This was the case for beta ERS when the visual-motion cue was attended to, and beta event-related desynchronization when the physical-motion cue was attended to. Second, we tested whether the power of feature-binding gamma ERS (demonstrated in audiovisual and visual-tactile integration studies) increased when the visual-vestibular cues were presented simultaneously versus with temporal asynchrony. 
We did not observe an increase in gamma ERS when cues were presented simultaneously, suggesting that electrophysiological markers of visual-vestibular binding differ from markers of audiovisual and visual-tactile integration. All event-related spectral power measures reported in this study were generated from dipoles projecting from the left and right motor areas, based on the results of Measure Projection Analysis.
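The ERS/ERD measures discussed above are conventionally expressed as percent change of band power relative to a pre-stimulus baseline (Pfurtscheller's classic definition). A minimal sketch with made-up beta-band power values; the band-power series would in practice come from a 13-30 Hz filtered EEG signal:

```python
def ers_percent(band_power, baseline_idx, event_idx):
    """Percent change of band power from the baseline mean.
    Positive values indicate ERS, negative values indicate ERD."""
    base = sum(band_power[i] for i in baseline_idx) / len(baseline_idx)
    return [100.0 * (band_power[i] - base) / base for i in event_idx]

# Hypothetical beta-band power per time bin (e.g., squared amplitude of a
# band-filtered signal): bins 0-1 are baseline, bins 2-3 are post-stimulus.
beta_power = [1.0, 1.0, 2.0, 0.5]
change = ers_percent(beta_power, [0, 1], [2, 3])  # [100.0, -50.0]
```

In the weighting framework above, a positive value in the post-stimulus bins (ERS) would index a visual weighting bias, and a negative value (ERD) a vestibular weighting bias.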
7. Sun Y, Fu Q. How do irrelevant stimuli from another modality influence responses to the targets in a same-different task? Conscious Cogn 2023; 107:103455. PMID: 36586291. DOI: 10.1016/j.concog.2022.103455.
Abstract
It remains unclear whether multisensory interaction can occur implicitly at an abstract level. To address this issue, a same-different task was used to select comparable images and sounds in Experiment 1. Stimuli with various levels of discrimination difficulty were then adopted in a modified same-different task in Experiments 2, 3, and 4. The results showed that a consistency effect could be observed in the testing phase only when the irrelevant stimuli were easily distinguishable. Moreover, when easily distinguishable irrelevant stimuli were presented simultaneously with difficult target stimuli in the training phase, irrelevant auditory stimuli facilitated responses to visual targets, whereas irrelevant visual stimuli interfered with responses to auditory targets, indicating an asymmetry between the visual and auditory modalities in abstract multisensory integration. The results suggest that abstract multisensory information can be integrated implicitly and that the inverse effectiveness principle may not apply to high-level processing in abstract multisensory integration.
Affiliation(s)
- Ying Sun
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Qiufang Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
8. Zhou H, Liu X, Yu J, Yue C, Wang A, Zhang M. Compensation Mechanisms May Not Always Account for Enhanced Multisensory Illusion in Older Adults: Evidence from Sound-Induced Flash Illusion. Brain Sci 2022; 12:1418. PMID: 36291351. PMCID: PMC9599837. DOI: 10.3390/brainsci12101418.
Abstract
The sound-induced flash illusion (SiFI) is a typical auditory-dominance phenomenon in multisensory illusion. Although a number of studies have explored age-related effects on the SiFI, the reasons for the enhanced SiFI in older adults remain controversial. In the present study, older and younger adults with equivalent visual discrimination ability were selected to explore age differences in SiFI effects and to identify neural indicators from resting-state functional magnetic resonance imaging (rs-fMRI) signals. A correlation analysis examined the relationship between regional homogeneity (ReHo) and the SiFI. The results showed that both younger and older adults experienced significant fission and fusion illusions, and that fission illusions were greater in older than in younger adults. In addition, ReHo values of the left middle frontal gyrus (MFG), right inferior frontal gyrus (IFG), and right superior frontal gyrus (SFG) were significantly positively correlated with the SiFI in older adults. More importantly, the comparison between older and younger adults showed that ReHo values of the right superior temporal gyrus (STG) were decreased in older adults, independently of the SiFI. The results indicate that, when there is no difference in unisensory ability, the enhancement of multisensory illusion in older adults may not always be explained by compensation mechanisms.
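ReHo, the neural indicator used above, is defined as Kendall's coefficient of concordance (W) computed over the time courses of a voxel and its neighbors. A simplified sketch that omits the tie correction used by full implementations; the example neighborhood values are made up:

```python
def rank(xs):
    """Average ranks (ties share the mean rank), 1-based."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for idx in order[i:j + 1]:
            r[idx] = avg
        i = j + 1
    return r

def kendalls_w(timeseries):
    """Kendall's W across voxels (raters = voxels, items = time points)."""
    k = len(timeseries)            # voxels in the neighborhood
    n = len(timeseries[0])         # time points
    ranks = [rank(ts) for ts in timeseries]
    R = [sum(r[i] for r in ranks) for i in range(n)]
    Rbar = sum(R) / n
    S = sum((Ri - Rbar) ** 2 for Ri in R)
    return 12.0 * S / (k ** 2 * (n ** 3 - n))

# Hypothetical 3-voxel neighborhood with 5 time points each
neighborhood = [[1.0, 2.0, 3.0, 2.5, 4.0],
                [1.1, 2.2, 2.9, 2.4, 4.2],
                [0.9, 1.8, 3.1, 2.6, 3.9]]
reho = kendalls_w(neighborhood)   # 1.0 here: all series rank identically
```

W ranges from 0 (no concordance among neighboring time courses) to 1 (perfect concordance), so higher ReHo means more locally synchronized spontaneous activity.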
Affiliation(s)
- Heng Zhou
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou 215123, China
- Xiaole Liu
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou 215123, China
- Junming Yu
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou 215123, China
- Chunlin Yue
- School of Physical Education and Sport Science, Soochow University, Suzhou 215021, China
- Aijun Wang (corresponding author)
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou 215123, China
- Ming Zhang
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou 215123, China
9. Crosse MJ, Foxe JJ, Tarrit K, Freedman EG, Molholm S. Resolution of impaired multisensory processing in autism and the cost of switching sensory modality. Commun Biol 2022; 5:601. PMID: 35773473. PMCID: PMC9246932. DOI: 10.1038/s42003-022-03519-1.
Abstract
Children with autism spectrum disorders (ASD) exhibit alterations in multisensory processing, which may contribute to the prevalence of social and communicative deficits in this population. Resolution of multisensory deficits has been observed in teenagers with ASD for complex, social speech stimuli; however, whether this resolution extends to more basic multisensory processing deficits remains unclear. Here, in a cohort of 364 participants we show using simple, non-social audiovisual stimuli that deficits in multisensory processing observed in high-functioning children and teenagers with ASD are not evident in adults with the disorder. Computational modelling indicated that multisensory processing transitions from a default state of competition to one of facilitation, and that this transition is delayed in ASD. Further analysis revealed group differences in how sensory channels are weighted, and how this is impacted by preceding cross-sensory inputs. Our findings indicate that there is a complex and dynamic interplay among the sensory systems that differs considerably in individuals with ASD.
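The competition-versus-facilitation distinction drawn above is commonly probed with Miller's race-model inequality. The sketch below shows that standard test (not necessarily the authors' specific model): redundant-target responses faster than the bound predicted by two independent channels indicate facilitation beyond a mere race. The reaction times are hypothetical.

```python
def ecdf(rts, t):
    """Empirical cumulative distribution of reaction times at time t."""
    return sum(rt <= t for rt in rts) / len(rts)

def race_violation(rt_a, rt_v, rt_av, times):
    """Miller's race-model inequality: G_AV(t) - min(1, G_A(t) + G_V(t)).
    Positive values mean the audiovisual CDF exceeds the race-model bound,
    i.e., facilitation beyond independent parallel processing."""
    return [ecdf(rt_av, t) - min(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))
            for t in times]

# Hypothetical RTs (ms): audiovisual responses are markedly faster
rt_a = [300, 320, 340, 360, 380]
rt_v = [310, 330, 350, 370, 390]
rt_av = [250, 260, 270, 280, 290]
viol = race_violation(rt_a, rt_v, rt_av, [260, 280, 300])
```

Under pure competition or independence the violation stays at or below zero across t; a developmental shift toward facilitation would appear as growing positive violations with age.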
Affiliation(s)
- Michael J Crosse
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine, Bronx, NY, USA
- The Dominick P. Purpura Department of Neuroscience, Rose F. Kennedy Intellectual and Developmental Disabilities Research Center, Albert Einstein College of Medicine, Bronx, NY, USA
- Trinity Centre for Biomedical Engineering, Department of Mechanical, Manufacturing & Biomedical Engineering, Trinity College Dublin, Dublin, Ireland
- John J Foxe
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine, Bronx, NY, USA
- The Dominick P. Purpura Department of Neuroscience, Rose F. Kennedy Intellectual and Developmental Disabilities Research Center, Albert Einstein College of Medicine, Bronx, NY, USA
- The Cognitive Neurophysiology Laboratory, Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Katy Tarrit
- The Cognitive Neurophysiology Laboratory, Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Edward G Freedman
- The Cognitive Neurophysiology Laboratory, Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Sophie Molholm
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine, Bronx, NY, USA
- The Dominick P. Purpura Department of Neuroscience, Rose F. Kennedy Intellectual and Developmental Disabilities Research Center, Albert Einstein College of Medicine, Bronx, NY, USA
- The Cognitive Neurophysiology Laboratory, Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
10. London RE, Benwell CSY, Cecere R, Quak M, Thut G, Talsma D. EEG alpha power predicts the temporal sensitivity of multisensory perception. Eur J Neurosci 2022; 55:3241-3255. DOI: 10.1111/ejn.15719.
Affiliation(s)
- Roberto Cecere
- Institute of Neuroscience and Psychology, University of Glasgow, UK
- Michel Quak
- Department of Experimental Psychology, Ghent University, Belgium
- Gregor Thut
- Institute of Neuroscience and Psychology, University of Glasgow, UK
- Durk Talsma
- Department of Experimental Psychology, Ghent University, Belgium
11. Visual field differences in temporal synchrony processing for audio-visual stimuli. PLoS One 2021; 16:e0261129. PMID: 34914735. PMCID: PMC8675747. DOI: 10.1371/journal.pone.0261129.
Abstract
Audio-visual integration relies on temporal synchrony between visual and auditory inputs. However, because visual and auditory stimuli differ in their travel and transmission speeds, audio-visual synchrony perception is flexible, and the processing speed of visual stimuli affects the perception of audio-visual synchrony. The present study examined how the visual field in which visual stimuli are presented affects the processing of audio-visual temporal synchrony. The point of subjective simultaneity, the temporal binding window, and the rapid recalibration effect were measured using temporal order judgment, simultaneity judgment, and stream/bounce perception, because different mechanisms of temporal processing have been suggested among these three paradigms. The results indicate that, in the temporal order judgment task, auditory stimuli had to be presented earlier relative to visual stimuli in the central visual field than in the peripheral visual field for subjective simultaneity to be perceived. Meanwhile, the subjective simultaneity bandwidth was broader in the central than in the peripheral visual field during the simultaneity judgment task. In the stream/bounce perception task, neither the point of subjective simultaneity nor the temporal binding window differed between the two visual fields. Moreover, rapid recalibration occurred in both visual fields during the simultaneity judgment task, but only in the central visual field during the temporal order judgment and stream/bounce perception tasks. These results suggest that differences in visual processing speed across the visual field modulate the temporal processing of audio-visual stimuli, and that the three tasks (temporal order judgment, simultaneity judgment, and stream/bounce perception) have distinct functional characteristics for audio-visual synchrony perception. Future studies should examine whether compensation for differences in the temporal resolution of the visual field in later cortical visual pathways contributes to these visual field differences in audio-visual temporal synchrony.
12. Scurry AN, Lovelady Z, Jiang F. Task-dependent audiovisual temporal sensitivity is not affected by stimulus intensity levels. Vision Res 2021; 186:71-79. PMID: 34058622. PMCID: PMC8273142. DOI: 10.1016/j.visres.2021.05.006.
Abstract
Flexibility and robustness of multisensory temporal recalibration are paramount for maintaining perceptual constancy of the surrounding natural world. Different environments impart various impediments, distances, and routes that alter the propagation times of the sight and sound cues comprising a multimodal event. The ability to rapidly calibrate and account for these external variations allows for maintained perception of synchrony, which is crucial for coherent and consistent perception. The two common paradigms used to compare the precision of temporal processing between experimental and control groups, the simultaneity judgment (SJ) and temporal order judgment (TOJ) tasks, often use supra-threshold stimuli. However, few studies have specifically examined the effects of normalizing stimulus intensities to participants' unisensory detection thresholds. The current project presented multiple combinations of auditory and visual stimulus intensity levels, based on individual detection thresholds, during a TOJ and an SJ task. While no effect of stimulus intensity was found on temporal sensitivity or perceived temporal synchrony, there was a significant difference in point of subjective simultaneity (PSS) measures between tasks. In addition, PSS estimates were audio-leading rather than visual-leading as previously reported, suggesting that exposure to the particular combinations of stimulus intensity levels used influenced temporal synchrony perception. Overall, these results support the use of supra-threshold stimuli in TOJ and SJ tasks as a way of minimizing the confound from differences in unisensory processing.
Affiliation(s)
- Alexandra N Scurry
- Department of Psychology, University of Nevada, 1664 N. Virginia St., Reno, NV 89557, USA
- Zachary Lovelady
- Department of Psychology, University of Nevada, 1664 N. Virginia St., Reno, NV 89557, USA
- Fang Jiang
- Department of Psychology, University of Nevada, 1664 N. Virginia St., Reno, NV 89557, USA
13. Li S, Ding Q, Yuan Y, Yue Z. Audio-Visual Causality and Stimulus Reliability Affect Audio-Visual Synchrony Perception. Front Psychol 2021; 12:629996. PMID: 33679553. PMCID: PMC7930005. DOI: 10.3389/fpsyg.2021.629996.
Abstract
People can discriminate the synchrony between audio-visual scenes. However, the sensitivity of audio-visual synchrony perception can be affected by many factors. Using a simultaneity judgment task, the present study investigated whether the synchrony perception of complex audio-visual stimuli was affected by audio-visual causality and stimulus reliability. In Experiment 1, the results showed that audio-visual causality could increase one's sensitivity to audio-visual onset asynchrony (AVOA) of both action stimuli and speech stimuli. Moreover, participants were more tolerant of AVOA of speech stimuli than that of action stimuli in the high causality condition, whereas no significant difference between these two kinds of stimuli was found in the low causality condition. In Experiment 2, the speech stimuli were manipulated with either high or low stimulus reliability. The results revealed a significant interaction between audio-visual causality and stimulus reliability. Under the low causality condition, the percentage of “synchronous” responses of audio-visual intact stimuli was significantly higher than that of visual_intact/auditory_blurred stimuli and audio-visual blurred stimuli. In contrast, no significant difference among all levels of stimulus reliability was observed under the high causality condition. Our study supported the synergistic effect of top-down processing and bottom-up processing in audio-visual synchrony perception.
Collapse
Affiliation(s)
- Shao Li
- Department of Psychology, Sun Yat-sen University, Guangzhou, China
| | - Qi Ding
- Department of Psychology, Sun Yat-sen University, Guangzhou, China
| | - Yichen Yuan
- Department of Psychology, Sun Yat-sen University, Guangzhou, China
| | - Zhenzhu Yue
- Department of Psychology, Sun Yat-sen University, Guangzhou, China
| |
Collapse
|
14
|
Opoku-Baah C, Wallace MT. Binocular Enhancement of Multisensory Temporal Perception. Invest Ophthalmol Vis Sci 2021; 62:7. [PMID: 33661284 PMCID: PMC7938005 DOI: 10.1167/iovs.62.3.7] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/05/2023] Open
Abstract
Purpose The goal of this study was to examine the behavioral effects, and to suggest possible underlying mechanisms, of binocularity on audiovisual temporal perception in normally sighted individuals. Methods Participants performed two audiovisual simultaneity judgment tasks, one using simple flashes and beeps and the other using audiovisual speech stimuli, with the left eye, right eye, and both eyes. Two measures, the point of subjective simultaneity (PSS) and the temporal binding window (TBW), an index of audiovisual temporal acuity, were derived for each viewing condition, stimulus type, and participant. The data were then modeled using causal inference, allowing us to determine whether binocularity affected low-level unisensory mechanisms (i.e., sensory noise level) or high-level multisensory mechanisms (i.e., the prior probability of inferring a common cause, pC=1). Results Whereas for the PSS there was no significant effect of viewing condition, for the TBW a significant interaction between stimulus type and viewing condition was found. Post hoc analyses revealed a significantly narrower TBW during binocular than monocular viewing (average of left and right eyes) for the flash-beep condition but no difference between the viewing conditions for the speech stimuli. Modeling results showed no significant difference in pC=1 but a significant reduction in sensory noise during binocular performance on flash-beep trials. Conclusions Binocular viewing was found to enhance audiovisual temporal acuity, as indexed by the TBW, for simple low-level audiovisual stimuli. Furthermore, modeling results suggest that this effect may stem from enhanced sensory representations, evidenced as a reduction in sensory noise affecting the measurement of physical asynchrony during audiovisual temporal perception.
Collapse
Affiliation(s)
- Collins Opoku-Baah
- Neuroscience Graduate Program, Vanderbilt University, Nashville, Tennessee, United States; Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee, United States
| | - Mark T Wallace
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee, United States; Department of Psychology, Vanderbilt University, Nashville, Tennessee, United States; Department of Hearing and Speech, Vanderbilt University Medical Center, Nashville, Tennessee, United States; Vanderbilt Vision Research Center, Nashville, Tennessee, United States; Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center, Nashville, Tennessee, United States; Department of Pharmacology, Vanderbilt University, Nashville, Tennessee, United States
| |
Collapse
|
15
|
Opoku-Baah C, Wallace MT. Brief period of monocular deprivation drives changes in audiovisual temporal perception. J Vis 2020; 20:8. [PMID: 32761108 PMCID: PMC7438662 DOI: 10.1167/jov.20.8.8] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/11/2023] Open
Abstract
The human brain retains a striking degree of plasticity into adulthood. Recent studies have demonstrated that a short period of altered visual experience (via monocular deprivation) can change the dynamics of binocular rivalry in favor of the deprived eye, a compensatory action thought to be mediated by an upregulation of cortical gain control mechanisms. Here, we sought to better understand the impact of monocular deprivation on multisensory abilities, specifically examining audiovisual temporal perception. Using an audiovisual simultaneity judgment task, we discovered that 90 minutes of monocular deprivation produced opposing effects on the temporal binding window depending on the eye used in the task. Thus, in those who performed the task with their deprived eye there was a narrowing of the temporal binding window, whereas in those performing the task with their nondeprived eye there was a widening of the temporal binding window. The effect was short lived, being observed only in the first 10 minutes of postdeprivation testing. These findings indicate that changes in visual experience in the adult can rapidly impact multisensory perceptual processes, a finding that has important clinical implications for those patients with adult-onset visual deprivation and for therapies founded on monocular deprivation.
Collapse
Affiliation(s)
| | - Mark T Wallace
| |
Collapse
|
16
|
Individual differences in sensory integration predict differences in time perception and individual levels of schizotypy. Conscious Cogn 2020; 84:102979. [DOI: 10.1016/j.concog.2020.102979] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/30/2020] [Revised: 06/17/2020] [Accepted: 06/17/2020] [Indexed: 12/13/2022]
|
17
|
Zhou HY, Cheung EFC, Chan RCK. Audiovisual temporal integration: Cognitive processing, neural mechanisms, developmental trajectory and potential interventions. Neuropsychologia 2020; 140:107396. [PMID: 32087206 DOI: 10.1016/j.neuropsychologia.2020.107396] [Citation(s) in RCA: 28] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/17/2019] [Revised: 02/14/2020] [Accepted: 02/15/2020] [Indexed: 12/21/2022]
Abstract
To integrate auditory and visual signals into a unified percept, the paired stimuli must co-occur within a limited time window known as the Temporal Binding Window (TBW). The width of the TBW, a proxy of audiovisual temporal integration ability, has been found to be correlated with higher-order cognitive and social functions. A comprehensive review of studies investigating audiovisual TBW reveals several findings: (1) a wide range of top-down processes and bottom-up features can modulate the width of the TBW, facilitating adaptation to the changing and multisensory external environment; (2) a large-scale brain network works in coordination to ensure successful detection of audiovisual (a)synchrony; (3) developmentally, audiovisual TBW follows a U-shaped pattern across the lifespan, with a protracted developmental course into late adolescence and rebounding in size again in late life; (4) an enlarged TBW is characteristic of a number of neurodevelopmental disorders; and (5) the TBW is highly flexible via perceptual and musical training. Interventions targeting the TBW may be able to improve multisensory function and ameliorate social communicative symptoms in clinical populations.
Collapse
Affiliation(s)
- Han-Yu Zhou
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
| | | | - Raymond C K Chan
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China.
| |
Collapse
|
18
|
Individual Differences in Multisensory Interactions: The Influence of Temporal Phase Coherence and Auditory Salience on Visual Contrast Sensitivity. Vision (Basel) 2020; 4:vision4010012. [PMID: 32033350 PMCID: PMC7157667 DOI: 10.3390/vision4010012] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/18/2019] [Revised: 01/21/2020] [Accepted: 01/30/2020] [Indexed: 11/16/2022] Open
Abstract
While previous research has investigated key factors contributing to multisensory integration in isolation, relatively little is known regarding how these factors interact, especially when considering the enhancement of visual contrast sensitivity by a task-irrelevant sound. Here we explored how auditory stimulus properties, namely salience and temporal phase coherence in relation to the visual target, jointly affect the extent to which a sound can enhance visual contrast sensitivity. Visual contrast sensitivity was measured by a psychophysical task, in which human adult participants reported the location of a visual Gabor pattern presented at various contrast levels. We expected the greatest enhancement of contrast sensitivity (i.e., the lowest contrast threshold) when the visual stimulus was accompanied by a task-irrelevant sound that was weak in auditory salience and modulated in phase with the visual stimulus (strong temporal phase coherence). Our expectations were confirmed, but only when we accounted for individual differences in the optimal auditory salience level needed to induce maximal multisensory enhancement effects. Our findings highlight the importance of interactions between temporal phase coherence and stimulus effectiveness in determining the strength of multisensory enhancement of visual contrast, as well as the importance of accounting for individual differences.
Collapse
|
19
|
Wallace MT, Woynaroski TG, Stevenson RA. Multisensory Integration as a Window into Orderly and Disrupted Cognition and Communication. Annu Rev Psychol 2020; 71:193-219. [DOI: 10.1146/annurev-psych-010419-051112] [Citation(s) in RCA: 37] [Impact Index Per Article: 9.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
During our everyday lives, we are confronted with a vast amount of information from several sensory modalities. This multisensory information needs to be appropriately integrated for us to effectively engage with and learn from our world. Research carried out over the last half century has provided new insights into the way such multisensory processing improves human performance and perception; the neurophysiological foundations of multisensory function; the time course for its development; how multisensory abilities differ in clinical populations; and, most recently, the links between multisensory processing and cognitive abilities. This review summarizes the extant literature on multisensory function in typical and atypical circumstances, discusses the implications of the work carried out to date for theory and research, and points toward next steps for advancing the field.
Collapse
Affiliation(s)
- Mark T. Wallace
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, Tennessee 37232, USA
- Departments of Psychology and Pharmacology, Vanderbilt University, Nashville, Tennessee 37232, USA
- Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center, Nashville, Tennessee 37232, USA
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee 37232, USA
- Vanderbilt Kennedy Center, Nashville, Tennessee 37203, USA
| | - Tiffany G. Woynaroski
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, Tennessee 37232, USA
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee 37232, USA
- Vanderbilt Kennedy Center, Nashville, Tennessee 37203, USA
| | - Ryan A. Stevenson
- Departments of Psychology and Psychiatry and Program in Neuroscience, University of Western Ontario, London, Ontario N6A 3K7, Canada
- Brain and Mind Institute, University of Western Ontario, London, Ontario N6A 3K7, Canada
| |
Collapse
|
20
|
Sanders P, Thompson B, Corballis P, Searchfield G. On the Timing of Signals in Multisensory Integration and Crossmodal Interactions: a Scoping Review. Multisens Res 2019; 32:533-573. [PMID: 31137004 DOI: 10.1163/22134808-20191331] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/17/2018] [Accepted: 04/24/2019] [Indexed: 11/19/2022]
Abstract
A scoping review was undertaken to explore research investigating early interactions and integration of auditory and visual stimuli in the human brain. The focus was on methods used to study low-level multisensory temporal processing using simple stimuli in humans, and how this research has informed our understanding of multisensory perception. The study of multisensory temporal processing probes how the relative timing between signals affects perception. Several tasks, illusions, computational models, and neuroimaging techniques were identified in the literature search. Research into early audiovisual temporal processing in special populations was also reviewed. Recent research has continued to provide support for early integration of crossmodal information. These early interactions can influence higher-level factors, and vice versa. Temporal relationships between auditory and visual stimuli influence multisensory perception, and likely play a substantial role in solving the 'correspondence problem' (how the brain determines which sensory signals belong together, and which should be segregated).
Collapse
Affiliation(s)
- Philip Sanders
- Section of Audiology, University of Auckland, Auckland, New Zealand; Centre for Brain Research, University of Auckland, New Zealand; Brain Research New Zealand - Rangahau Roro Aotearoa, New Zealand
| | - Benjamin Thompson
- Centre for Brain Research, University of Auckland, New Zealand; School of Optometry and Vision Science, University of Auckland, Auckland, New Zealand; School of Optometry and Vision Science, University of Waterloo, Waterloo, Canada
| | - Paul Corballis
- Centre for Brain Research, University of Auckland, New Zealand; Department of Psychology, University of Auckland, Auckland, New Zealand
| | - Grant Searchfield
- Section of Audiology, University of Auckland, Auckland, New Zealand; Centre for Brain Research, University of Auckland, New Zealand; Brain Research New Zealand - Rangahau Roro Aotearoa, New Zealand
| |
Collapse
|
21
|
Sachgau C, Chung W, Barnett-Cowan M. Perceived timing of active head movement at different speeds. Neurosci Lett 2018; 687:253-258. [PMID: 30287302 DOI: 10.1016/j.neulet.2018.09.065] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/31/2018] [Revised: 09/21/2018] [Accepted: 09/29/2018] [Indexed: 10/28/2022]
Abstract
The central nervous system must determine which sensory events occur at the same time. Actively moving the head corresponds with large changes in the relationship between the observer and the environment, sensorimotor processing, and spatiotemporal perception. Active head movement perception has been shown to depend on head movement velocity: participants who move their head fastest require the head to move earlier than comparison stimuli for perceived simultaneity, more so than those who move their head more slowly. Such between-subject results cannot address whether active head movement perception changes with velocity. The present study used a within-subjects design to measure the point of subjective simultaneity (PSS) between active head movements at different speeds and a comparison sound stimulus, to characterize the relationship between the velocity and perception of head movement onset. Our results clearly show that (i) head movement perception is faster with faster head movements within subjects, and (ii) active head movement onset must still precede the onset of other sensory events (average PSS: -123 ms to -52 ms; median PSS: -100 ms to -42 ms) in order to be perceived as occurring simultaneously, even at the fastest speeds (average peak velocity: 76°/s-257°/s; median peak velocity: 72°/s-257°/s). We conclude that head movement perception is slow, but that this delay is minimized with increased speed. These within-subject results are contrary to the previous and present between-subject results, and are in agreement with literature in which the perceived onset of auditory, visual, and vestibular stimuli is less delayed with increased stimulus intensity.
Collapse
Affiliation(s)
- Carolin Sachgau
- Department of Kinesiology, University of Waterloo, Waterloo, Ontario, N2L 3G1, Canada.
| | - William Chung
- Department of Kinesiology, University of Waterloo, Waterloo, Ontario, N2L 3G1, Canada
| | - Michael Barnett-Cowan
- Department of Kinesiology, University of Waterloo, Waterloo, Ontario, N2L 3G1, Canada
| |
Collapse
|
22
|
Schormans AL, Allman BL. Behavioral Plasticity of Audiovisual Perception: Rapid Recalibration of Temporal Sensitivity but Not Perceptual Binding Following Adult-Onset Hearing Loss. Front Behav Neurosci 2018; 12:256. [PMID: 30429780 PMCID: PMC6220077 DOI: 10.3389/fnbeh.2018.00256] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/24/2018] [Accepted: 10/11/2018] [Indexed: 11/13/2022] Open
Abstract
The ability to accurately integrate or bind stimuli from more than one sensory modality is highly dependent on the features of the stimuli, such as their intensity and relative timing. Previous studies have demonstrated that the ability to perceptually bind stimuli is impaired in various clinical conditions such as autism, dyslexia, schizophrenia, as well as aging. However, it remains unknown if adult-onset hearing loss, separate from aging, influences audiovisual temporal acuity. In the present study, rats were trained using appetitive operant conditioning to perform an audiovisual temporal order judgment (TOJ) task or synchrony judgment (SJ) task in order to investigate the nature and extent that audiovisual temporal acuity is affected by adult-onset hearing loss, with a specific focus on the time-course of perceptual changes following loud noise exposure. In our first series of experiments, we found that audiovisual temporal acuity in normal-hearing rats was influenced by sound intensity, such that when a quieter sound was presented, the rats were biased to perceive the audiovisual stimuli as asynchronous (SJ task), or as though the visual stimulus was presented first (TOJ task). Psychophysical testing demonstrated that noise-induced hearing loss did not alter the rats' temporal sensitivity 2-3 weeks post-noise exposure, despite rats showing an initial difficulty in differentiating the temporal order of audiovisual stimuli. Furthermore, consistent with normal-hearing rats, the timing at which the stimuli were perceived as simultaneous (i.e., the point of subjective simultaneity, PSS) remained sensitive to sound intensity following hearing loss. Contrary to the TOJ task, hearing loss resulted in persistent impairments in asynchrony detection during the SJ task, such that a greater proportion of trials were now perceived as synchronous. 
Moreover, psychophysical testing found that noise-exposed rats had altered audiovisual synchrony perception, consistent with impaired audiovisual perceptual binding (e.g., an increase in the temporal window of integration on the right side of simultaneity; right temporal binding window (TBW)). Ultimately, our collective results show for the first time that adult-onset hearing loss leads to behavioral plasticity of audiovisual perception, characterized by a rapid recalibration of temporal sensitivity but a persistent impairment in the perceptual binding of audiovisual stimuli.
Collapse
Affiliation(s)
- Ashley L Schormans
- Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
| | - Brian L Allman
- Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
| |
Collapse
|
23
|
Finotti G, Migliorati D, Costantini M. Multisensory integration, body representation and hyperactivity of the immune system. Conscious Cogn 2018; 63:61-73. [PMID: 29957448 DOI: 10.1016/j.concog.2018.06.009] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/18/2017] [Revised: 06/05/2018] [Accepted: 06/06/2018] [Indexed: 10/28/2022]
Abstract
Multisensory stimuli are integrated over a delimited window of temporal asynchronies. This window is highly variable across individuals, but the origins of this variability are still not clear. We hypothesized that immune system functioning could partially account for this variability. In two experiments, we investigated the relationship between key aspects of multisensory integration in allergic participants and healthy controls. First, we tested the temporal constraint of multisensory integration, as measured by the temporal binding window. Second, we tested multisensory body representation, as indexed by the Rubber Hand Illusion (RHI). Results showed that allergic participants have a narrower temporal binding window and are less susceptible to the RHI than healthy controls. Overall, we provide evidence linking multisensory integration processes and the activity of the immune system. The present findings are discussed within the context of the effect of immune molecules on the brain mechanisms enabling multisensory integration and multisensory body representation.
Collapse
Affiliation(s)
- Gianluca Finotti
- Centre for Brain Science, Department of Psychology, University of Essex, United Kingdom; Department of Neuroscience, Imaging and Clinical Sciences, University G. d'Annunzio, Chieti, Italy; Institute for Advanced Biomedical Technologies - ITAB, University G. d'Annunzio, Chieti, Italy.
| | - Daniele Migliorati
- Department of Neuroscience, Imaging and Clinical Sciences, University G. d'Annunzio, Chieti, Italy; Institute for Advanced Biomedical Technologies - ITAB, University G. d'Annunzio, Chieti, Italy
| | - Marcello Costantini
- Centre for Brain Science, Department of Psychology, University of Essex, United Kingdom; Department of Neuroscience, Imaging and Clinical Sciences, University G. d'Annunzio, Chieti, Italy; Institute for Advanced Biomedical Technologies - ITAB, University G. d'Annunzio, Chieti, Italy.
| |
Collapse
|
24
|
Shayman CS, Seo JH, Oh Y, Lewis RF, Peterka RJ, Hullar TE. Relationship between vestibular sensitivity and multisensory temporal integration. J Neurophysiol 2018; 120:1572-1577. [PMID: 30020839 DOI: 10.1152/jn.00379.2018] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
A single event can generate asynchronous sensory cues due to variable encoding, transmission, and processing delays. To be interpreted as being associated in time, these cues must occur within a limited time window, referred to as a "temporal binding window" (TBW). We investigated the hypothesis that vestibular deficits could disrupt temporal visual-vestibular integration by determining the relationships between vestibular threshold and TBW in participants with normal vestibular function and with vestibular hypofunction. Vestibular perceptual thresholds to yaw rotation were characterized and compared with the TBWs obtained from participants who judged whether a suprathreshold rotation occurred before or after a brief visual stimulus. Vestibular thresholds ranged from 0.7 to 16.5 deg/s and TBWs ranged from 13.8 to 395 ms. Among all participants, TBW and vestibular thresholds were well correlated (R2 = 0.674, P < 0.001), with vestibular-deficient patients having higher thresholds and wider TBWs. Participants reported that the rotation onset needed to lead the light flash by an average of 80 ms for the visual and vestibular cues to be perceived as occurring simultaneously. The wide TBWs in vestibular-deficient participants compared with normal functioning participants indicate that peripheral sensory loss can lead to abnormal multisensory integration. A reduced ability to temporally combine sensory cues appropriately may provide a novel explanation for some symptoms reported by patients with vestibular deficits. Even among normal functioning participants, a high correlation between TBW and vestibular thresholds was observed, suggesting that these perceptual measurements are sensitive to small differences in vestibular function. NEW & NOTEWORTHY While spatial visual-vestibular integration has been well characterized, the temporal integration of these cues is not well understood. 
The relationship between sensitivity to whole body rotation and duration of the temporal window of visual-vestibular integration was examined using psychophysical techniques. These parameters were highly correlated for those with normal vestibular function and for patients with vestibular hypofunction. Reduced temporal integration performance in patients with vestibular hypofunction may explain some symptoms associated with vestibular loss.
Collapse
Affiliation(s)
- Corey S Shayman
- Department of Otolaryngology-Head and Neck Surgery, Oregon Health and Science University, Portland, Oregon
| | - Jae-Hyun Seo
- Department of Otolaryngology-Head and Neck Surgery, Oregon Health and Science University, Portland, Oregon; Department of Otolaryngology-Head and Neck Surgery, The Catholic University of Korea, Seoul, Republic of Korea
| | - Yonghee Oh
- Department of Otolaryngology-Head and Neck Surgery, Oregon Health and Science University, Portland, Oregon
| | - Richard F Lewis
- Department of Otolaryngology, Harvard Medical School, Boston, Massachusetts; Department of Neurology, Harvard Medical School, Boston, Massachusetts; Jenks Vestibular Physiology Laboratory, Massachusetts Eye and Ear Infirmary, Boston, Massachusetts
| | - Robert J Peterka
- National Center for Rehabilitative Auditory Research-VA Portland Health Care System, Portland, Oregon; Department of Neurology, Oregon Health and Science University, Portland, Oregon
| | - Timothy E Hullar
- Department of Otolaryngology-Head and Neck Surgery, Oregon Health and Science University, Portland, Oregon
| |
Collapse
|
25
|
Stevenson RA, Sheffield SW, Butera IM, Gifford RH, Wallace MT. Multisensory Integration in Cochlear Implant Recipients. Ear Hear 2018; 38:521-538. [PMID: 28399064 DOI: 10.1097/aud.0000000000000435] [Citation(s) in RCA: 47] [Impact Index Per Article: 7.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
Abstract
Speech perception is inherently a multisensory process involving integration of auditory and visual cues. Multisensory integration in cochlear implant (CI) recipients is a unique circumstance in that the integration occurs after auditory deprivation and the provision of hearing via the CI. Despite the clear importance of multisensory cues for perception, in general, and for speech intelligibility, specifically, the topic of multisensory perceptual benefits in CI users has only recently begun to emerge as an area of inquiry. We review the research that has been conducted on multisensory integration in CI users to date and suggest a number of areas needing further research. The overall pattern of results indicates that many CI recipients show at least some perceptual gain that can be attributable to multisensory integration. The extent of this gain, however, varies based on a number of factors, including age of implantation and specific task being assessed (e.g., stimulus detection, phoneme perception, word recognition). Although both children and adults with CIs obtain audiovisual benefits for phoneme, word, and sentence stimuli, neither group shows demonstrable gain for suprasegmental feature perception. Additionally, only early-implanted children and the highest performing adults obtain audiovisual integration benefits similar to individuals with normal hearing. Increasing age of implantation in children is associated with poorer gains resultant from audiovisual integration, suggesting a sensitive period in development for the brain networks that subserve these integrative functions, as well as length of auditory experience. This finding highlights the need for early detection of and intervention for hearing loss, not only in terms of auditory perception, but also in terms of the behavioral and perceptual benefits of audiovisual processing. 
Importantly, patterns of auditory, visual, and audiovisual responses suggest that underlying integrative processes may be fundamentally different between CI users and typical-hearing listeners. Future research, particularly in low-level processing tasks such as signal detection will help to further assess mechanisms of multisensory integration for individuals with hearing loss, both with and without CIs.
Collapse
Affiliation(s)
- Ryan A Stevenson
- Department of Psychology, University of Western Ontario, London, Ontario, Canada; Brain and Mind Institute, University of Western Ontario, London, Ontario, Canada; Walter Reed National Military Medical Center, Audiology and Speech Pathology Center, London, Ontario, Canada; Vanderbilt Brain Institute, Nashville, Tennessee; Vanderbilt Kennedy Center, Nashville, Tennessee; Department of Psychology, Vanderbilt University, Nashville, Tennessee; Department of Psychiatry, Vanderbilt University Medical Center, Nashville, Tennessee; and Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, Tennessee
| | | | | | | | | |
Collapse
|
26
|
Brooks CJ, Chan YM, Anderson AJ, McKendrick AM. Audiovisual Temporal Perception in Aging: The Role of Multisensory Integration and Age-Related Sensory Loss. Front Hum Neurosci 2018; 12:192. [PMID: 29867415 PMCID: PMC5954093 DOI: 10.3389/fnhum.2018.00192] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/27/2017] [Accepted: 04/20/2018] [Indexed: 11/26/2022] Open
Abstract
Within each sensory modality, age-related deficits in temporal perception contribute to the difficulties older adults experience when performing everyday tasks. Since perceptual experience is inherently multisensory, older adults also face the added challenge of appropriately integrating or segregating the auditory and visual cues present in our dynamic environment into coherent representations of distinct objects. As such, many studies have investigated how older adults perform when integrating temporal information across audition and vision. This review covers both direct judgments about temporal information (the sound-induced flash illusion, temporal order, perceived synchrony, and temporal rate discrimination) and judgments regarding stimuli containing temporal information (the audiovisual bounce effect and speech perception). Although an age-related increase in integration has been demonstrated on a variety of tasks, research specifically investigating the ability of older adults to integrate temporal auditory and visual cues has produced disparate results. In this short review, we explore what factors could underlie these divergent findings. We conclude that both task-specific differences and age-related sensory loss play a role in the reported disparity in age-related effects on the integration of auditory and visual temporal information.
Affiliation(s)
- Cassandra J Brooks, Department of Optometry and Vision Sciences, The University of Melbourne, Melbourne, VIC, Australia
- Yu Man Chan, Department of Optometry and Vision Sciences, The University of Melbourne, Melbourne, VIC, Australia
- Andrew J Anderson, Department of Optometry and Vision Sciences, The University of Melbourne, Melbourne, VIC, Australia
- Allison M McKendrick, Department of Optometry and Vision Sciences, The University of Melbourne, Melbourne, VIC, Australia
27
Stimulus duration has little effect on auditory, visual and audiovisual temporal order judgement. Exp Brain Res 2018; 236:1273-1282. [DOI: 10.1007/s00221-018-5218-2] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Received: 07/06/2017] [Accepted: 02/22/2018] [Indexed: 10/17/2022]
28
Myers MH, Iannaccone A, Bidelman GM. A pilot investigation of audiovisual processing and multisensory integration in patients with inherited retinal dystrophies. BMC Ophthalmol 2017; 17:240. [PMID: 29212538] [PMCID: PMC5719743] [DOI: 10.1186/s12886-017-0640-y] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.1] [Received: 07/05/2017] [Accepted: 11/29/2017] [Indexed: 11/10/2022]
Abstract
BACKGROUND In this study, we examined audiovisual (AV) processing in normal and visually impaired individuals who exhibit partial loss of vision due to inherited retinal dystrophies (IRDs). METHODS Two groups were analyzed for this pilot study: Group 1 was composed of IRD participants: two with autosomal dominant retinitis pigmentosa (RP), two with autosomal recessive cone-rod dystrophy (CORD), and two with the related complex disorder, Bardet-Biedl syndrome (BBS); Group 2 was composed of 15 non-IRD participants (controls). Audiovisual looming and receding stimuli (conveying perceptual motion) were used to assess the cortical processing and integration of unimodal (A or V) and multimodal (AV) sensory cues. Electroencephalography (EEG) was used to simultaneously resolve the temporal and spatial characteristics of AV processing and assess differences in neural responses between groups. Measurement of AV integration was accomplished via quantification of the EEG's spectral power and event-related brain potentials (ERPs). RESULTS Results show that IRD individuals exhibit reduced AV integration for concurrent audio and visual (AV) stimuli but increased brain activity during the unimodal A (but not V) presentation. This was corroborated in behavioral responses, where IRD patients showed slower and less accurate judgments of AV and V stimuli but more accurate responses in the A-alone condition. CONCLUSIONS Collectively, our findings imply a neural compensation from auditory sensory brain areas due to visual deprivation.
Affiliation(s)
- Mark H Myers, Department of Anatomy and Neurobiology, University of Tennessee Health Sciences Center, Memphis, TN 38163, USA
- Alessandro Iannaccone, Department of Ophthalmology, Center for Retinal Degenerations and Ophthalmic Genetic Diseases, Duke University School of Medicine, Durham, NC, USA
- Gavin M Bidelman, Department of Anatomy and Neurobiology, University of Tennessee Health Sciences Center, Memphis, TN 38163, USA; School of Communication Sciences & Disorders, University of Memphis, Memphis, TN, USA; Institute for Intelligent Systems, University of Memphis, Memphis, TN, USA
29
Abstract
Simultaneity judgments were used to measure temporal binding windows (TBW) for brief binaural events (changes in interaural time and/or level differences [ITD and ILD]) and test the hypothesis that ITD and ILD contribute to perception via separate sensory dimensions subject to binding via slow (100+ ms)—presumably cortical—mechanisms as in multisensory TBW. Stimuli were continuous low-frequency noises that included two brief shifts of either type (ITD or ILD), both of which are heard as lateral position changes. TBW for judgments within a single cue dimension were narrower for ITD (mean = 444 ms) than ILD (807 ms). TBW for judgments across cue dimensions (i.e., one ITD shift and one ILD shift) were similar to within-cue ILD (778 ms). The results contradict the original hypothesis, in that cross-cue comparisons were no slower than within-cue ILD comparisons. Rather, the wide TBW values—consistent with previous estimates of multisensory TBW—suggest slow integrative processing for both types of judgments. Narrower TBW for ITD than ILD judgments suggests important cue-specific differences in the neural mechanisms or the perceptual correlates of integration across binaural-cue dimensions.
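The temporal binding windows reported above are typically estimated by fitting a Gaussian to the proportion of "simultaneous" responses as a function of the interval between the two events, with the TBW read off as the width of the fitted curve. A minimal sketch of that procedure, using synthetic data and a plain grid search rather than the authors' actual fitting code:

```python
import math

def gauss(soa, mu, sigma):
    """Predicted proportion of 'simultaneous' responses at a given SOA (ms)."""
    return math.exp(-(soa - mu) ** 2 / (2 * sigma ** 2))

def fit_tbw(soas, p_sync):
    """Grid-search least-squares fit; returns (mu, sigma, fwhm).
    The full width at half maximum of the fitted Gaussian is one
    common operational definition of the TBW."""
    best = None
    for mu in range(-200, 201, 10):        # point of subjective simultaneity, ms
        for sigma in range(50, 801, 10):   # window-width parameter, ms
            err = sum((gauss(s, mu, sigma) - p) ** 2
                      for s, p in zip(soas, p_sync))
            if best is None or err < best[0]:
                best = (err, mu, sigma)
    _, mu, sigma = best
    fwhm = 2 * math.sqrt(2 * math.log(2)) * sigma
    return mu, sigma, fwhm

# Synthetic observer: true PSS = 30 ms, true sigma = 190 ms (illustrative values)
soas = list(range(-400, 401, 50))
p = [gauss(s, 30, 190) for s in soas]
mu, sigma, tbw = fit_tbw(soas, p)
```

With noiseless synthetic data the grid search recovers the generating parameters exactly; real fits would use maximum likelihood on binomial response counts.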
30
Stevenson RA, Toulmin JK, Youm A, Besney RMA, Schulz SE, Barense MD, Ferber S. Increases in the autistic trait of attention to detail are associated with decreased multisensory temporal adaptation. Sci Rep 2017; 7:14354. [PMID: 29085016] [PMCID: PMC5662613] [DOI: 10.1038/s41598-017-14632-1] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.0] [Received: 06/15/2017] [Accepted: 10/12/2017] [Indexed: 11/09/2022]
Abstract
Recent empirical evidence suggests that autistic individuals perceive the world differently than their typically-developed peers. One theoretical account, the predictive coding hypothesis, posits that autistic individuals show a decreased reliance on previous perceptual experiences, which may relate to autism symptomatology. We tested this through a well-characterized, audiovisual statistical-learning paradigm in which typically-developed participants were first adapted to consistent temporal relationships between audiovisual stimulus pairs (audio-leading, synchronous, visual-leading) and then performed a simultaneity judgement task with audiovisual stimulus pairs varying in temporal offset from auditory-leading to visual-leading. Following exposure to the visual-leading adaptation phase, participants' perception of synchrony was biased towards visual-leading presentations, reflecting the statistical regularities of their previously experienced environment. Importantly, the strength of adaptation was significantly related to the level of autistic traits that the participant exhibited, measured by the Autism Quotient (AQ). This was specific to the Attention to Detail subscale of the AQ that assesses the perceptual propensity to focus on fine-grain aspects of sensory input at the expense of more integrative perceptions. More severe Attention to Detail was related to weaker adaptation. These results support the predictive coding framework, and suggest that changes in sensory perception commonly reported in autism may contribute to autistic symptomatology.
Affiliation(s)
- Ryan A Stevenson, Western University, Department of Psychology, London, ON, Canada; Western University, Brain and Mind Institute, London, ON, Canada; Western University, Program in Neuroscience, London, ON, Canada; Western University, Department of Psychiatry, London, ON, Canada; York University, Centre for Vision Research, Toronto, ON, Canada
- Jennifer K Toulmin, The University of Toronto, Department of Psychology, Toronto, ON, Canada
- Ariana Youm, The University of Toronto, Department of Psychology, Toronto, ON, Canada
- Samantha E Schulz, Western University, Department of Psychology, London, ON, Canada; Western University, Brain and Mind Institute, London, ON, Canada
- Morgan D Barense, The University of Toronto, Department of Psychology, Toronto, ON, Canada; The Rotman Research Institute, Toronto, ON, Canada
- Susanne Ferber, The University of Toronto, Department of Psychology, Toronto, ON, Canada; The Rotman Research Institute, Toronto, ON, Canada
31
Abstract
Purpose of Review The integration of information across sensory modalities into unified percepts is a fundamental sensory process upon which a multitude of cognitive processes are based. We review the body of literature exploring aging-related changes in audiovisual integration published over the last five years. Specifically, we review the impact of changes in temporal processing, the influence of the effectiveness of sensory inputs, the role of working memory, and the newer studies of intra-individual variability during these processes. Recent Findings Work in the last five years on bottom-up influences on sensory perception has garnered significant attention. Temporal processing, a driving factor of multisensory integration, has now been shown to decouple from multisensory integration in aging, despite their joint decline with age. The impact of stimulus effectiveness also changes with age: older adults show maximal benefit from multisensory gain at high signal-to-noise ratios. Following sensory decline, high working memory capacity has now been shown to be somewhat of a protective factor against age-related declines in audiovisual speech perception, particularly in noise. Finally, newer research is emerging that focuses on the general intra-individual variability observed with aging. Summary Overall, the studies of the past five years have replicated and expanded on previous work highlighting the role of bottom-up sensory changes with aging and their influence on audiovisual integration, as well as the top-down influence of working memory.
Affiliation(s)
- Sarah H Baum, Department of Psychology, University of Washington
- Ryan Stevenson, Department of Psychology, Western University; Brain and Mind Institute, Western University; Department of Psychiatry, Schulich School of Medicine and Dentistry, Western University; Program in Neuroscience, Schulich School of Medicine and Dentistry, Western University; Centre for Vision Research, York University
32
Targher S, Micciolo R, Occelli V, Zampini M. The Role of Temporal Disparity on Audiovisual Integration in Low-Vision Individuals. Perception 2017; 46:1356-1370. [PMID: 28718747] [DOI: 10.1177/0301006617720124] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.7] [Indexed: 11/17/2022]
Abstract
Recent findings have shown that sounds improve visual detection in low-vision individuals when the auditory and visual stimuli are presented simultaneously and from the same spatial position. The present study investigates the temporal aspects of this previously reported audiovisual enhancement effect. Low-vision participants were asked to detect the presence of a visual stimulus (yes/no task) presented either alone or together with an auditory stimulus at different stimulus onset asynchronies (SOAs). In the first experiment, the sound was presented either simultaneously with or before the visual stimulus (i.e., SOAs of 0, 100, 250, and 400 ms). The results show that the presence of a task-irrelevant auditory stimulus produced a significant visual detection enhancement in all the conditions. In the second experiment, the sound was either synchronized with, or randomly preceded or lagged behind, the visual stimulus (i.e., SOAs of 0, ±250, and ±400 ms). The visual detection enhancement was reduced in magnitude and limited to the synchronous condition and to the condition in which the sound was presented 250 ms before the visual stimulus. Taken together, the evidence of the present study suggests that audiovisual interaction in low-vision individuals is highly modulated by top-down mechanisms.
Affiliation(s)
- Stefano Targher, Center for Mind/Brain Sciences (CIMeC), University of Trento, Rovereto, TN, Italy
- Rocco Micciolo, Department of Psychology and Cognitive Sciences, University of Trento, Rovereto, TN, Italy
- Valeria Occelli, Center for Mind/Brain Sciences (CIMeC), University of Trento, Rovereto, TN, Italy
- Massimiliano Zampini, Center for Mind/Brain Sciences (CIMeC), University of Trento, Rovereto, TN, Italy; Department of Psychology and Cognitive Sciences, University of Trento, Rovereto, TN, Italy
33
Stevenson RA, Baum SH, Krueger J, Newhouse PA, Wallace MT. Links between temporal acuity and multisensory integration across life span. J Exp Psychol Hum Percept Perform 2017; 44:106-116. [PMID: 28447850] [DOI: 10.1037/xhp0000424] [Citation(s) in RCA: 25] [Impact Index Per Article: 3.6] [Indexed: 11/08/2022]
Abstract
The temporal relationship between individual pieces of information from the different sensory modalities is one of the stronger cues to integrate such information into a unified perceptual gestalt, conveying numerous perceptual and behavioral advantages. Temporal acuity, however, varies greatly over the life span. It has previously been hypothesized that changes in temporal acuity in both development and healthy aging may thus play a key role in integrative abilities. This study tested the temporal acuity of 138 individuals ranging in age from 5 to 80. Temporal acuity and multisensory integration abilities were tested both within and across modalities (audition and vision) with simultaneity judgment and temporal order judgment tasks. We observed that temporal acuity, both within and across modalities, improved throughout development into adulthood and subsequently declined with healthy aging, as did the ability to integrate multisensory speech information. Of importance, throughout development, temporal acuity of simple stimuli (i.e., flashes and beeps) predicted individuals' abilities to integrate more complex speech information. However, in the aging population, although temporal acuity declined with healthy aging and was accompanied by declines in integrative abilities, temporal acuity was not able to predict integration at the individual level. Together, these results suggest that the impact of temporal acuity on multisensory integration varies throughout the life span. Although the maturation of temporal acuity drives the rise of multisensory integrative abilities during development, it is unable to account for changes in integrative abilities in healthy aging. The differential relationships between age, temporal acuity, and multisensory integration suggest an important role for experience in these processes.
Affiliation(s)
- Ryan A Stevenson, Department of Psychology, Brain and Mind Institute, University of Western Ontario
- Sarah H Baum, Department of Psychology, University of Washington
- Paul A Newhouse, Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center
34
Juan C, Cappe C, Alric B, Roby B, Gilardeau S, Barone P, Girard P. The variability of multisensory processes of natural stimuli in human and non-human primates in a detection task. PLoS One 2017; 12:e0172480. [PMID: 28212416] [PMCID: PMC5315309] [DOI: 10.1371/journal.pone.0172480] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Received: 09/08/2016] [Accepted: 02/06/2017] [Indexed: 11/19/2022]
Abstract
Background Behavioral studies in both humans and animals generally converge on the finding that multisensory integration shortens reaction times (RTs) in comparison to unimodal stimulation. These multisensory effects depend on diverse conditions, of which the most studied are spatial and temporal congruence. Further, most studies use relatively simple stimuli, while in everyday life we are confronted with a large variety of complex stimulations constantly changing our attentional focus over time, a modality switch that can impact stimulus detection. In the present study, we examined the potential sources of variability in reaction times and multisensory gains with respect to the intrinsic features of a large set of natural stimuli. Methodology/Principal findings Rhesus macaque monkeys and human subjects performed a simple audiovisual stimulus detection task in which a large collection of unimodal and bimodal natural stimuli with semantic specificities was presented at different saliencies. Although we were able to reproduce the well-established redundant signal effect, we failed to reveal a systematic violation of the race model, which is considered to demonstrate multisensory integration. In both monkeys and humans, our study revealed a large range of multisensory gains, with negative and positive values. While modality switch has clear effects on reaction times, one of the main causes of the variability in multisensory gains appeared to be linked to the intrinsic physical parameters of the stimuli. Conclusion/Significance Based on the variability of multisensory benefits, our results suggest that the neuronal mechanisms responsible for the redundant signal effect (interactions vs. integration) are highly dependent on stimulus complexity, suggesting different contributions of uni- and multisensory brain regions. Further, in a simple detection task, the semantic value of individual stimuli tends to have no significant impact on task performance, although such effects are probably present in more cognitive tasks.
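The race-model test mentioned in this abstract compares the reaction-time distribution for bimodal trials against Miller's bound, the sum of the two unimodal cumulative distributions; the bimodal CDF exceeding that bound at any latency counts as a violation. A minimal sketch with made-up reaction times, not the study's data:

```python
def ecdf(sample, t):
    """Empirical cumulative probability P(RT <= t)."""
    return sum(rt <= t for rt in sample) / len(sample)

def race_model_violation(rt_a, rt_v, rt_av, times):
    """Largest positive deviation of the audiovisual CDF above Miller's
    bound min(1, F_A(t) + F_V(t)); a value > 0 indicates a violation,
    conventionally taken as evidence for multisensory integration."""
    deviations = [ecdf(rt_av, t) - min(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))
                  for t in times]
    return max(deviations)

# Made-up RTs (ms): bimodal responses faster than either unimodal condition
rt_a = [310, 330, 350, 370, 390]    # auditory-only trials
rt_v = [320, 340, 360, 380, 400]    # visual-only trials
rt_av = [240, 250, 260, 270, 280]   # audiovisual trials
violation = race_model_violation(rt_a, rt_v, rt_av, range(200, 401, 10))
```

In practice the test is run on quantile-matched CDFs and evaluated statistically across participants; this sketch only shows the core inequality.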
Affiliation(s)
- Cécile Juan, Cerco, CNRS UMR 5549, Toulouse, France; Université de Toulouse, UPS, Centre de Recherche Cerveau et Cognition, Toulouse, France
- Céline Cappe, Cerco, CNRS UMR 5549, Toulouse, France; Université de Toulouse, UPS, Centre de Recherche Cerveau et Cognition, Toulouse, France
- Baptiste Alric, Cerco, CNRS UMR 5549, Toulouse, France; Université de Toulouse, UPS, Centre de Recherche Cerveau et Cognition, Toulouse, France
- Benoit Roby, Cerco, CNRS UMR 5549, Toulouse, France; Université de Toulouse, UPS, Centre de Recherche Cerveau et Cognition, Toulouse, France
- Sophie Gilardeau, Cerco, CNRS UMR 5549, Toulouse, France; Université de Toulouse, UPS, Centre de Recherche Cerveau et Cognition, Toulouse, France
- Pascal Barone, Cerco, CNRS UMR 5549, Toulouse, France; Université de Toulouse, UPS, Centre de Recherche Cerveau et Cognition, Toulouse, France
- Pascal Girard, Cerco, CNRS UMR 5549, Toulouse, France; Université de Toulouse, UPS, Centre de Recherche Cerveau et Cognition, Toulouse, France; INSERM, Toulouse, France
35
Schormans AL, Scott KE, Vo AMQ, Tyker A, Typlt M, Stolzberg D, Allman BL. Audiovisual Temporal Processing and Synchrony Perception in the Rat. Front Behav Neurosci 2017; 10:246. [PMID: 28119580] [PMCID: PMC5222817] [DOI: 10.3389/fnbeh.2016.00246] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Received: 11/01/2016] [Accepted: 12/16/2016] [Indexed: 11/13/2022]
Abstract
Extensive research on humans has improved our understanding of how the brain integrates information from our different senses, and has begun to uncover the brain regions and large-scale neural activity that contributes to an observer’s ability to perceive the relative timing of auditory and visual stimuli. In the present study, we developed the first behavioral tasks to assess the perception of audiovisual temporal synchrony in rats. Modeled after the parameters used in human studies, separate groups of rats were trained to perform: (1) a simultaneity judgment task in which they reported whether audiovisual stimuli at various stimulus onset asynchronies (SOAs) were presented simultaneously or not; and (2) a temporal order judgment task in which they reported whether they perceived the auditory or visual stimulus to have been presented first. Furthermore, using in vivo electrophysiological recordings in the lateral extrastriate visual (V2L) cortex of anesthetized rats, we performed the first investigation of how neurons in the rat multisensory cortex integrate audiovisual stimuli presented at different SOAs. As predicted, rats (n = 7) trained to perform the simultaneity judgment task could accurately (~80%) identify synchronous vs. asynchronous (200 ms SOA) trials. Moreover, the rats judged trials at 10 ms SOA to be synchronous, whereas the majority (~70%) of trials at 100 ms SOA were perceived to be asynchronous. During the temporal order judgment task, rats (n = 7) perceived the synchronous audiovisual stimuli to be “visual first” for ~52% of the trials, and calculation of the smallest timing interval between the auditory and visual stimuli that could be detected in each rat (i.e., the just noticeable difference (JND)) ranged from 77 ms to 122 ms. Neurons in the rat V2L cortex were sensitive to the timing of audiovisual stimuli, such that spiking activity was greatest during trials when the visual stimulus preceded the auditory by 20–40 ms. 
Ultimately, given that our behavioral and electrophysiological results were consistent with studies conducted on human participants and previous recordings made in multisensory brain regions of different species, we suggest that the rat represents an effective model for studying audiovisual temporal synchrony at both the neuronal and perceptual level.
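The just noticeable difference (JND) reported for the temporal order judgment task is conventionally derived from a fitted psychometric function, e.g. half the distance between its 25% and 75% points, which for a cumulative logistic equals the slope parameter times ln 3. A minimal sketch with a synthetic observer (illustrative parameter values, not the study's data):

```python
import math

def p_visual_first(soa, pss, slope):
    """Cumulative logistic: probability of a 'visual first' report.
    Sign convention assumed here: soa > 0 means the visual stimulus led."""
    return 1.0 / (1.0 + math.exp(-(soa - pss) / slope))

def fit_toj(soas, p_obs):
    """Grid-search least-squares fit; returns (pss, jnd) in ms.
    JND = half the 25%-75% interval = slope * ln 3 for a logistic."""
    best = None
    for pss in range(-100, 101, 5):      # point of subjective simultaneity, ms
        for slope in range(10, 201, 5):  # psychometric slope parameter, ms
            err = sum((p_visual_first(s, pss, slope) - p) ** 2
                      for s, p in zip(soas, p_obs))
            if best is None or err < best[0]:
                best = (err, pss, slope)
    _, pss, slope = best
    return pss, slope * math.log(3)

# Synthetic observer: PSS = 10 ms, slope = 70 ms, giving a JND near 77 ms,
# comparable to the lower end of the range reported above
soas = list(range(-300, 301, 50))
p_obs = [p_visual_first(s, 10, 70) for s in soas]
pss, jnd = fit_toj(soas, p_obs)
```

Real analyses would fit by maximum likelihood on trial counts; the grid search here just makes the JND definition concrete.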
Affiliation(s)
- Ashley L Schormans, Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Kaela E Scott, Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Albert M Q Vo, Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Anna Tyker, Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Marei Typlt, Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Daniel Stolzberg, Department of Physiology and Pharmacology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Brian L Allman, Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
36
Stevenson RA, Park S, Cochran C, McIntosh LG, Noel JP, Barense MD, Ferber S, Wallace MT. The associations between multisensory temporal processing and symptoms of schizophrenia. Schizophr Res 2017; 179:97-103. [PMID: 27746052] [PMCID: PMC5463449] [DOI: 10.1016/j.schres.2016.09.035] [Citation(s) in RCA: 92] [Impact Index Per Article: 13.1] [Received: 08/23/2016] [Revised: 09/23/2016] [Accepted: 09/28/2016] [Indexed: 11/29/2022]
Abstract
Recent neurobiological accounts of schizophrenia have included an emphasis on changes in sensory processing. These sensory and perceptual deficits can have a cascading effect onto higher-level cognitive processes and clinical symptoms. One form of sensory dysfunction that has been consistently observed in schizophrenia is altered temporal processing. In this study, we investigated temporal processing within and across the auditory and visual modalities in individuals with schizophrenia (SCZ) and age-matched healthy controls. Individuals with SCZ showed auditory and visual temporal processing abnormalities, as well as multisensory temporal processing dysfunction that extended beyond that attributable to unisensory processing dysfunction. Most importantly, these multisensory temporal deficits were associated with the severity of hallucinations. This link between atypical multisensory temporal perception and clinical symptomatology suggests that clinical symptoms of schizophrenia may be at least partly a result of cascading effects from (multi)sensory disturbances. These results are discussed in terms of underlying neural bases and the possible implications for remediation.
Affiliation(s)
- Ryan A. Stevenson, The University of Western Ontario, Department of Psychology, London, ON, Canada; The University of Western Ontario, Brain and Mind Institute, London, ON, Canada
- Sohee Park, Vanderbilt University, Department of Psychology, Nashville, TN, USA
- Channing Cochran, Vanderbilt University, Department of Psychology, Nashville, TN, USA
- Jean-Paul Noel, Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Morgan D. Barense, University of Toronto, Department of Psychology, Toronto, ON, Canada; Rotman Research Institute, Toronto, ON, Canada
- Susanne Ferber, University of Toronto, Department of Psychology, Toronto, ON, Canada; Rotman Research Institute, Toronto, ON, Canada
- Mark T. Wallace, Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Vanderbilt University Medical Center, Department of Hearing and Speech Sciences, Nashville, TN, USA; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN, USA; Vanderbilt University, Department of Psychology, Nashville, TN, USA; Vanderbilt University Medical Center, Department of Psychiatry, Nashville, TN, USA
37
van Leeuwen TM, Trautmann-Lengsfeld SA, Wallace MT, Engel AK, Murray MM. Bridging the gap: Synaesthesia and multisensory processes. Neuropsychologia 2016; 88:1-4. [DOI: 10.1016/j.neuropsychologia.2016.06.007] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Indexed: 10/21/2022]
38
Interactions between space and effectiveness in human multisensory performance. Neuropsychologia 2016; 88:83-91. [PMID: 26826522] [DOI: 10.1016/j.neuropsychologia.2016.01.031] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.8] [Received: 09/25/2015] [Revised: 12/30/2015] [Accepted: 01/26/2016] [Indexed: 11/23/2022]
Abstract
Several stimulus factors are important in multisensory integration, including the spatial and temporal relationships of the paired stimuli as well as their effectiveness. Changes in these factors have been shown to dramatically change the nature and magnitude of multisensory interactions. Typically, these factors are considered in isolation, although there is a growing appreciation for the fact that they are likely to be strongly interrelated. Here, we examined interactions between two of these factors - spatial location and effectiveness - in dictating performance in the localization of an audiovisual target. A psychophysical experiment was conducted in which participants reported the perceived location of visual flashes and auditory noise bursts presented alone and in combination. Stimuli were presented at four spatial locations relative to fixation (0°, 30°, 60°, 90°) and at two intensity levels (high, low). Multisensory combinations were always spatially coincident and of the matching intensity (high-high or low-low). In responding to visual stimuli alone, localization accuracy decreased and response times (RTs) increased as stimuli were presented at more eccentric locations. In responding to auditory stimuli, performance was poorest at the 30° and 60° locations. For both visual and auditory stimuli, accuracy was greater and RTs were faster for more intense stimuli. For responses to visual-auditory stimulus combinations, performance enhancements were found at locations in which the unisensory performance was lowest, results concordant with the concept of inverse effectiveness. RTs for these multisensory presentations frequently violated race-model predictions, implying integration of these inputs, and a significant location-by-intensity interaction was observed. Performance gains under multisensory conditions were larger as stimuli were positioned at more peripheral locations, and this increase was most pronounced for the low-intensity conditions. 
These results provide strong support for the conclusion that the effects of stimulus location and effectiveness on multisensory integration are interdependent, with both contributing to the overall effectiveness of the stimuli in driving the resultant multisensory response.
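The inverse effectiveness described here is commonly quantified as the relative gain of the multisensory response over the best unimodal response (the Meredith and Stein convention), with the gain growing as unimodal effectiveness drops. A minimal sketch with made-up accuracy values:

```python
def multisensory_gain(acc_a, acc_v, acc_av):
    """Relative multisensory enhancement (%): how much the bimodal
    response exceeds the best unimodal response."""
    best_uni = max(acc_a, acc_v)
    return 100.0 * (acc_av - best_uni) / best_uni

# Made-up localization accuracies: the gain is larger when unimodal
# performance is poor, which is the signature of inverse effectiveness
gain_high = multisensory_gain(0.90, 0.85, 0.94)  # high-intensity stimuli
gain_low = multisensory_gain(0.50, 0.45, 0.70)   # low-intensity stimuli
```

Here the low-intensity pairing yields a 40% gain versus roughly 4% for the high-intensity pairing, mirroring the pattern the abstract reports at peripheral locations and low intensities.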