1
Yu L, Xu J. The Development of Multisensory Integration at the Neuronal Level. Adv Exp Med Biol 2024; 1437:153-172. [PMID: 38270859] [DOI: 10.1007/978-981-99-7611-9_10]
Abstract
Multisensory integration is a fundamental function of the brain. In the typical adult, the response of multisensory neurons to paired multisensory (e.g., audiovisual) cues is significantly more robust than the corresponding best unisensory response in many brain regions. Synthesizing sensory signals from multiple modalities can speed up sensory processing and improve the salience of outside events or objects. Despite its significance, multisensory integration has been shown not to be a neonatal feature of the brain. Neurons' ability to effectively combine multisensory information does not emerge immediately but develops gradually during early postnatal life (in cats, this requires approximately 4-12 weeks). Multisensory experience is critical for this developmental process. If animals are prevented from experiencing normal visual scenes or sounds (and are thereby deprived of the relevant multisensory experience), development of the corresponding integrative ability is blocked until the appropriate multisensory experience is obtained. This section summarizes the extant literature on the development of multisensory integration (mainly using the cat superior colliculus as a model), sensory-deprivation-induced cross-modal plasticity, and how sensory experience (sensory exposure and perceptual learning) leads to plastic change and the modification of neural circuits in cortical and subcortical areas.
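The enhancement criterion in this abstract is conventionally quantified as the percentage gain of the multisensory response over the best unisensory response, ME = (CM - SMmax) / SMmax x 100. A minimal sketch of this index, with illustrative response values (spikes per trial) that are not taken from the chapter:

```python
def multisensory_enhancement(cm_response: float, best_unisensory: float) -> float:
    """Percent multisensory enhancement: (CM - SMmax) / SMmax * 100."""
    if best_unisensory <= 0:
        raise ValueError("best unisensory response must be positive")
    return (cm_response - best_unisensory) / best_unisensory * 100.0

# A mature SC neuron: the audiovisual response exceeds the best unisensory response.
print(multisensory_enhancement(cm_response=15.0, best_unisensory=10.0))  # 50.0
# A naive neuron: the multisensory response is no greater than the best unisensory one.
print(multisensory_enhancement(cm_response=10.0, best_unisensory=10.0))  # 0.0
```

By this index, the experience-deprived neurons described above score at or near 0%, whereas typical mature SC neurons score well above it.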
Affiliation(s)
- Liping Yu
- Key Laboratory of Brain Functional Genomics (Ministry of Education and Shanghai), School of Life Sciences, East China Normal University, Shanghai, China.
- Jinghong Xu
- Key Laboratory of Brain Functional Genomics (Ministry of Education and Shanghai), School of Life Sciences, East China Normal University, Shanghai, China
2
Jiang H, Stanford TR, Rowland BA, Stein BE. Association Cortex Is Essential to Reverse Hemianopia by Multisensory Training. Cereb Cortex 2021; 31:5015-5023. [PMID: 34056645] [DOI: 10.1093/cercor/bhab138]
Abstract
Hemianopia induced by unilateral visual cortex lesions can be resolved by repeatedly exposing the blinded hemifield to auditory-visual stimuli. This rehabilitative "training" paradigm depends on mechanisms of multisensory plasticity that restore the lost visual responsiveness of multisensory neurons in the ipsilesional superior colliculus (SC) so that they can once again support vision in the blinded hemifield. These changes are thought to operate via the convergent visual and auditory signals relayed to the SC from association cortex (the anterior ectosylvian sulcus [AES], in cat). The present study tested this assumption by cryogenically deactivating ipsilesional AES in hemianopic, anesthetized cats during weekly multisensory training sessions. No signs of visual recovery were evident in this condition, even after providing animals with up to twice the number of training sessions required for effective rehabilitation. Subsequent training under the same conditions, but with AES active, reversed the hemianopia within the normal timeframe. These results indicate that the corticotectal circuit that is normally engaged in SC multisensory plasticity has to be operational for the brain to use visual-auditory experience to resolve hemianopia.
Affiliation(s)
- Huai Jiang
- Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Medical Center Boulevard, Winston-Salem, NC 27157, USA
- Terrence R Stanford
- Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Medical Center Boulevard, Winston-Salem, NC 27157, USA
- Benjamin A Rowland
- Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Medical Center Boulevard, Winston-Salem, NC 27157, USA
- Barry E Stein
- Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Medical Center Boulevard, Winston-Salem, NC 27157, USA
3
Smyre SA, Wang Z, Stein BE, Rowland BA. Multisensory enhancement of overt behavior requires multisensory experience. Eur J Neurosci 2021; 54:4514-4527. [PMID: 34013578] [DOI: 10.1111/ejn.15315]
Abstract
The superior colliculus (SC) is richly endowed with neurons that integrate cues from different senses to enhance their physiological responses and the overt behaviors they mediate. However, in the absence of experience with cross-modal combinations (e.g., visual-auditory), they fail to develop this characteristic multisensory capability: Their multisensory responses are no greater than their most effective unisensory responses. Presumably, this impairment in neural development would be reflected as corresponding impairments in SC-mediated behavioral capabilities such as detection and localization performance. Here, we tested that assumption directly in cats raised to adulthood in darkness. They, along with a normally reared cohort, were trained to approach brief visual or auditory stimuli. The animals were then tested with these stimuli individually and in combination under ambient light conditions consistent with their rearing conditions and home environment as well as under the opposite lighting condition. As expected, normally reared animals detected and localized the cross-modal combinations significantly better than their individual component stimuli. However, dark-reared animals showed significant defects in multisensory detection and localization performance. The results indicate that a physiological impairment in single multisensory SC neurons is predictive of an impairment in overt multisensory behaviors.
Affiliation(s)
- Scott A Smyre
- Wake Forest School of Medicine, Medical Center Boulevard, Winston-Salem, NC, USA
- Zhengyang Wang
- Wake Forest School of Medicine, Medical Center Boulevard, Winston-Salem, NC, USA
- Barry E Stein
- Wake Forest School of Medicine, Medical Center Boulevard, Winston-Salem, NC, USA
- Benjamin A Rowland
- Wake Forest School of Medicine, Medical Center Boulevard, Winston-Salem, NC, USA
4
Bean NL, Stein BE, Rowland BA. Stimulus value gates multisensory integration. Eur J Neurosci 2021; 53:3142-3159. [PMID: 33667027] [DOI: 10.1111/ejn.15167]
Abstract
The brain enhances its perceptual and behavioral decisions by integrating information from its multiple senses in what are believed to be optimal ways. This phenomenon of "multisensory integration" appears to be pre-conscious, effortless, and highly efficient. The present experiments examined whether experience could modify this seemingly automatic process. Cats were trained in a localization task in which congruent pairs of auditory-visual stimuli are normally integrated to enhance detection and orientation/approach performance. Consistent with the results of previous studies, animals more reliably detected and approached cross-modal pairs than their modality-specific component stimuli, regardless of whether the pairings were novel or familiar. However, when provided evidence that one of the modality-specific component stimuli had no value (it was not rewarded) animals ceased integrating it with other cues, and it lost its previous ability to enhance approach behaviors. Cross-modal pairings involving that stimulus failed to elicit enhanced responses even when the paired stimuli were congruent and mutually informative. However, the stimulus regained its ability to enhance responses when it was associated with reward. This suggests that experience can selectively block access of stimuli (i.e., filter inputs) to the multisensory computation. Because this filtering process results in the loss of useful information, its operation and behavioral consequences are not optimal. Nevertheless, the process can be of substantial value in natural environments, rich in dynamic stimuli, by using experience to minimize the impact of stimuli unlikely to be of biological significance, and reducing the complexity of the problem of matching signals across the senses.
Affiliation(s)
- Naomi L Bean
- Wake Forest School of Medicine, Winston-Salem, NC, USA
- Barry E Stein
- Wake Forest School of Medicine, Winston-Salem, NC, USA
5
Oess T, Löhr MPR, Schmid D, Ernst MO, Neumann H. From Near-Optimal Bayesian Integration to Neuromorphic Hardware: A Neural Network Model of Multisensory Integration. Front Neurorobot 2020; 14:29. [PMID: 32499692] [PMCID: PMC7243343] [DOI: 10.3389/fnbot.2020.00029]
Abstract
While interacting with the world, our senses and nervous system are constantly challenged to identify the origin and coherence of sensory input signals of various intensities. This problem becomes apparent when stimuli from different modalities need to be combined, e.g., to find out whether an auditory stimulus and a visual stimulus belong to the same object. To cope with this problem, humans and most other animal species are equipped with complex neural circuits that enable fast and reliable combination of signals from various sensory organs. This multisensory integration starts in the brain stem to facilitate unconscious reflexes and continues on ascending pathways to cortical areas for further processing. To investigate the underlying mechanisms in detail, we developed a canonical neural network model for multisensory integration that reflects neurophysiological findings. For example, the model comprises multisensory integration neurons that receive excitatory and inhibitory inputs from unimodal auditory and visual neurons, respectively, as well as feedback from cortex. Such feedback projections facilitate multisensory response enhancement and lead to the commonly observed inverse effectiveness of neural activity in multisensory neurons. Two versions of the model are implemented: a rate-based neural network model for qualitative analysis, and a variant that employs spiking neurons for deployment on neuromorphic hardware. This dual approach allows us to create an evaluation environment with the ability to test model performance with real-world inputs. As a platform for deployment, we chose IBM's neurosynaptic chip TrueNorth. Behavioral studies in humans indicate that temporal and spatial offsets, as well as the reliability of stimuli, are critical parameters for integrating signals from different modalities. The model reproduces such behavior in experiments with different sets of stimuli; in particular, model performance for stimuli with varying spatial offset is tested. In addition, we demonstrate that, owing to the emergent properties of the network dynamics, model performance is close to optimal Bayesian inference for the integration of multimodal sensory signals. Furthermore, the implementation of the model on a neuromorphic processing chip enables a complete neuromorphic processing cascade from sensory perception to multisensory integration, and the evaluation of model performance for real-world inputs.
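The Bayesian benchmark referenced in this abstract is standard reliability-weighted cue combination: each cue is weighted by its inverse variance, and the fused variance is never larger than either unimodal variance. A sketch under that standard formulation, with illustrative numbers not taken from the paper:

```python
def fuse(mu_v: float, var_v: float, mu_a: float, var_a: float):
    """Maximum-likelihood fusion of two Gaussian cues.

    Each cue's weight is its reliability (inverse variance) normalized by
    the total reliability; the fused variance is the inverse total reliability.
    """
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_a)
    mu = w_v * mu_v + (1.0 - w_v) * mu_a
    var = 1.0 / (1.0 / var_v + 1.0 / var_a)
    return mu, var

# Visual cue is 4x more reliable than the auditory cue, so the fused
# estimate lands much closer to the visual location.
mu, var = fuse(mu_v=0.0, var_v=1.0, mu_a=4.0, var_a=4.0)
print(mu, var)  # both approximately 0.8
```

Note that the fused variance (0.8) is below the better unimodal variance (1.0), which is the hallmark of near-optimal integration that behavioral studies test for.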
Affiliation(s)
- Timo Oess
- Applied Cognitive Psychology, Institute of Psychology and Education, Ulm University, Ulm, Germany
- Maximilian P R Löhr
- Vision and Perception Science Lab, Institute of Neural Information Processing, Ulm University, Ulm, Germany
- Daniel Schmid
- Vision and Perception Science Lab, Institute of Neural Information Processing, Ulm University, Ulm, Germany
- Marc O Ernst
- Applied Cognitive Psychology, Institute of Psychology and Education, Ulm University, Ulm, Germany
- Heiko Neumann
- Vision and Perception Science Lab, Institute of Neural Information Processing, Ulm University, Ulm, Germany
6
Xu X, Hanganu-Opatz IL, Bieler M. Cross-Talk of Low-Level Sensory and High-Level Cognitive Processing: Development, Mechanisms, and Relevance for Cross-Modal Abilities of the Brain. Front Neurorobot 2020; 14:7. [PMID: 32116637] [PMCID: PMC7034303] [DOI: 10.3389/fnbot.2020.00007]
Abstract
The emergence of cross-modal learning capabilities requires the interaction of neural areas accounting for sensory and cognitive processing. Convergence of multiple sensory inputs is observed in low-level sensory cortices including primary somatosensory (S1), visual (V1), and auditory cortex (A1), as well as in high-level areas such as prefrontal cortex (PFC). Evidence shows that local neural activity and functional connectivity between sensory cortices participate in cross-modal processing. However, little is known about the functional interplay between neural areas underlying sensory and cognitive processing required for cross-modal learning capabilities across life. Here we review our current knowledge on the interdependence of low- and high-level cortices for the emergence of cross-modal processing in rodents. First, we summarize the mechanisms underlying the integration of multiple senses and how cross-modal processing in primary sensory cortices might be modified by top-down modulation of the PFC. Second, we examine the critical factors and developmental mechanisms that account for the interaction between neuronal networks involved in sensory and cognitive processing. Finally, we discuss the applicability and relevance of cross-modal processing for brain-inspired intelligent robotics. An in-depth understanding of the factors and mechanisms controlling cross-modal processing might inspire the refinement of robotic systems by better mimicking neural computations.
Affiliation(s)
- Xiaxia Xu
- Developmental Neurophysiology, Center for Molecular Neurobiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Ileana L Hanganu-Opatz
- Developmental Neurophysiology, Center for Molecular Neurobiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Malte Bieler
- Laboratory for Neural Computation, Institute of Basic Medical Sciences, University of Oslo, Oslo, Norway
7
Sugiyama S, Kinukawa T, Takeuchi N, Nishihara M, Shioiri T, Inui K. Tactile Cross-Modal Acceleration Effects on Auditory Steady-State Response. Front Integr Neurosci 2019; 13:72. [PMID: 31920574] [PMCID: PMC6927992] [DOI: 10.3389/fnint.2019.00072]
Abstract
In the sensory cortex, cross-modal interaction occurs during the early cortical stages of processing; however, its effect on the speed of neuronal activity remains unclear. In this study, we used magnetoencephalography (MEG) to investigate whether tactile stimulation influences auditory steady-state responses (ASSRs). To this end, a 0.5-ms electrical pulse was randomly presented to the dorsum of the left or right hand of 12 healthy volunteers at 700 ms while a train of 25-ms pure tones was applied to the left or right side at 75 dB for 1,200 ms. Peak latencies of the 40-Hz ASSR were measured. Our results indicated that tactile stimulation significantly shortened subsequent ASSR latency. This cross-modal effect was observed from approximately 50 ms to 125 ms after the onset of tactile stimulation. The somatosensory information that appeared to converge on the auditory system may have arisen during the early processing stages, with the reduced ASSR latency indicating that a new sensory event from the cross-modal inputs served to increase the speed of ongoing sensory processing. Collectively, our findings indicate that ASSR latency changes are a sensitive index of accelerated processing.
Affiliation(s)
- Shunsuke Sugiyama
- Department of Psychiatry and Psychotherapy, Gifu University Graduate School of Medicine, Gifu, Japan
- Tomoaki Kinukawa
- Department of Anesthesiology, Nagoya University Graduate School of Medicine, Nagoya, Japan
- Makoto Nishihara
- Multidisciplinary Pain Center, Aichi Medical University, Nagakute, Japan
- Toshiki Shioiri
- Department of Psychiatry and Psychotherapy, Gifu University Graduate School of Medicine, Gifu, Japan
- Koji Inui
- Department of Functioning and Disability, Institute for Developmental Research, Kasugai, Japan
8
Volpe G, Gori M. Multisensory Interactive Technologies for Primary Education: From Science to Technology. Front Psychol 2019; 10:1076. [PMID: 31316410] [PMCID: PMC6611336] [DOI: 10.3389/fpsyg.2019.01076]
Abstract
While technology is increasingly used in the classroom, we observe at the same time that making teachers and students accept it is more difficult than expected. In this work, we focus on multisensory technologies and we argue that the intersection between current challenges in pedagogical practices and recent scientific evidence opens novel opportunities for these technologies to bring a significant benefit to the learning process. In our view, multisensory technologies are ideal for effectively supporting an embodied and enactive pedagogical approach exploiting the best-suited sensory modality to teach a concept at school. This represents a great opportunity for designing technologies, which are both grounded on robust scientific evidence and tailored to the actual needs of teachers and students. Based on our experience in technology-enhanced learning projects, we propose six golden rules we deem important for catching this opportunity and fully exploiting it.
Affiliation(s)
- Gualtiero Volpe
- Casa Paganini-InfoMus, DIBRIS, University of Genoa, Genoa, Italy
- Monica Gori
- U-Vip Unit, Istituto Italiano di Tecnologia, Genoa, Italy
9
Cross-Modal Competition: The Default Computation for Multisensory Processing. J Neurosci 2018; 39:1374-1385. [PMID: 30573648] [DOI: 10.1523/jneurosci.1806-18.2018]
Abstract
Mature multisensory superior colliculus (SC) neurons integrate information across the senses to enhance their responses to spatiotemporally congruent cross-modal stimuli. The development of this neurotypic feature of SC neurons requires experience with cross-modal cues. In the absence of such experience the response of an SC neuron to congruent cross-modal cues is no more robust than its response to the most effective component cue. This "default" or "naive" state is believed to be one in which cross-modal signals do not interact. The present results challenge this characterization by identifying interactions between visual-auditory signals in male and female cats reared without visual-auditory experience. By manipulating the relative effectiveness of the visual and auditory cross-modal cues that were presented to each of these naive neurons, an active competition between cross-modal signals was revealed. Although contrary to current expectations, this result is explained by a neuro-computational model in which the default interaction is mutual inhibition. These findings suggest that multisensory neurons at all maturational stages are capable of some form of multisensory integration, and use experience with cross-modal stimuli to transition from their initial state of competition to their mature state of cooperation. By doing so, they develop the ability to enhance the physiological salience of cross-modal events, thereby increasing their impact on the sensorimotor circuitry of the SC and the likelihood that biologically significant events will elicit SC-mediated overt behaviors.
SIGNIFICANCE STATEMENT: The present results demonstrate that the default mode of multisensory processing in the superior colliculus is competition, not non-integration as previously characterized. A neuro-computational model explains how these competitive dynamics can be implemented via mutual inhibition, and how this default mode is superseded by the emergence of cooperative interactions during development.
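The competition-versus-cooperation contrast in this abstract can be caricatured with a two-unit rate model in which the sign of the cross-modal coupling determines the interaction. The equations below are a toy illustration of that idea, not the authors' actual model:

```python
def steady_response(drive_v: float, drive_a: float, w: float) -> float:
    """Fixed point of a two-unit rate model r = max(0, drive + w * r_other).

    Negative w -> mutual inhibition (the naive, competitive default);
    positive w -> mutual excitation (the mature, cooperative state).
    Converges for |w| < 1; returns the summed multisensory output.
    """
    r_v, r_a = drive_v, drive_a
    for _ in range(200):  # iterate the coupled updates to convergence
        r_v = max(0.0, drive_v + w * r_a)
        r_a = max(0.0, drive_a + w * r_v)
    return r_v + r_a

uni = steady_response(10.0, 0.0, 0.0)      # one cue alone
naive = steady_response(10.0, 10.0, -0.4)  # competition: paired cues suppress each other
mature = steady_response(10.0, 10.0, 0.3)  # cooperation: paired cues are enhanced
print(uni, naive < 20.0, mature > 20.0)  # 10.0 True True
```

Under mutual inhibition the paired response falls short of the sum of the unimodal drives, which is why a naive neuron's cross-modal response looks no better than its best unisensory one; flipping the coupling sign reproduces the cooperative, enhanced regime.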
10
Effect of acceleration of auditory inputs on the primary somatosensory cortex in humans. Sci Rep 2018; 8:12883. [PMID: 30150686] [PMCID: PMC6110726] [DOI: 10.1038/s41598-018-31319-3]
Abstract
Cross-modal interaction occurs during the early stages of processing in the sensory cortex; however, its effect on neuronal activity speed remains unclear. We used magnetoencephalography to investigate whether auditory stimulation influences the initial cortical activity in the primary somatosensory cortex. A 25-ms pure tone was randomly presented to the left or right side of healthy volunteers at 1000 ms while electrical pulses were applied to the left or right median nerve at 20 Hz for 1500 ms; a pulse train was used because no cross-modal effect was elicited by a single pulse. The latency of the N20m, originating from Brodmann's area 3b, was measured for each pulse. The auditory stimulation significantly shortened the N20m latency at 1050 and 1100 ms. This reduction in N20m latency was identical for ipsilateral and contralateral sounds at both latency points. Therefore, somatosensory-auditory interaction, such as input to area 3b from the thalamus, occurred during the early stages of synaptic transmission. Auditory information that converged on the somatosensory system was considered to have arisen from the early stages of the feedforward pathway. The acceleration of information processing through cross-modal interaction seems to be due, at least in part, to faster processing in the sensory cortex.
11
Minakata K, Gondan M. Differential coactivation in a redundant signals task with weak and strong go/no-go stimuli. Q J Exp Psychol (Hove) 2018; 72:922-929. [PMID: 29642781] [DOI: 10.1177/1747021818772033]
Abstract
When participants respond to stimuli of two sources, response times (RTs) are often faster when both stimuli are presented together relative to the RTs obtained when presented separately (redundant signals effect [RSE]). Race models and coactivation models can explain the RSE. In race models, separate channels process the two stimulus components, and the faster processing time determines the overall RT. In audiovisual experiments, the RSE is often higher than predicted by race models, and coactivation models have been proposed that assume integrated processing of the two stimuli. Where does coactivation occur? We implemented a go/no-go task with randomly intermixed weak and strong auditory, visual, and audiovisual stimuli. In one experimental session, participants had to respond to strong stimuli and withhold their response to weak stimuli. In the other session, these roles were reversed. Interestingly, coactivation was only observed in the experimental session in which participants had to respond to strong stimuli. If weak stimuli served as targets, results were widely consistent with the race model prediction. The pattern of results contradicts the inverse effectiveness law. We present two models that explain the result in terms of absolute and relative thresholds.
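The race-model benchmark referred to in this abstract is commonly tested with Miller's inequality, F_AV(t) <= F_A(t) + F_V(t): if redundant-target responses merely reflect a race between separate channels, the redundant-signals CDF cannot exceed the summed unimodal CDFs at any time t. A minimal sketch with made-up reaction times (values are illustrative, not the paper's data):

```python
def ecdf(rts, t):
    """Empirical cumulative distribution of reaction times at time t."""
    return sum(rt <= t for rt in rts) / len(rts)

def race_violation(rt_av, rt_a, rt_v, t):
    """Positive result -> Miller's inequality F_AV(t) <= F_A(t) + F_V(t)
    is violated at time t, implying coactivation rather than a race."""
    return ecdf(rt_av, t) - min(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))

# Illustrative RTs (ms): audiovisual responses are fast enough that the
# inequality is violated at t = 210 ms.
rt_a = [250, 260, 270, 280]
rt_v = [255, 265, 275, 285]
rt_av = [200, 205, 210, 230]
print(race_violation(rt_av, rt_a, rt_v, 210))  # 0.75
```

In the study above, violations of this bound appeared only when strong stimuli were the targets; with weak targets the data stayed within the race-model prediction.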
Affiliation(s)
- Katsumi Minakata
- DTU Management Engineering, Technical University of Denmark, Kongens Lyngby, Denmark
- Matthias Gondan
- Department of Psychology, University of Copenhagen, Copenhagen, Denmark
12
Development of the Mechanisms Governing Midbrain Multisensory Integration. J Neurosci 2018; 38:3453-3465. [PMID: 29496891] [DOI: 10.1523/jneurosci.2631-17.2018]
Abstract
The ability to integrate information across multiple senses enhances the brain's ability to detect, localize, and identify external events. This process has been well documented in single neurons in the superior colliculus (SC), which synthesize concordant combinations of visual, auditory, and/or somatosensory signals to enhance the vigor of their responses. This increases the physiological salience of crossmodal events and, in turn, the speed and accuracy of SC-mediated behavioral responses to them. However, this capability is not an innate feature of the circuit and only develops postnatally after the animal acquires sufficient experience with covariant crossmodal events to form links between their modality-specific components. Of critical importance in this process are tectopetal influences from association cortex. Recent findings suggest that, despite its intuitive appeal, a simple generic associative rule cannot explain how this circuit develops its ability to integrate those crossmodal inputs to produce enhanced multisensory responses. The present neurocomputational model explains how this development can be understood as a transition from a default state in which crossmodal SC inputs interact competitively to one in which they interact cooperatively. Crucial to this transition is the operation of a learning rule requiring coactivation among tectopetal afferents for engagement. The model successfully replicates findings of multisensory development in normal cats and cats of either sex reared with special experience. In doing so, it explains how the cortico-SC projections can use crossmodal experience to craft the multisensory integration capabilities of the SC and adapt them to the environment in which they will be used.
SIGNIFICANCE STATEMENT: The brain's remarkable ability to integrate information across the senses is not present at birth, but typically develops in early life as experience with crossmodal cues is acquired. Recent empirical findings suggest that the mechanisms supporting this development must be more complex than previously believed. The present work integrates these data with what is already known about the underlying circuit in the midbrain to create and test a mechanistic model of multisensory development. This model represents a novel and comprehensive framework that explains how midbrain circuits acquire multisensory experience and reveals how disruptions in this neurotypic developmental trajectory yield divergent outcomes that will affect the multisensory processing capabilities of the mature brain.
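The learning rule described above, which requires coactivation among tectopetal afferents for engagement, can be sketched as a gated Hebbian update: weights change only when both modality-specific inputs are active together. The function below is a hypothetical toy illustrating the gating idea, not the model's actual rule:

```python
def update_weight(w: float, pre_v: float, pre_a: float, post: float,
                  lr: float = 0.05) -> float:
    """Hebbian weight update gated on coactivation of both afferents.

    Unisensory activity alone leaves the weight unchanged; only joint
    visual-auditory activity (paired with postsynaptic firing) potentiates.
    All names and values here are illustrative.
    """
    if pre_v > 0 and pre_a > 0:  # coactivation gate
        w += lr * post * (pre_v + pre_a)
    return w

w = 0.0
w = update_weight(w, pre_v=1.0, pre_a=0.0, post=1.0)  # unisensory: no change
w = update_weight(w, pre_v=1.0, pre_a=1.0, post=1.0)  # crossmodal: potentiates
print(w)  # 0.1
```

Such a gate captures why dark-reared or noise-reared animals, which lack covariant crossmodal events, fail to strengthen the cooperative cortico-SC pathway.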
13
Xu J, Bi T, Keniston L, Zhang J, Zhou X, Yu L. Deactivation of Association Cortices Disrupted the Congruence of Visual and Auditory Receptive Fields in Superior Colliculus Neurons. Cereb Cortex 2017; 27:5568-5578. [PMID: 27797831] [DOI: 10.1093/cercor/bhw324]
Abstract
Physiological and behavioral studies in cats show that corticotectal inputs play a critical role in the information-processing capabilities of neurons in the deeper layers of the superior colliculus (SC). Among them, the sensory inputs from functionally related associational cortices are especially critical for SC multisensory integration. However, the underlying mechanism supporting this influence is still unclear. Here, results demonstrate that deactivation of relevant cortices can both dislocate SC visual and auditory spatial receptive fields (RFs) and decrease their overall size, resulting in reduced alignment. Further analysis demonstrated that this RF separation is significantly correlated with the decrement of neurons' multisensory enhancement and is most pronounced in low stimulus intensity conditions. In addition, cortical deactivation could influence the degree of stimulus effectiveness, thereby illustrating the means by which higher order cortices may modify the multisensory activity of SC.
Affiliation(s)
- Jinghong Xu
- Key Laboratory of Brain Functional Genomics (Ministry of Education and Shanghai), School of Life Science, East China Normal University, Shanghai, 200062, China
- Tingting Bi
- Key Laboratory of Brain Functional Genomics (Ministry of Education and Shanghai), School of Life Science, East China Normal University, Shanghai, 200062, China
- Les Keniston
- Department of Physical Therapy, University of Maryland Eastern Shore, Princess Anne, MD 21853, USA
- Jiping Zhang
- Key Laboratory of Brain Functional Genomics (Ministry of Education and Shanghai), School of Life Science, East China Normal University, Shanghai, 200062, China
- Xiaoming Zhou
- Key Laboratory of Brain Functional Genomics (Ministry of Education and Shanghai), School of Life Science, East China Normal University, Shanghai, 200062, China; Collaborative Innovation Center for Brain Science, East China Normal University, Shanghai 200062, China
- Liping Yu
- Key Laboratory of Brain Functional Genomics (Ministry of Education and Shanghai), School of Life Science, East China Normal University, Shanghai, 200062, China
14
Bremen P, Massoudi R, Van Wanrooij MM, Van Opstal AJ. Audio-Visual Integration in a Redundant Target Paradigm: A Comparison between Rhesus Macaque and Man. Front Syst Neurosci 2017; 11:89. [PMID: 29238295] [PMCID: PMC5712580] [DOI: 10.3389/fnsys.2017.00089]
Abstract
The mechanisms underlying multi-sensory interactions are still poorly understood despite considerable progress made since the first neurophysiological recordings of multi-sensory neurons. While the majority of single-cell neurophysiology has been performed in anesthetized or passive-awake laboratory animals, the vast majority of behavioral data stems from studies with human subjects. Interpretation of neurophysiological data implicitly assumes that laboratory animals exhibit perceptual phenomena comparable or identical to those observed in human subjects. To explicitly test this underlying assumption, we here characterized how two rhesus macaques and four humans detect changes in intensity of auditory, visual, and audio-visual stimuli. These intensity changes consisted of a gradual envelope modulation for the sound, and a luminance step for the LED. Subjects had to detect any perceived intensity change as fast as possible. By comparing the monkeys' results with those obtained from the human subjects we found that (1) unimodal reaction times differed across modality, acoustic modulation frequency, and species, (2) the largest facilitation of reaction times with the audio-visual stimuli was observed when stimulus onset asynchronies were such that the unimodal reactions would occur at the same time (response, rather than physical synchrony), and (3) the largest audio-visual reaction-time facilitation was observed when unimodal auditory stimuli were difficult to detect, i.e., at slow unimodal reaction times. We conclude that despite marked unimodal heterogeneity, similar multisensory rules applied to both species. Single-cell neurophysiology in the rhesus macaque may therefore yield valuable insights into the mechanisms governing audio-visual integration that may be informative of the processes taking place in the human brain.
Affiliation(s)
- Peter Bremen: Department of Biophysics, Donders Institute for Brain, Cognition, and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands; Department of Neuroscience, Erasmus Medical Center, Rotterdam, Netherlands
- Rooholla Massoudi: Department of Biophysics, Donders Institute for Brain, Cognition, and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands; Department of Physiology, Development and Neuroscience, University of Cambridge, Cambridge, United Kingdom
- Marc M Van Wanrooij: Department of Biophysics, Donders Institute for Brain, Cognition, and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
- A J Van Opstal: Department of Biophysics, Donders Institute for Brain, Cognition, and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
15
The normal environment delays the development of multisensory integration. Sci Rep 2017; 7:4772. [PMID: 28684852] [PMCID: PMC5500544] [DOI: 10.1038/s41598-017-05118-1]
Abstract
Multisensory neurons in animals whose cross-modal experiences are compromised during early life fail to develop the ability to integrate information across those senses. Consequently, they lack the ability to increase the physiological salience of the events that provide the convergent cross-modal inputs. The present study demonstrates that superior colliculus (SC) neurons in animals whose visual-auditory experience is compromised early in life by noise-rearing can develop visual-auditory multisensory integration capabilities rapidly when periodically exposed to a single set of visual-auditory stimuli in a controlled laboratory paradigm. However, they remain compromised if their experiences are limited to a normal housing environment. These observations seem counterintuitive given that multisensory integrative capabilities ordinarily develop during early life in normal environments, in which a wide variety of sensory stimuli facilitate the functional organization of complex neural circuits at multiple levels of the neuraxis. However, the very richness and inherent variability of sensory stimuli in normal environments will lead to a less regular coupling of any given set of cross-modal cues than does the otherwise "impoverished" laboratory exposure paradigm. That this poses no significant problem for the neonate, but does for the adult, indicates a maturational shift in the requirements for the development of multisensory integration capabilities.
16
Rowland BA, Stanford TR, Stein BE. A Model of the Neural Mechanisms Underlying Multisensory Integration in the Superior Colliculus. Perception 2016; 36:1431-43. [DOI: 10.1068/p5842]
Abstract
Much of the information about multisensory integration is derived from studies of the cat superior colliculus (SC), a midbrain structure involved in orientation behaviors. This integration is apparent in the enhanced responses of SC neurons to cross-modal stimuli, responses that exceed those to any of the modality-specific component stimuli. The simplest model of multisensory integration is one in which the SC neuron simply sums its various sensory inputs. However, a number of empirical findings reveal the inadequacy of such a model; for example, the finding that deactivation of cortico-collicular inputs eliminates the enhanced response to a cross-modal stimulus without eliminating responses to the modality-specific component stimuli. These and other empirical findings inform a computational model that accounts for all of the most fundamental aspects of SC multisensory integration. The model is presented in two forms: an algebraic form that conveys the essential insights, and a compartmental form that represents the neuronal computations in a more biologically realistic way.
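The response enhancement such a model must reproduce is conventionally quantified against the most effective unisensory response (the Meredith and Stein enhancement index). A minimal sketch, with illustrative firing rates rather than values from the paper:

```python
def multisensory_enhancement(cm, unisensory):
    """Percent multisensory enhancement: ME = (CM - SM_max) / SM_max * 100,
    where CM is the cross-modal response and SM_max the most effective
    unisensory response."""
    sm_max = max(unisensory)
    return (cm - sm_max) / sm_max * 100.0

# Illustrative mean responses (impulses per trial):
enhanced = multisensory_enhancement(15.0, [5.0, 10.0])   # exceeds best unisensory
matched  = multisensory_enhancement(10.0, [5.0, 10.0])   # no enhancement
```

A simple summation model predicts CM = 15 from components of 5 and 10, but, as the abstract notes, cortical deactivation can abolish the enhancement while sparing the unisensory responses, which a pure summation scheme cannot capture.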
Affiliation(s)
- Benjamin A Rowland: Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Medical Center Blvd, Winston-Salem, NC 27157, USA
- Terrence R Stanford: Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Medical Center Blvd, Winston-Salem, NC 27157, USA
- Barry E Stein: Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Medical Center Blvd, Winston-Salem, NC 27157, USA
17
Yu L, Xu J, Rowland BA, Stein BE. Multisensory Plasticity in Superior Colliculus Neurons is Mediated by Association Cortex. Cereb Cortex 2014; 26:1130-7. [PMID: 25552270] [DOI: 10.1093/cercor/bhu295]
Abstract
The ability to integrate information from different senses, and thereby facilitate detecting and localizing events, normally develops gradually in cat superior colliculus (SC) neurons as experience with cross-modal events is acquired. Here, we demonstrate that the portal for this experience-based change is association cortex. Unilaterally deactivating this cortex whenever visual-auditory events were present resulted in the failure of ipsilateral SC neurons to develop the ability to integrate those cross-modal inputs, even though they retained the ability to respond to them. In contrast, their counterparts in the opposite SC developed this capacity normally. The deficits were eliminated by providing cross-modal experience when cortex was active. These observations underscore the collaborative developmental processes that take place among different levels of the neuraxis to adapt the brain's multisensory (and sensorimotor) circuits to the environment in which they will be used.
Affiliation(s)
- Liping Yu: Key Laboratory of Brain Functional Genomics (East China Normal University), Ministry of Education, Shanghai Key Laboratory of Brain Functional Genomics (East China Normal University), School of Life Sciences, East China Normal University, Shanghai 200062, China
- Jinghong Xu: Key Laboratory of Brain Functional Genomics (East China Normal University), Ministry of Education, Shanghai Key Laboratory of Brain Functional Genomics (East China Normal University), School of Life Sciences, East China Normal University, Shanghai 200062, China
- Benjamin A Rowland: Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Winston-Salem, NC 27157, USA
- Barry E Stein: Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Winston-Salem, NC 27157, USA
18
Stein BE, Stanford TR, Rowland BA. Development of multisensory integration from the perspective of the individual neuron. Nat Rev Neurosci 2014; 15:520-35. [PMID: 25158358] [DOI: 10.1038/nrn3742]
Abstract
The ability to use cues from multiple senses in concert is a fundamental aspect of brain function. It maximizes the brain’s use of the information available to it at any given moment and enhances the physiological salience of external events. Because each sense conveys a unique perspective of the external world, synthesizing information across senses affords computational benefits that cannot otherwise be achieved. Multisensory integration not only has substantial survival value but can also create unique experiences that emerge when signals from different sensory channels are bound together. However, neurons in a newborn’s brain are not capable of multisensory integration, and studies in the midbrain have shown that the development of this process is not predetermined. Rather, its emergence and maturation critically depend on cross-modal experiences that alter the underlying neural circuit in such a way that optimizes multisensory integrative capabilities for the environment in which the animal will function.
19
Abstract
Detecting and locating environmental events are markedly enhanced by the midbrain's ability to integrate visual and auditory cues. Its capacity for multisensory integration develops in cats 1-4 months after birth but only after acquiring extensive visual-auditory experience. However, briefly deactivating specific regions of association cortex during this period induced long-term disruption of this maturational process, such that even 1 year later animals were unable to integrate visual and auditory cues to enhance their behavioral performance. The data from this animal model reveal a window of sensitivity within which association cortex mediates the encoding of cross-modal experience in the midbrain. Surprisingly, however, 3 years later, and without any additional intervention, the capacity appeared fully developed. This suggests that, although sensitivity degrades with age, the potential for acquiring or modifying multisensory integration capabilities extends well into adulthood.
20
Cuppini C, Magosso E, Rowland B, Stein B, Ursino M. Hebbian mechanisms help explain development of multisensory integration in the superior colliculus: a neural network model. Biol Cybern 2012; 106:691-713. [PMID: 23011260] [PMCID: PMC3552306] [DOI: 10.1007/s00422-012-0511-9]
Abstract
The superior colliculus (SC) integrates relevant sensory information (visual, auditory, somatosensory) from several cortical and subcortical structures to program orientation responses to external events. However, this capacity is not present at birth and is acquired only through interactions with cross-modal events during maturation. Mathematical models provide a quantitative framework that is valuable in helping to clarify the specific neural mechanisms underlying the maturation of multisensory integration in the SC. We extended a neural network model of the adult SC (Cuppini et al., Front Integr Neurosci 4:1-15, 2010) to describe the development of this phenomenon starting from an immature state, based on known or suspected anatomy and physiology, in which (1) AES afferents are present but weak, (2) responses are driven by non-AES afferents, and (3) the visual inputs have only marginal spatial tuning. Sensory experience was modeled by repeatedly presenting modality-specific and cross-modal stimuli. Synapses in the network were modified by simple Hebbian learning rules. As a consequence of this exposure, (1) receptive fields shrink and come into spatial register, and (2) SC neurons gain the characteristic integrative properties of the adult: enhancement, depression, and inverse effectiveness. Importantly, the unique architecture of the model guided development so that integration became dependent on the relationship between the cortical input and the SC. Manipulating the statistics of experience during development changed the integrative profiles of the neurons, and the results matched those of physiological studies well.
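As a toy illustration of the Hebbian principle invoked here (a deliberately reduced sketch, not the published network): two afferents that always co-occur, standing in for paired visual and auditory inputs, come to dominate a third, uncorrelated afferent on a single model neuron.

```python
import numpy as np

w = np.array([0.5, 0.5, 0.5])   # visual, auditory, and uncorrelated afferent weights
eta = 0.05                      # learning rate (arbitrary)

for _ in range(500):
    x = np.array([1.0, 1.0, 0.0])   # visual and auditory cues always co-occur
    y = float(w @ x)                # postsynaptic response
    w += eta * y * x                # Hebbian rule: delta_w proportional to pre * post
    w /= np.linalg.norm(w)          # normalization keeps total weight bounded

# Coactive cross-modal afferents are strengthened at the expense of the
# unpaired one, mimicking experience-dependent refinement.
```

The normalization step is what turns simple correlation-driven growth into competition between afferents, so the unpaired input decays toward zero rather than merely growing more slowly.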
Affiliation(s)
- C Cuppini: Department of Electronics, Computer Science and Systems, University of Bologna, Bologna, Italy
21
Perrault T, Rowland B, Stein B. The Organization and Plasticity of Multisensory Integration in the Midbrain. Front Neurosci 2011. [DOI: 10.1201/b11092-20]
22
Perrault T, Rowland B, Stein B. The Organization and Plasticity of Multisensory Integration in the Midbrain. Front Neurosci 2011. [DOI: 10.1201/9781439812174-20]
23
Cuppini C, Magosso E, Ursino M. Organization, maturation, and plasticity of multisensory integration: insights from computational modeling studies. Front Psychol 2011; 2:77. [PMID: 21687448] [PMCID: PMC3110383] [DOI: 10.3389/fpsyg.2011.00077]
Abstract
In this paper, we present two neural network models, devoted to two specific and widely investigated aspects of multisensory integration, to demonstrate the potential of computational models for gaining insight into the neural mechanisms underlying the organization, development, and plasticity of multisensory integration in the brain. The first model considers visual-auditory interaction in a midbrain structure, the superior colliculus (SC). It reproduces and explains the main physiological features of multisensory integration in SC neurons and describes how the SC's integrative capability, which is not present at birth, develops gradually during postnatal life depending on sensory experience with cross-modal stimuli. The second model tackles the problem of how tactile stimuli on a body part and visual (or auditory) stimuli close to the same body part are integrated in multimodal parietal neurons to form the perception of peripersonal (i.e., near) space. It investigates how the extension of peripersonal space, where multimodal integration occurs, may be modified by experience, such as the use of a tool to interact with far space. The utility of the modeling approach rests on several aspects: (i) the two models, although devoted to different problems and simulating different brain regions, share common mechanisms (lateral inhibition and excitation, nonlinear neuron characteristics, recurrent connections, competition, and Hebbian rules of potentiation and depression) that may govern more generally the fusion of senses in the brain and the learning and plasticity of multisensory integration; (ii) the models may help interpret behavioral and psychophysical responses in terms of neural activity and synaptic connections; (iii) the models can make testable predictions that help guide future experiments aimed at validating, rejecting, or modifying the main assumptions.
Affiliation(s)
- Cristiano Cuppini: Department of Electronics, Computer Science and Systems, University of Bologna, Bologna, Italy
24
Stein BE, Rowland BA. Organization and plasticity in multisensory integration: early and late experience affects its governing principles. Prog Brain Res 2011; 191:145-63. [PMID: 21741550] [DOI: 10.1016/b978-0-444-53752-2.00007-2]
Abstract
Neurons in the midbrain superior colliculus (SC) have the ability to integrate information from different senses to profoundly increase their sensitivity to external events. This not only enhances an organism's ability to detect and localize these events, but also to program appropriate motor responses to them. The survival value of this process of multisensory integration is self-evident, and its physiological and behavioral manifestations have been studied extensively in adult and developing cats and monkeys. These studies have revealed that, contrary to expectations based on some developmental theories, this process is not present in the newborn's brain. The data show that it is acquired only gradually during postnatal life as a consequence of at least two factors: the maturation of cooperative interactions between association cortex and the SC, and extensive experience with cross-modal cues. Using these factors, the brain is able to craft the underlying neural circuits and the fundamental principles that govern multisensory integration so that they are adapted to the ecological circumstances in which they will be used.
Affiliation(s)
- Barry E Stein: Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Winston-Salem, North Carolina, USA
25
Sound-induced enhancement of low-intensity vision: multisensory influences on human sensory-specific cortices and thalamic bodies relate to perceptual enhancement of visual detection sensitivity. J Neurosci 2010; 30:13609-23. [PMID: 20943902] [DOI: 10.1523/jneurosci.4524-09.2010]
Abstract
Combining information across modalities can affect sensory performance. We studied how co-occurring sounds modulate behavioral visual detection sensitivity (d'), and neural responses, for visual stimuli of higher or lower intensity. Co-occurrence of a sound enhanced human detection sensitivity for lower- but not higher-intensity visual targets. Functional magnetic resonance imaging (fMRI) linked this to boosts in activity-levels for sensory-specific visual and auditory cortex, plus multisensory superior temporal sulcus (STS), specifically for a lower-intensity visual event when paired with a sound. Thalamic structures in visual and auditory pathways, the lateral and medial geniculate bodies, respectively (LGB, MGB), showed a similar pattern. Subject-by-subject psychophysical benefits correlated with corresponding fMRI signals in visual, auditory, and multisensory regions. We also analyzed differential "coupling" patterns of LGB and MGB with other regions in the different experimental conditions. Effective-connectivity analyses showed enhanced coupling of sensory-specific thalamic bodies with the affected cortical sites during enhanced detection of lower-intensity visual events paired with sounds. Coupling strength between visual and auditory thalamus with cortical regions, including STS, covaried parametrically with the psychophysical benefit for this specific multisensory context. Our results indicate that multisensory enhancement of detection sensitivity for low-contrast visual stimuli by co-occurring sounds reflects a brain network involving not only established multisensory STS and sensory-specific cortex but also visual and auditory thalamus.
26
Cuppini C, Ursino M, Magosso E, Rowland BA, Stein BE. An emergent model of multisensory integration in superior colliculus neurons. Front Integr Neurosci 2010; 4:6. [PMID: 20431725] [PMCID: PMC2861478] [DOI: 10.3389/fnint.2010.00006]
Abstract
Neurons in the cat superior colliculus (SC) integrate information from different senses to enhance their responses to cross-modal stimuli. These multisensory SC neurons receive multiple converging unisensory inputs from many sources; those received from association cortex are critical for the manifestation of multisensory integration. The mechanisms underlying this characteristic property of SC neurons are not completely understood, but they can be clarified with the use of mathematical models and computer simulations. The objective of the current effort was thus to present a plausible model that can explain the main physiological features of multisensory integration, based on the current neurological literature regarding the influences the SC receives from cortical and subcortical sources. The model assumes competitive mechanisms between inputs and nonlinearities in NMDA receptor responses, and provides a priori synaptic weights to mimic the normal responses of SC neurons. As a result, it provides a basis for understanding the dependence of multisensory enhancement on an intact association cortex, and it simulates the changes in the SC response that occur during NMDA receptor blockade. Finally, it makes testable predictions about why significant response differences are obtained in multisensory SC neurons when they are confronted with pairs of cross-modal and within-modal stimuli. By postulating plausible biological mechanisms to complement those that are already known, the model provides a basis for understanding how SC neurons are capable of engaging in this remarkable process.
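One way to see why a nonlinear (e.g., NMDA-like) input-output relation matters: with a generic sigmoid (parameters arbitrary, chosen only to expose the effect, not taken from the paper), weak inputs combine superadditively while strong inputs saturate, which is the signature of inverse effectiveness.

```python
import math

def sigmoid(u, theta=4.0, slope=1.0):
    """Generic sigmoidal input-output relation, a stand-in for the
    nonlinear response function assumed in models of this kind."""
    return 1.0 / (1.0 + math.exp(-slope * (u - theta)))

# Weak inputs: the combined response exceeds the sum of the unisensory ones.
weak_uni  = sigmoid(1.5)
weak_comb = sigmoid(1.5 + 1.5)
superadditive = weak_comb > 2 * weak_uni

# Strong inputs: the response saturates, so the combination is subadditive.
strong_uni  = sigmoid(4.0)
strong_comb = sigmoid(4.0 + 4.0)
subadditive = strong_comb < 2 * strong_uni
```

Operating near threshold amplifies combined weak inputs disproportionately; near saturation there is little room left for enhancement.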
Affiliation(s)
- Cristiano Cuppini: Department of Electronics, Computer Science and Systems, University of Bologna, Bologna, Italy
27
Stein BE, Stanford TR, Rowland BA. The neural basis of multisensory integration in the midbrain: its organization and maturation. Hear Res 2009; 258:4-15. [PMID: 19345256] [PMCID: PMC2787841] [DOI: 10.1016/j.heares.2009.03.012]
Abstract
Multisensory integration describes a process by which information from different sensory systems is combined to influence perception, decisions, and overt behavior. Despite a widespread appreciation of its utility in the adult, its developmental antecedents have received relatively little attention. Here we review what is known about the development of multisensory integration, with a focus on the circuitry and experiential antecedents of its development in the model system of the multisensory (i.e., deep) layers of the superior colliculus. Of particular interest here are two sets of experimental observations: (1) cortical influences appear essential for multisensory integration in the SC, and (2) postnatal experience guides its maturation. The current belief is that the experience normally gained during early life is instantiated in the cortico-SC projection, and that this is the primary route by which ecological pressures adapt SC multisensory integration to the particular environment in which it will be used.
Affiliation(s)
- Barry E Stein: Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Medical Center Blvd., Winston-Salem, NC 27157-1010, USA
28
Stein BE, Perrault TJ, Stanford TR, Rowland BA. Postnatal experiences influence how the brain integrates information from different senses. Front Integr Neurosci 2009; 3:21. [PMID: 19838323] [PMCID: PMC2762369] [DOI: 10.3389/neuro.07.021.2009]
Abstract
Sensory processing disorder (SPD) is characterized by anomalous reactions to, and integration of, sensory cues. Although the underlying etiology of SPD is unknown, one brain region likely to reflect these sensory and behavioral anomalies is the superior colliculus (SC), a structure involved in the synthesis of information from multiple sensory modalities and the control of overt orientation responses. In the present review we describe normal functional properties of this structure, the manner in which its individual neurons integrate cues from different senses, and the overt SC-mediated behaviors that are believed to manifest this “multisensory integration.” Of particular interest here is how SC neurons develop their capacity to engage in multisensory integration during early postnatal life as a consequence of early sensory experience, and the intimate communication between cortex and the midbrain that makes this developmental process possible.
Affiliation(s)
- Barry E Stein: Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Winston-Salem, NC, USA
29
Multisensory integration in the superior colliculus requires synergy among corticocollicular inputs. J Neurosci 2009; 29:6580-92. [PMID: 19458228] [DOI: 10.1523/jneurosci.0525-09.2009]
Abstract
Influences from the visual (AEV), auditory (FAES), and somatosensory (SIV) divisions of the cat anterior ectosylvian sulcus (AES) play a critical role in rendering superior colliculus (SC) neurons capable of multisensory integration. However, it is not known whether this is accomplished via their independent sensory-specific action or via some cross-modal cooperative action that emerges as a consequence of their convergence on SC neurons. Using visual-auditory SC neurons as a model, we examined how selective and combined deactivation of FAES and AEV affected SC multisensory (visual-auditory) and unisensory (visual-visual) integration capabilities. As noted earlier, multisensory integration yielded SC responses that were significantly greater than those evoked by the most effective individual component stimulus. This multisensory "response enhancement" was more evident when the component stimuli were weakly effective. Conversely, unisensory integration was dominated by the lack of response enhancement. During cryogenic deactivation of FAES and/or AEV, the unisensory responses of SC neurons were only modestly affected; however, their multisensory response enhancement showed a significant downward shift and was eliminated. The shift was similar in magnitude for deactivation of either AES subregion and, in general, only marginally greater when both were deactivated simultaneously. These data reveal that SC multisensory integration is dependent on the cooperative action of distinct subsets of unisensory corticofugal afferents, afferents whose sensory combination matches the multisensory profile of their midbrain target neurons, and whose functional synergy is specific to rendering SC neurons capable of synthesizing information from those particular senses.
30
Pluta SR, Kawasaki M. Multisensory enhancement of electromotor responses to a single moving object. J Exp Biol 2008; 211:2919-30. [DOI: 10.1242/jeb.016154]
Abstract
SUMMARY
Weakly electric fish possess three cutaneous sensory organs structured in arrays with overlapping receptive fields. Theoretically, these tuberous electrosensory, ampullary electrosensory, and mechanosensory lateral line receptors receive spatiotemporally congruent stimulation in the presence of a moving object. The current study is the first to quantify the magnitude of multisensory enhancement across these mechanosensory and electrosensory systems during moving-object recognition. We used the novelty response of a pulse-type weakly electric fish to quantitatively compare multisensory responses to their component unisensory responses. Principally, we discovered that multisensory novelty responses are significantly larger than their arithmetically summed component unisensory responses. Additionally, multimodal stimulation yielded a significant increase in novelty response amplitude, probability, and the rate of a high-frequency burst known as a 'scallop'. Supralinear multisensory enhancement of the novelty response may signify an augmentation of perception driven by the ecological significance of multimodal stimuli. Scalloping may function as a sensory scan aimed at rapidly facilitating the electrolocation of novel stimuli.
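The comparison used here, multisensory response versus the arithmetic sum of the unisensory responses, can be written as a simple additivity index (the example amplitudes are hypothetical, not data from this study):

```python
def additivity_index(multi, uni_components):
    """Percent difference between the multisensory response and the
    arithmetic sum of the unisensory responses; positive values indicate
    supralinear (superadditive) enhancement."""
    s = sum(uni_components)
    return (multi - s) / s * 100.0

supralinear = additivity_index(12.0, [4.0, 4.0])  # response exceeds the sum
linear      = additivity_index(8.0,  [4.0, 4.0])  # exactly additive
```

Note that this is a stricter criterion than comparing against the best single unisensory response: a multisensory response can exceed every component yet still be subadditive relative to their sum.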
Affiliation(s)
- Scott R. Pluta: Department of Biology, University of Virginia, Charlottesville, VA 22904, USA
- Masashi Kawasaki: Department of Biology, University of Virginia, Charlottesville, VA 22904, USA
31
Abstract
This chapter reviews several highly convergent behavioral findings that provide strong evidence for the existence of multimodal integration systems subserving spatial representation in humans. These systems generally function through the multisensory coding of visuoauditory and visuotactile events but vary in their specific functional and anatomical characteristics. The chapter will also consider the adaptive advantages of multisensory integration systems; these systems might modulate the level of activation in cortical areas in short- and long-term ways, thereby providing a mechanism for permanent recovery from sensory and spatial deficits.
Affiliation(s)
- Elisabetta Làdavas: Dipartimento di Psicologia, Università di Bologna, 40127 Bologna, Italy
32
Fuentes-Santamaria V, McHaffie JG, Stein BE. Maturation of multisensory integration in the superior colliculus: expression of nitric oxide synthase and neurofilament SMI-32. Brain Res 2008; 1242:45-53. [PMID: 18486108] [DOI: 10.1016/j.brainres.2008.03.073]
Abstract
Nitric oxide (NO)-containing (nitrergic) interneurons are well positioned to convey the cortical influences that are crucial for multisensory integration in superior colliculus (SC) output neurons. However, it is not known whether nitrergic interneurons are in this position early in life and might, therefore, also play a role in the functional maturation of this circuit. In the present study, we investigated the postnatal developmental relationship between these two populations of neurons using beta-nicotinamide adenine dinucleotide phosphate-diaphorase (NADPH-d) histochemistry and SMI-32 immunocytochemistry to label presumptive interneurons and output neurons, respectively. SMI-32-immunostained neurons proved to mature slowly, retaining immature anatomical features until approximately 8 postnatal weeks. In contrast, nitrergic interneurons developed more rapidly: they had achieved their adult-like anatomy by 4 postnatal weeks and were in a position to influence the dendritic elaboration of output neurons. It is this dendritic substrate through which much of the cortico-collicular influence is expressed. Double-labeling experiments showed that the dendritic and axonal processes of nitrergic interneurons already apposed the somata and dendrites of SMI-32-labeled neurons even at the earliest age examined. The results suggest that nitrergic interneurons play a role in refining the cortico-collicular projection patterns that are believed to be essential for SC output neurons to engage in multisensory integration and to support normal orientation responses to cross-modal stimuli.
Affiliation(s)
- Veronica Fuentes-Santamaria: Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Winston-Salem, NC 27157, USA
33
Driver J, Noesselt T. Multisensory interplay reveals crossmodal influences on 'sensory-specific' brain regions, neural responses, and judgments. Neuron 2008; 57:11-23. [PMID: 18184561] [PMCID: PMC2427054] [DOI: 10.1016/j.neuron.2007.12.013]
Abstract
Although much traditional sensory research has studied each sensory modality in isolation, there has been a recent explosion of interest in causal interplay between different senses. Various techniques have now identified numerous multisensory convergence zones in the brain. Some convergence may arise surprisingly close to low-level sensory-specific cortex, and some direct connections may exist even between primary sensory cortices. A variety of multisensory phenomena have now been reported in which sensory-specific brain responses and perceptual judgments concerning one sense can be affected by relations with other senses. We survey recent progress in this multisensory field, foregrounding human studies against the background of invasive animal work and highlighting possible underlying mechanisms. These include rapid feedforward integration, possible thalamic influences, and/or feedback from multisensory regions to sensory-specific brain areas. Multisensory interplay is more prevalent than classic modular approaches assumed, and new methods are now available to determine the underlying circuits.
Affiliation(s)
- Jon Driver
- UCL Institute of Cognitive Neuroscience, University College London, 17 Queen Square, London WC1N 3AR, UK.
|
34
|
Abstract
Converging cortical influences from the anterior ectosylvian sulcus and the rostral lateral suprasylvian sulcus were shown to have a multisensory-specific role in the integration of sensory information in superior colliculus (SC) neurons. These observations were based on changes induced by cryogenic deactivation of these cortico-SC projections. Thus, although the results indicated that these projections played a critical role in integrating SC responses to stimuli derived from different senses (i.e., visual-auditory), they played no role in synthesizing responses to stimuli derived from within the same sense (visual-visual). This was evident even in the same multisensory neurons. The results suggest that very different neural circuits have evolved to code combinations of cross-modal and within-modal stimuli in the SC, and that the differences in multisensory and unisensory integration are likely caused by differences in the configuration of each neuron's functional inputs rather than by any inherent differences among the neurons themselves. The specificity of these descending influences was also apparent in the very different ways in which they affected responses to the component cross-modal stimuli and their actual integration. Furthermore, they appeared to target only multisensory neurons and not their unisensory neighbors.
|
35
|
Leo F, Bertini C, di Pellegrino G, Làdavas E. Multisensory integration for orienting responses in humans requires the activation of the superior colliculus. Exp Brain Res 2007; 186:67-77. [PMID: 18008066 DOI: 10.1007/s00221-007-1204-9] [Citation(s) in RCA: 38] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/21/2007] [Accepted: 10/30/2007] [Indexed: 11/25/2022]
Abstract
Animal studies have shown that the superior colliculus (SC) is important for synthesising information from multiple senses into a unified map of space. Here, we tested whether the SC is a critical neural substrate for multisensory spatial integration in humans. To do so, we took advantage of neurophysiological findings revealing that the SC does not receive direct projections from short-wavelength-sensitive S cones. In a simple reaction-time task, participants responded more quickly to concurrent peripheral (extra-foveal) audiovisual (AV) stimuli than to an auditory or visual stimulus alone, a phenomenon known as the redundant target effect (RTE). We show that the nature of this RTE depended on the colour of the visual stimulus. When we used purple short-wavelength stimuli, to which the SC is blind, the RTE was simply explained by probability summation, indicating that the redundant auditory and visual channels are independent. Conversely, with red long-wavelength stimuli, visible to the SC, the RTE was related to nonlinear neural summation, which constitutes evidence of integration of different sensory information. We also demonstrate that when AV stimuli were presented at fixation, so that the spatial orienting component of the task was reduced, neural summation was possible regardless of stimulus colour. Together, these findings support a pivotal role of the SC in mediating multisensory spatial integration in humans when behaviour involves spatial orienting responses.
Affiliation(s)
- Fabrizio Leo
- Centro Studi e Ricerche in Neuroscienze Cognitive, Cesena, Italy
|