1. Miller LJ, Marco EJ, Chu RC, Camarata S. Editorial: Sensory Processing Across the Lifespan: A 25-Year Initiative to Understand Neurophysiology, Behaviors, and Treatment Effectiveness for Sensory Processing. Front Integr Neurosci 2021; 15:652218. PMID: 33897385; PMCID: PMC8063042; DOI: 10.3389/fnint.2021.652218.
Affiliation(s)
- Lucy Jane Miller
- Department of Pediatrics (Emeritus), University of Colorado, Denver, CO, United States; Sensory Therapies and Research Institute for Sensory Processing Disorder, Centennial, CO, United States
- Elysa J Marco
- Cortica, San Diego, CA, United States
- Robyn C Chu
- Radiology & Biomedical Imaging, University of California San Francisco, San Francisco, CA, United States; Growing Healthy Children Therapy Services, Rescue, CA, United States
- Stephen Camarata
- School of Medicine, Vanderbilt University, Nashville, TN, United States
2. Differential Rapid Plasticity in Auditory and Visual Responses in the Primarily Multisensory Orbitofrontal Cortex. eNeuro 2020; 7:ENEURO.0061-20.2020. PMID: 32424057; PMCID: PMC7294472; DOI: 10.1523/eneuro.0061-20.2020.
Abstract
Given the connectivity of orbitofrontal cortex (OFC) with the sensory areas and areas involved in goal execution, it is likely that OFC, along with its function in reward processing, also has a role to play in perception-based multisensory decision-making. To understand mechanisms involved in multisensory decision-making, it is important to first know the encoding of different sensory stimuli in single neurons of the mouse OFC. Ruling out effects of behavioral state, memory, and other factors, we studied responses of the anesthetized mouse OFC to auditory, visual, and audiovisual (multisensory) stimuli, as well as multisensory associations and the sensory-driven input organization of the OFC. Almost all OFC single neurons were found to be multisensory in nature, with sublinear to supralinear integration of the component unisensory stimuli. With a novel multisensory oddball stimulus set, we show that the OFC receives both unisensory and multisensory inputs, further corroborated by retrograde tracers showing labeling in secondary auditory and visual cortices, which we find also exhibit similar multisensory integration and responses. With long audiovisual pairing/association, we show rapid plasticity in OFC single neurons, with a strong visual bias, leading to a strong depression of auditory responses and an effective enhancement of visual responses. Such rapid, multisensory-association-driven plasticity is absent in the auditory and visual cortices, suggesting its emergence in the OFC. Based on these results, we propose a hypothetical local circuit model in the OFC that integrates auditory and visual information and participates in computing stimulus value in dynamic multisensory environments.
3. Colonius H, Diederich A. Formal models and quantitative measures of multisensory integration: a selective overview. Eur J Neurosci 2020; 51:1161-1178. DOI: 10.1111/ejn.13813.
Affiliation(s)
- Hans Colonius
- Department of Psychology, Carl von Ossietzky Universität Oldenburg, 26111 Oldenburg, Germany
- Department of Psychological Sciences, Purdue University, West Lafayette, IN, USA
- Adele Diederich
- Department of Psychological Sciences, Purdue University, West Lafayette, IN, USA
- Life Sciences and Chemistry, Jacobs University Bremen, Bremen, Germany
4. Modified Origins of Cortical Projections to the Superior Colliculus in the Deaf: Dispersion of Auditory Efferents. J Neurosci 2018; 38:4048-4058. PMID: 29610441; DOI: 10.1523/jneurosci.2858-17.2018.
Abstract
Following the loss of a sensory modality, such as deafness or blindness, crossmodal plasticity is commonly identified in regions of the cerebrum that normally process the deprived modality. It has been hypothesized that significant changes in the patterns of cortical afferent and efferent projections may underlie these functional crossmodal changes. However, studies of thalamocortical and corticocortical connections have refuted this hypothesis, instead revealing a profound resilience of cortical afferent projections following deafness and blindness. This report is the first study of cortical outputs following sensory deprivation, characterizing cortical projections to the superior colliculus in mature cats (N = 5, 3 female) with perinatal-onset deafness. The superior colliculus was exposed to a retrograde pathway tracer, and subsequently labeled cells throughout the cerebrum were identified and quantified. Overall, the percentage of cortical projections arising from auditory cortex was substantially increased, not decreased, in early-deaf cats compared with intact animals. Furthermore, the distribution of labeled cortical neurons was no longer localized to a particular subregion of auditory cortex but was dispersed across auditory cortical regions. Collectively, these results demonstrate that, although patterns of cortical afferents are stable following perinatal deafness, the patterns of cortical efferents to the superior colliculus are highly mutable.
SIGNIFICANCE STATEMENT: When a sense is lost, the remaining senses are functionally enhanced through compensatory crossmodal plasticity. In deafness, brain regions that normally process sound contribute to enhanced visual and somatosensory perception. We demonstrate that hearing loss alters connectivity between sensory cortex and the superior colliculus, a midbrain region that integrates sensory representations to guide orientation behavior. Contrary to expectation, the proportion of projections from auditory cortex increased in deaf animals compared with hearing animals, with a broad distribution across auditory fields. This is the first description of changes in cortical efferents following sensory loss and provides support for models predicting an inability to form a coherent, multisensory percept of the environment following periods of abnormal development.
5. Development of the Mechanisms Governing Midbrain Multisensory Integration. J Neurosci 2018; 38:3453-3465. PMID: 29496891; DOI: 10.1523/jneurosci.2631-17.2018.
Abstract
The ability to integrate information across multiple senses enhances the brain's ability to detect, localize, and identify external events. This process has been well documented in single neurons in the superior colliculus (SC), which synthesize concordant combinations of visual, auditory, and/or somatosensory signals to enhance the vigor of their responses. This increases the physiological salience of crossmodal events and, in turn, the speed and accuracy of SC-mediated behavioral responses to them. However, this capability is not an innate feature of the circuit and only develops postnatally after the animal acquires sufficient experience with covariant crossmodal events to form links between their modality-specific components. Of critical importance in this process are tectopetal influences from association cortex. Recent findings suggest that, despite its intuitive appeal, a simple generic associative rule cannot explain how this circuit develops its ability to integrate those crossmodal inputs to produce enhanced multisensory responses. The present neurocomputational model explains how this development can be understood as a transition from a default state in which crossmodal SC inputs interact competitively to one in which they interact cooperatively. Crucial to this transition is the operation of a learning rule requiring coactivation among tectopetal afferents for engagement. The model successfully replicates findings of multisensory development in normally reared cats and in cats (of either sex) reared with special experience. In doing so, it explains how the cortico-SC projections can use crossmodal experience to craft the multisensory integration capabilities of the SC and adapt them to the environment in which they will be used.
SIGNIFICANCE STATEMENT: The brain's remarkable ability to integrate information across the senses is not present at birth, but typically develops in early life as experience with crossmodal cues is acquired. Recent empirical findings suggest that the mechanisms supporting this development must be more complex than previously believed. The present work integrates these data with what is already known about the underlying circuit in the midbrain to create and test a mechanistic model of multisensory development. This model represents a novel and comprehensive framework that explains how midbrain circuits acquire multisensory experience and reveals how disruptions in this neurotypic developmental trajectory yield divergent outcomes that will affect the multisensory processing capabilities of the mature brain.
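To make the coactivation requirement described above concrete, here is a minimal Python sketch of a Hebbian update that engages only when the visual and auditory cortico-collicular channels are active together; the two-weight toy network, the parameters, and the stimulus statistics are invented for illustration and are far simpler than the published neurocomputational model.

```python
import numpy as np

rng = np.random.default_rng(0)

def coactivation_gated_update(w, pre, post, eta=0.05):
    # Hebbian potentiation of the cortico-collicular weights that engages only
    # when the visual and auditory tectopetal channels are co-active.
    gate = float(np.all(pre > 0.1))
    return np.clip(w + eta * gate * post * pre, 0.0, 1.0)

def rear(crossmodal_experience, trials=300):
    w = np.array([0.05, 0.05])             # immature state: cortical inputs present but weak
    for _ in range(trials):
        if crossmodal_experience:
            pre = rng.uniform(0.5, 1.0, size=2)            # concordant visual-auditory event
        else:
            pre = np.zeros(2)
            pre[rng.integers(2)] = rng.uniform(0.5, 1.0)   # one modality at a time
        post = min(1.0, pre.sum())         # SC unit driven by whatever inputs arrive
        w = coactivation_gated_update(w, pre, post)
    return w

print("cross-modal rearing:", rear(True))    # weights grow: cooperative, integration-capable state
print("unisensory rearing :", rear(False))   # weights stay weak: integration never develops
```

Under cross-modal rearing the gated rule strengthens both cortical inputs, whereas purely unisensory exposure never satisfies the gate and leaves them weak.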
6. Ito S, Feldheim DA. The Mouse Superior Colliculus: An Emerging Model for Studying Circuit Formation and Function. Front Neural Circuits 2018; 12:10. PMID: 29487505; PMCID: PMC5816945; DOI: 10.3389/fncir.2018.00010.
Abstract
The superior colliculus (SC) is a midbrain area where visual, auditory, and somatosensory information is integrated to initiate motor commands. The SC plays a central role in visual information processing in the mouse; it receives projections from 85% to 90% of the retinal ganglion cells (RGCs). While the mouse SC has been a long-standing model used to study retinotopic map formation, a number of technological advances in mouse molecular genetic techniques, large-scale physiological recordings, and SC-dependent visual behavioral assays have made the mouse an even more powerful model for understanding the relationship between circuitry and behavior.
Affiliation(s)
- Shinya Ito
- Santa Cruz Institute for Particle Physics, University of California, Santa Cruz, Santa Cruz, CA, United States
- David A Feldheim
- Department of Molecular, Cell and Developmental Biology, University of California, Santa Cruz, Santa Cruz, CA, United States
7. Sutton EE, Demir A, Stamper SA, Fortune ES, Cowan NJ. Dynamic modulation of visual and electrosensory gains for locomotor control. J R Soc Interface 2017; 13:rsif.2016.0057. PMID: 27170650; DOI: 10.1098/rsif.2016.0057.
Abstract
Animal nervous systems resolve sensory conflict for the control of movement. For example, the glass knifefish, Eigenmannia virescens, relies on visual and electrosensory feedback as it swims to maintain position within a moving refuge. To study how signals from these two parallel sensory streams are used in refuge tracking, we constructed a novel augmented reality apparatus that enables the independent manipulation of visual and electrosensory cues to freely swimming fish (n = 5). We evaluated the linearity of multisensory integration, the change to the relative perceptual weights given to vision and electrosense in relation to sensory salience, and the effect of the magnitude of sensory conflict on sensorimotor gain. First, we found that tracking behaviour obeys superposition of the sensory inputs, suggesting linear sensorimotor integration. In addition, fish rely more on vision when electrosensory salience is reduced, suggesting that fish dynamically alter sensorimotor gains in a manner consistent with Bayesian integration. However, the magnitude of sensory conflict did not significantly affect sensorimotor gain. These studies lay the theoretical and experimental groundwork for future work investigating multisensory control of locomotion.
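The Bayesian account invoked above is usually formalized as reliability-weighted cue combination. The Python sketch below, using hypothetical noise values rather than the paper's measurements, shows how degrading electrosensory reliability shifts weight toward vision, and why a linear integrator passes the superposition test applied in the study.

```python
import numpy as np

def reliability_weights(sigmas):
    # Maximum-likelihood cue weights: each weight is proportional to 1/variance.
    reliabilities = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    return reliabilities / reliabilities.sum()

# Hypothetical sensory noise (standard deviations) for the visual and electrosensory cues.
print(reliability_weights([1.0, 1.0]))   # equal reliability -> weights [0.5, 0.5]
print(reliability_weights([1.0, 2.0]))   # electrosense degraded -> weights [0.8, 0.2], more reliance on vision

# Superposition check: a linear integrator's response to combined cue motion equals
# the sum of its responses to each cue presented alone.
def linear_tracker(visual_input, electro_input, w_v=0.5, w_e=0.5):
    return w_v * visual_input + w_e * electro_input

v, e = 1.0, -0.5   # hypothetical refuge displacements signalled by each cue
assert np.isclose(linear_tracker(v, e),
                  linear_tracker(v, 0.0) + linear_tracker(0.0, e))
```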
Affiliation(s)
- Erin E Sutton
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA
- Alican Demir
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA
- Sarah A Stamper
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA
- Eric S Fortune
- Department of Biological Sciences, New Jersey Institute of Technology, Newark, NJ, USA
- Noah J Cowan
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA
8. Ohshiro T, Angelaki DE, DeAngelis GC. A Neural Signature of Divisive Normalization at the Level of Multisensory Integration in Primate Cortex. Neuron 2017; 95:399-411.e8. PMID: 28728025; DOI: 10.1016/j.neuron.2017.06.043.
Abstract
Studies of multisensory integration by single neurons have traditionally emphasized empirical principles that describe nonlinear interactions between inputs from two sensory modalities. We previously proposed that many of these empirical principles could be explained by a divisive normalization mechanism operating in brain regions where multisensory integration occurs. This normalization model makes a critical diagnostic prediction: a non-preferred sensory input from one modality, which activates the neuron on its own, should suppress the response to a preferred input from another modality. We tested this prediction by recording from neurons in macaque area MSTd that integrate visual and vestibular cues regarding self-motion. We show that many MSTd neurons exhibit the diagnostic form of cross-modal suppression, whereas unisensory neurons in area MT do not. The normalization model also fits population responses better than a model based on subtractive inhibition. These findings provide strong support for a divisive normalization mechanism in multisensory integration.
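As a toy illustration of the diagnostic prediction tested above, the following Python sketch applies divisive normalization to a small, made-up population; the dominance weights, exponent, and semi-saturation constant are arbitrary choices, not values fitted in the paper.

```python
import numpy as np

def normalized_responses(I_visual, I_vestibular, d_visual, d_vestibular, alpha=1.0, n=2.0):
    # Divisive normalization over a small population of multisensory units.
    # Each unit's feedforward drive is a weighted sum of the two modality inputs;
    # its output is that drive raised to a power and divided by the pooled
    # population activity plus a semi-saturation constant.
    drive = d_visual * I_visual + d_vestibular * I_vestibular
    num = np.maximum(drive, 0.0) ** n
    return num / (alpha ** n + num.mean())

# Hypothetical population: unit 0 is visually dominated, the rest prefer vestibular input.
d_vis = np.array([1.0, 0.2, 0.2, 0.2])
d_ves = np.array([0.1, 1.0, 1.0, 1.0])

r_visual_alone = normalized_responses(1.0, 0.0, d_vis, d_ves)
r_bimodal = normalized_responses(1.0, 0.6, d_vis, d_ves)   # add a non-preferred vestibular cue

# For unit 0 the weak vestibular input barely raises its own drive but strongly
# recruits the rest of the normalization pool, so its response to the preferred
# visual stimulus drops: the diagnostic cross-modal suppression.
print(r_visual_alone[0], r_bimodal[0])
```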
Affiliation(s)
- Tomokazu Ohshiro
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14611, USA; Department of Physiology, Tohoku University School of Medicine, Sendai 980-8575, Japan
- Dora E Angelaki
- Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA
- Gregory C DeAngelis
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14611, USA
9. Multisensory Integration Uses a Real-Time Unisensory-Multisensory Transform. J Neurosci 2017; 37:5183-5194. PMID: 28450539; DOI: 10.1523/jneurosci.2767-16.2017.
Abstract
The manner in which the brain integrates different sensory inputs to facilitate perception and behavior has been the subject of numerous speculations. By examining multisensory neurons in cat superior colliculus, the present study demonstrated that two operational principles are sufficient to understand how this remarkable result is achieved: (1) unisensory signals are integrated continuously and in real time as soon as they arrive at their common target neuron and (2) the resultant multisensory computation is modified in shape and timing by a delayed, calibrating inhibition. These principles were tested for descriptive sufficiency by embedding them in a neurocomputational model and using it to predict a neuron's moment-by-moment multisensory response given only knowledge of its responses to the individual modality-specific component cues. The predictions proved to be highly accurate, reliable, and unbiased and were, in most cases, not statistically distinguishable from the neuron's actual instantaneous multisensory response at any phase throughout its entire duration. The model was also able to explain why different multisensory products are often observed in different neurons at different time points, as well as the higher-order properties of multisensory integration, such as the dependency of multisensory products on the temporal alignment of crossmodal cues. These observations not only reveal this fundamental integrative operation, but also identify quantitatively the multisensory transform used by each neuron. As a result, they provide a means of comparing the integrative profiles among neurons and evaluating how they are affected by changes in intrinsic or extrinsic factors.
SIGNIFICANCE STATEMENT: Multisensory integration is the process by which the brain combines information from multiple sensory sources (e.g., vision and audition) to maximize an organism's ability to identify and respond to environmental stimuli. The actual transformative process by which the neural products of multisensory integration are achieved is poorly understood. By focusing on the millisecond-by-millisecond differences between a neuron's unisensory component responses and its integrated multisensory response, it was found that this multisensory transform can be described by two basic principles: unisensory information is integrated in real time and the multisensory response is shaped by calibrating inhibition. It is now possible to use these principles to predict a neuron's multisensory response accurately, armed only with knowledge of its unisensory responses.
10. Jiang H, Stein BE, McHaffie JG. Multisensory training reverses midbrain lesion-induced changes and ameliorates haemianopia. Nat Commun 2015; 6:7263. PMID: 26021613; PMCID: PMC6193257; DOI: 10.1038/ncomms8263.
Abstract
Failure to attend to visual cues is a common consequence of visual cortex injury. Here, we report on a behavioural strategy whereby cross-modal (auditory-visual) training reinstates visuomotor competencies in animals rendered haemianopic by complete unilateral visual cortex ablation. The re-emergence of visual behaviours is correlated with the reinstatement of visual responsiveness in deep layer neurons of the ipsilesional superior colliculus (SC). This functional recovery is produced by training-induced alterations in descending influences from association cortex that allowed these midbrain neurons to once again transform visual cues into appropriate orientation behaviours. The findings underscore the inherent plasticity and functional breadth of phylogenetically older visuomotor circuits that can express visual capabilities thought to have been subsumed by more recently evolved brain regions. These observations suggest the need for reevaluating current concepts of functional segregation in the visual system and have important implications for strategies aimed at ameliorating trauma-induced visual deficits in humans.
Affiliation(s)
- Huai Jiang
- Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Winston-Salem, North Carolina 27157-1010, USA
- Barry E Stein
- Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Winston-Salem, North Carolina 27157-1010, USA
- John G McHaffie
- Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Winston-Salem, North Carolina 27157-1010, USA
11. Stein BE, Stanford TR, Rowland BA. Development of multisensory integration from the perspective of the individual neuron. Nat Rev Neurosci 2014; 15:520-535. PMID: 25158358; DOI: 10.1038/nrn3742.
Abstract
The ability to use cues from multiple senses in concert is a fundamental aspect of brain function. It maximizes the brain’s use of the information available to it at any given moment and enhances the physiological salience of external events. Because each sense conveys a unique perspective of the external world, synthesizing information across senses affords computational benefits that cannot otherwise be achieved. Multisensory integration not only has substantial survival value but can also create unique experiences that emerge when signals from different sensory channels are bound together. However, neurons in a newborn’s brain are not capable of multisensory integration, and studies in the midbrain have shown that the development of this process is not predetermined. Rather, its emergence and maturation critically depend on cross-modal experiences that alter the underlying neural circuit in such a way that optimizes multisensory integrative capabilities for the environment in which the animal will function.
12. Rowland BA, Stein BE. A model of the temporal dynamics of multisensory enhancement. Neurosci Biobehav Rev 2013; 41:78-84. PMID: 24374382; DOI: 10.1016/j.neubiorev.2013.12.003.
Abstract
The senses transduce different forms of environmental energy, and the brain synthesizes information across them to enhance responses to salient biological events. We hypothesize that the potency of multisensory integration is attributable to the convergence of independent and temporally aligned signals derived from cross-modal stimulus configurations onto multisensory neurons. The temporal profile of multisensory integration in neurons of the deep superior colliculus (SC) is consistent with this hypothesis. The responses of these neurons to visual, auditory, and combinations of visual-auditory stimuli reveal that multisensory integration takes place in real-time; that is, the input signals are integrated as soon as they arrive at the target neuron. Interactions between cross-modal signals may appear to reflect linear or nonlinear computations on a moment-by-moment basis, the aggregate of which determines the net product of multisensory integration. Modeling observations presented here suggest that the early nonlinear components of the temporal profile of multisensory integration can be explained with a simple spiking neuron model, and do not require more sophisticated assumptions about the underlying biology. A transition from nonlinear "super-additive" computation to linear, additive computation can be accomplished via scaled inhibition. The findings provide a set of design constraints for artificial implementations seeking to exploit the basic principles and potency of biological multisensory integration in contexts of sensory substitution or augmentation.
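A rough sense of how a spiking threshold produces super-additive multisensory products, and how scaled inhibition pulls them back toward additivity, can be had from a leaky integrate-and-fire sketch in Python; all parameters below are invented and the model is far cruder than the one presented in the paper.

```python
def lif_spike_count(inputs, inhibition_scale=0.0, dt=1e-3, T=0.5,
                    tau=0.02, v_thresh=1.0, v_reset=0.0):
    # Leaky integrate-and-fire response to a constant drive.  `inhibition_scale`
    # subtracts inhibition proportional to the total input, the mechanism used
    # here to stand in for scaled inhibition pulling integration toward additivity.
    total = sum(inputs)
    drive = (1.0 - inhibition_scale) * total
    v, spikes = 0.0, 0
    for _ in range(int(T / dt)):
        v += dt / tau * (-v + drive)
        if v >= v_thresh:
            spikes += 1
            v = v_reset
    return spikes

A, V = 1.1, 1.1                                          # hypothetical near-threshold unisensory drives
print(lif_spike_count([A]), lif_spike_count([V]))        # weak unisensory responses
print(lif_spike_count([A, V]))                           # combined response exceeds their sum (super-additive)
print(lif_spike_count([A, V], inhibition_scale=0.3))     # scaled inhibition restores near-additivity
```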
Affiliation(s)
- Barry E Stein
- Wake Forest School of Medicine, Winston-Salem, NC 27157, United States
13. Lanz F, Moret V, Rouiller EM, Loquet G. Multisensory Integration in Non-Human Primates during a Sensory-Motor Task. Front Hum Neurosci 2013; 7:799. PMID: 24319421; PMCID: PMC3837444; DOI: 10.3389/fnhum.2013.00799.
Abstract
Daily, our central nervous system receives inputs via several sensory modalities, processes them, and integrates information in order to produce a suitable behavior. Remarkably, such multisensory integration binds all of this information into a unified percept. One way to begin investigating this property is to show that perception is better and faster when multimodal stimuli are used as compared to unimodal stimuli. This forms the first part of the present study, conducted in a non-human primate model (n = 2) engaged in a detection sensory-motor task in which visual and auditory stimuli were displayed individually or simultaneously. The measured parameters were the reaction time (RT) between stimulus and onset of arm movement, the percentages of successes and errors, and the evolution of these parameters with training. As expected, RTs were shorter when the subjects were exposed to combined stimuli. The gains for both subjects were around 20 and 40 ms, as compared with the auditory and visual stimulus alone, respectively. Moreover, the number of correct responses increased in response to bimodal stimuli. We interpreted this multisensory advantage in terms of the redundant signal effect, which decreases perceptual ambiguity, increases the speed of stimulus detection, and improves performance accuracy. The second part of the study presents single-unit recordings derived from the premotor cortex (PM) of the same subjects during the sensory-motor task. Response patterns to sensory/multisensory stimulation are documented and the proportions of specific response types are reported. Characterization of bimodal neurons indicates a mechanism of audio-visual integration, possibly through a decrease of inhibition. Nevertheless, the neural processing leading to a faster motor response from PM, a polysensory association cortical area, remains unclear.
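The redundant signal effect invoked above is commonly evaluated against Miller's race-model inequality. The Python sketch below illustrates that test on simulated reaction times; the distributions are hypothetical stand-ins for the animals' data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical reaction-time samples in seconds; real data would come from the task.
rt_auditory = rng.normal(0.30, 0.04, 2000)
rt_visual = rng.normal(0.32, 0.04, 2000)
rt_bimodal = rng.normal(0.27, 0.03, 2000)

def ecdf(samples, t):
    # Empirical cumulative distribution function evaluated at time t.
    return np.mean(samples <= t)

# Miller's race-model inequality: a parallel race between independent unisensory
# detectors predicts F_AV(t) <= F_A(t) + F_V(t).  Violations at fast RTs point to
# genuine integration (coactivation) rather than mere statistical facilitation.
for t in np.linspace(0.20, 0.32, 7):
    bound = min(1.0, ecdf(rt_auditory, t) + ecdf(rt_visual, t))
    print(f"t={t:.3f}s  F_AV={ecdf(rt_bimodal, t):.2f}  race-model bound={bound:.2f}")
```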
Affiliation(s)
- Florian Lanz
- Domain of Physiology, Department of Medicine, Fribourg Cognition Center, University of Fribourg, Fribourg, Switzerland
14. Nagalski A, Irimia M, Szewczyk L, Ferran JL, Misztal K, Kuznicki J, Wisniewska MB. Postnatal isoform switch and protein localization of LEF1 and TCF7L2 transcription factors in cortical, thalamic, and mesencephalic regions of the adult mouse brain. Brain Struct Funct 2013; 218:1531-1549. PMID: 23152144; PMCID: PMC3825142; DOI: 10.1007/s00429-012-0474-6.
Abstract
β-Catenin signaling, leading to the activation of lymphoid enhancer-binding factor 1/T cell factor (LEF1/TCF) transcription factors, plays a well-established role in transcription regulation during development and tissue homeostasis. In the adult organism, the activity of this pathway has been found in stem cell niches and postmitotic thalamic neurons. Recent studies show that mutations in components of β-catenin signaling networks are associated with several psychiatric disorders, indicating the involvement of β-catenin and LEF1/TCF proteins in the proper functioning of the brain. Here, we report a comprehensive analysis of LEF1/TCF protein localization and the expression profile of their isoforms in cortical, thalamic, and midbrain regions in mice. We detected LEF1 and TCF7L2 proteins in neurons of the thalamus and dorsal midbrain, i.e., subcortical regions specialized in the integration of diverse sources of sensory information. These neurons also exhibited nuclear localization of β-catenin, suggesting the involvement of β-catenin/LEF1/TCF7L2 in the regulation of gene expression in these regions. Analysis of alternative splicing and promoter usage identified brain-specific TCF7L2 isoforms and revealed a developmentally coordinated transition in the composition of LEF1 and TCF7L2 isoforms. In the case of TCF7L2, the typical brain isoforms lack the so-called C clamp; in addition, the dominant-negative isoforms are predominant in the embryonic thalamus but disappear postnatally. The present study provides a necessary framework to understand the role of LEF1/TCF factors in thalamic and midbrain development until adulthood and predicts that the regulatory role of these proteins in the adult brain is significantly different from their role in the embryonic brain or other non-neural tissues.
Affiliation(s)
- A. Nagalski
- Laboratory of Neurodegeneration, International Institute of Molecular and Cell Biology, 4 Ks. Trojdena Street, 02-109 Warsaw, Poland
- M. Irimia
- Banting and Best Department of Medical Research, Donnelly Centre, University of Toronto, Toronto, ON M5S 3E1, Canada
- L. Szewczyk
- Laboratory of Neurodegeneration, International Institute of Molecular and Cell Biology, 4 Ks. Trojdena Street, 02-109 Warsaw, Poland
- J. L. Ferran
- Department of Human Anatomy and Psychobiology, School of Medicine, University of Murcia, E30071 Murcia, Spain
- K. Misztal
- Laboratory of Neurodegeneration, International Institute of Molecular and Cell Biology, 4 Ks. Trojdena Street, 02-109 Warsaw, Poland
- J. Kuznicki
- Laboratory of Neurodegeneration, International Institute of Molecular and Cell Biology, 4 Ks. Trojdena Street, 02-109 Warsaw, Poland
- Department of Molecular and Cellular Neurobiology, Nencki Institute of Experimental Biology, 3 Pasteur Street, 02-093 Warsaw, Poland
- M. B. Wisniewska
- Laboratory of Neurodegeneration, International Institute of Molecular and Cell Biology, 4 Ks. Trojdena Street, 02-109 Warsaw, Poland
15. A neural network model can explain ventriloquism aftereffect and its generalization across sound frequencies. Biomed Res Int 2013; 2013:475427. PMID: 24228250; PMCID: PMC3818813; DOI: 10.1155/2013/475427.
Abstract
Exposure to synchronous but spatially disparate auditory and visual stimuli produces a perceptual shift of sound location towards the visual stimulus (ventriloquism effect). After adaptation to a ventriloquism situation, an enduring shift in perceived sound location is observed in the absence of the visual stimulus (ventriloquism aftereffect). Experimental studies report conflicting results regarding aftereffect generalization across sound frequencies, ranging from an aftereffect confined to the frequency used during adaptation to one that generalizes across several octaves. Here, we present an extension of a model of visual-auditory interaction we previously developed. The new model is able to simulate the ventriloquism effect and, via Hebbian learning rules, the ventriloquism aftereffect, and can be used to investigate aftereffect generalization across frequencies. The model includes auditory neurons coding for both the spatial and spectral features of the auditory stimuli and mimicking properties of biological auditory neurons. The model suggests that different extents of aftereffect generalization across frequencies can be obtained by changing the intensity of the auditory stimulus, which induces different amounts of activation in the auditory layer. The model provides a coherent theoretical framework to explain the apparently contradictory results found in the literature. Model mechanisms and hypotheses are discussed in relation to neurophysiological and psychophysical data.
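A stripped-down version of the Hebbian recalibration idea can be sketched in Python as follows; this toy omits the frequency dimension that the paper's model uses to study generalization, and the tuning widths, visual gain, and learning rate are invented.

```python
import numpy as np

azimuths = np.arange(-30, 31, 2.0)          # preferred azimuths (deg) of the space-coding layer

def gaussian_pop(center, width=8.0):
    # Population activity over `azimuths` for a stimulus at `center`.
    return np.exp(-0.5 * ((azimuths - center) / width) ** 2)

# Topographic feedforward weights from the auditory layer to the space-coding layer.
W = np.array([gaussian_pop(c, width=4.0) for c in azimuths])   # W[i, j]: auditory j -> space i

def decode_sound(W, sound_az):
    return azimuths[np.argmax(W @ gaussian_pop(sound_az))]

print("pre-adaptation estimate :", decode_sound(W, 0.0))       # 0 deg

# Adaptation: a sound at 0 deg repeatedly paired with a visual cue at +8 deg.
# The visual input captures the space-layer response (ventriloquism effect) and a
# Hebbian rule strengthens auditory synapses onto those visually driven units.
eta = 0.005
for _ in range(200):
    aud = gaussian_pop(0.0)                                    # presynaptic auditory activity
    post = (W @ aud) / (W @ aud).max() + 3.0 * gaussian_pop(8.0)
    W += eta * np.outer(post, aud)                             # Hebbian potentiation

print("post-adaptation estimate:", decode_sound(W, 0.0))       # shifted toward the visual location
```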
16
|
Bridging the gap between theories of sensory cue integration and the physiology of multisensory neurons. Nat Rev Neurosci 2013; 14:429-42. [PMID: 23686172 DOI: 10.1038/nrn3503] [Citation(s) in RCA: 185] [Impact Index Per Article: 16.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/15/2022]
Abstract
The richness of perceptual experience, as well as its usefulness for guiding behaviour, depends on the synthesis of information across multiple senses. Recent decades have witnessed a surge in our understanding of how the brain combines sensory cues. Much of this research has been guided by one of two distinct approaches: one is driven primarily by neurophysiological observations, and the other is guided by principles of mathematical psychology and psychophysics. Conflicting results and interpretations have contributed to a conceptual gap between psychophysical and physiological accounts of cue integration, but recent studies of visual-vestibular cue integration have narrowed this gap considerably.
17. Yu L, Xu J, Rowland BA, Stein BE. Development of cortical influences on superior colliculus multisensory neurons: effects of dark-rearing. Eur J Neurosci 2013; 37:1594-1601. PMID: 23534923; DOI: 10.1111/ejn.12182.
Abstract
Rearing cats from birth to adulthood in darkness prevents neurons in the superior colliculus (SC) from developing the capability to integrate visual and non-visual (e.g. visual-auditory) inputs. Presumably, this developmental anomaly is due to a lack of experience with the combination of those cues, which is essential to form associative links between them. The visual-auditory multisensory integration capacity of SC neurons has also been shown to depend on the functional integrity of converging visual and auditory inputs from the ipsilateral association cortex. Disrupting these cortico-collicular projections at any stage of life results in a pattern of outcomes similar to those found after dark-rearing; SC neurons respond to stimuli in both sensory modalities, but cannot integrate the information they provide. Thus, it is possible that dark-rearing compromises the development of these descending tecto-petal connections and the essential influences they convey. However, the results of the present experiments, using cortical deactivation to assess the presence of cortico-collicular influences, demonstrate that dark-rearing does not prevent the association cortex from developing robust influences over SC multisensory responses. In fact, dark-rearing may increase their potency over that observed in normally-reared animals. Nevertheless, their influences are still insufficient to support SC multisensory integration. It appears that cross-modal experience shapes the cortical influence to selectively enhance responses to cross-modal stimulus combinations that are likely to be derived from the same event. In the absence of this experience, the cortex develops an indiscriminate excitatory influence over its multisensory SC target neurons.
Affiliation(s)
- Liping Yu
- School of Life Science, East China Normal University, Shanghai 200062, China
18. Cuppini C, Magosso E, Rowland B, Stein B, Ursino M. Hebbian mechanisms help explain development of multisensory integration in the superior colliculus: a neural network model. Biol Cybern 2012; 106:691-713. PMID: 23011260; PMCID: PMC3552306; DOI: 10.1007/s00422-012-0511-9.
Abstract
The superior colliculus (SC) integrates relevant sensory information (visual, auditory, somatosensory) from several cortical and subcortical structures to program orientation responses to external events. However, this capacity is not present at birth, and it is acquired only through interactions with cross-modal events during maturation. Mathematical models provide a quantitative framework, valuable in helping to clarify the specific neural mechanisms underlying the maturation of multisensory integration in the SC. We extended a neural network model of the adult SC (Cuppini et al., Front Integr Neurosci 4:1-15, 2010) to describe the development of this phenomenon starting from an immature state, based on known or suspected anatomy and physiology, in which (1) AES afferents are present but weak, (2) responses are driven by non-AES afferents, and (3) the visual inputs have only marginal spatial tuning. Sensory experience was modeled by repeatedly presenting modality-specific and cross-modal stimuli. Synapses in the network were modified by simple Hebbian learning rules. As a consequence of this exposure, (1) receptive fields shrank and came into spatial register, and (2) SC neurons gained the characteristic adult integrative properties: enhancement, depression, and inverse effectiveness. Importantly, the unique architecture of the model guided the development so that integration became dependent on the relationship between the cortical input and the SC. Manipulating the statistics of experience during development changed the integrative profiles of the neurons, and the results matched well with those of physiological studies.
Affiliation(s)
- C Cuppini
- Department of Electronics, Computer Science and Systems, University of Bologna, Bologna, Italy
19. Yu L, Rowland BA, Xu J, Stein BE. Multisensory plasticity in adulthood: cross-modal experience enhances neuronal excitability and exposes silent inputs. J Neurophysiol 2012; 109:464-474. PMID: 23114212; DOI: 10.1152/jn.00739.2012.
Abstract
Multisensory superior colliculus neurons in cats were found to retain substantial plasticity to short-term, site-specific experience with cross-modal stimuli well into adulthood. Following cross-modal exposure trials, these neurons substantially increased their sensitivity to the cross-modal stimulus configuration as well as to its individual component stimuli. In many cases, the exposure experience also revealed a previously ineffective or "silent" input channel, rendering it overtly responsive. These experience-induced changes required relatively few exposure trials and could be retained for more than 1 h. However, their induction was generally restricted to experience with cross-modal stimuli. Only rarely were they induced by exposure to a modality-specific stimulus and were never induced by stimulating a previously ineffective input channel. This short-term plasticity likely provides substantial benefits to the organism in dealing with ongoing and sequential events that take place at a given location in space and may reflect the ability of multisensory superior colliculus neurons to rapidly alter their response properties to accommodate to changes in environmental challenges and event probabilities.
Affiliation(s)
- Liping Yu
- Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Winston-Salem, North Carolina 27157-1010, USA
20. Ghose D, Barnett ZP, Wallace MT. Impact of response duration on multisensory integration. J Neurophysiol 2012; 108:2534-2544. PMID: 22896723; DOI: 10.1152/jn.00286.2012.
Abstract
Multisensory neurons in the superior colliculus (SC) have been shown to have large receptive fields that are heterogeneous in nature. These neurons have the capacity to integrate their different sensory inputs, a process that has been shown to depend on the physical characteristics of the stimuli that are combined (i.e., spatial and temporal relationship and relative effectiveness). Recent work has highlighted the interdependence of these factors in driving multisensory integration, adding a layer of complexity to our understanding of multisensory processes. In the present study our goal was to add to this understanding by characterizing how stimulus location impacts the temporal dynamics of multisensory responses in cat SC neurons. The results illustrate that locations within the spatial receptive fields (SRFs) of these neurons can be divided into those showing short-duration responses and long-duration response profiles. Most importantly, discharge duration appears to be a good determinant of multisensory integration, such that short-duration responses are typically associated with a high magnitude of multisensory integration (i.e., superadditive responses) while long-duration responses are typically associated with low integrative capacity. These results further reinforce the complexity of the integrative features of SC neurons and show that the large SRFs of these neurons are characterized by vastly differing temporal dynamics, dynamics that strongly shape the integrative capacity of these neurons.
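For reference, the integration magnitude discussed above is typically quantified with the enhancement and additivity indices sketched below in Python; the spike counts are hypothetical examples, not values from the study.

```python
def multisensory_metrics(resp_visual, resp_auditory, resp_combined):
    # Standard single-neuron indices used in the SC literature (spike counts or rates).
    # enhancement : percent gain of the multisensory response over the best unisensory response.
    # additivity  : ratio of the multisensory response to the sum of the unisensory
    #               responses (>1 is super-additive, <1 is sub-additive).
    best_unisensory = max(resp_visual, resp_auditory)
    enhancement = 100.0 * (resp_combined - best_unisensory) / best_unisensory
    additivity = resp_combined / (resp_visual + resp_auditory)
    return enhancement, additivity

# Hypothetical spike counts from two receptive-field locations of one neuron.
print(multisensory_metrics(4, 3, 12))   # short-duration site: 200% enhancement, super-additive
print(multisensory_metrics(10, 8, 14))  # long-duration site: 40% enhancement, sub-additive
```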
Affiliation(s)
- Dipanwita Ghose
- Department of Psychology, Vanderbilt University, Nashville, Tennessee 37240, USA
21. Lim HK, Keniston LP, Cios KJ. Modeling of Multisensory Convergence with a Network of Spiking Neurons: A Reverse Engineering Approach. IEEE Trans Biomed Eng 2011; 58:1940-1949. DOI: 10.1109/tbme.2011.2125962.
22. Cuppini C, Stein BE, Rowland BA, Magosso E, Ursino M. A computational study of multisensory maturation in the superior colliculus (SC). Exp Brain Res 2011; 213:341-349. PMID: 21556818; DOI: 10.1007/s00221-011-2714-z.
Abstract
Multisensory neurons in cat SC exhibit significant postnatal maturation. The first multisensory neurons to appear have large receptive fields (RFs) and cannot integrate information across sensory modalities. During the first several months of postnatal life RFs contract, responses become more robust and neurons develop the capacity for multisensory integration. Recent data suggest that these changes depend on both sensory experience and active inputs from association cortex. Here, we extend a computational model we developed (Cuppini et al. in Front Integr Neurosci 22: 4-6, 2010) using a limited set of biologically realistic assumptions to describe how this maturational process might take place. The model assumes that during early life, cortical-SC synapses are present but not active and that responses are driven by non-cortical inputs with very large RFs. Sensory experience is modeled by a "training phase" in which the network is repeatedly exposed to modality-specific and cross-modal stimuli at different locations. Cortical-SC synaptic weights are modified during this period as a result of Hebbian rules of potentiation and depression. The result is that RFs are reduced in size and neurons become capable of responding in adult-like fashion to modality-specific and cross-modal stimuli.
Affiliation(s)
- Cristiano Cuppini
- Department of Electronics, Computer Science and Systems, University of Bologna, Bologna, Italy
23. A normalization model of multisensory integration. Nat Neurosci 2011; 14:775-782. PMID: 21552274; PMCID: PMC3102778; DOI: 10.1038/nn.2815.
Abstract
Responses of neurons that integrate multiple sensory inputs are traditionally characterized in terms of a set of empirical principles. However, a simple computational framework that accounts for these empirical features of multisensory integration has not been established. We propose that divisive normalization, acting at the stage of multisensory integration, can account for many of the empirical principles of multisensory integration shown by single neurons, such as the principle of inverse effectiveness and the spatial principle. This model, which uses a simple functional operation (normalization) for which there is considerable experimental support, also accounts for the recent observation that the mathematical rule by which multisensory neurons combine their inputs changes with cue reliability. The normalization model, which makes a strong testable prediction regarding cross-modal suppression, may therefore provide a simple unifying computational account of the important features of multisensory integration by neurons.
24. Cuppini C, Magosso E, Ursino M. Organization, maturation, and plasticity of multisensory integration: insights from computational modeling studies. Front Psychol 2011; 2:77. PMID: 21687448; PMCID: PMC3110383; DOI: 10.3389/fpsyg.2011.00077.
Abstract
In this paper, we present two neural network models – devoted to two specific and widely investigated aspects of multisensory integration – in order to demonstrate the potential of computational models for gaining insight into the neural mechanisms underlying the organization, development, and plasticity of multisensory integration in the brain. The first model considers visual–auditory interaction in a midbrain structure named the superior colliculus (SC). The model is able to reproduce and explain the main physiological features of multisensory integration in SC neurons and to describe how SC integrative capability – not present at birth – develops gradually during postnatal life depending on sensory experience with cross-modal stimuli. The second model tackles the problem of how tactile stimuli on a body part and visual (or auditory) stimuli close to the same body part are integrated in multimodal parietal neurons to form the perception of peripersonal (i.e., near) space. The model investigates how the extension of peripersonal space – where multimodal integration occurs – may be modified by experience such as use of a tool to interact with the far space. The utility of the modeling approach rests on several aspects: (i) The two models, although devoted to different problems and simulating different brain regions, share some common mechanisms (lateral inhibition and excitation, non-linear neuron characteristics, recurrent connections, competition, Hebbian rules of potentiation and depression) that may govern more generally the fusion of senses in the brain, and the learning and plasticity of multisensory integration. (ii) The models may help interpret behavioral and psychophysical responses in terms of neural activity and synaptic connections. (iii) The models can make testable predictions that can help guide future experiments aimed at validating, rejecting, or modifying the main assumptions.
Affiliation(s)
- Cristiano Cuppini
- Department of Electronics, Computer Science and Systems, University of Bologna, Bologna, Italy
25. Lim HK, Keniston LP, Shin JH, Allman BL, Meredith MA, Cios KJ. Connectional parameters determine multisensory processing in a spiking network model of multisensory convergence. Exp Brain Res 2011; 213:329-339. PMID: 21484394; DOI: 10.1007/s00221-011-2671-6.
Abstract
For the brain to synthesize information from different sensory modalities, connections from different sensory systems must converge onto individual neurons. However, despite being the definitive first step in the multisensory process, multisensory convergence at the neuronal level remains poorly understood. This lack of knowledge may be due to the difficulty of manipulating and testing, in biological experiments, the connectional parameters that define convergence. Therefore, the present study used a computational network of spiking neurons to measure the influence of convergence from two separate projection areas on the responses of neurons in a convergent area. Systematic changes in the proportion of extrinsic projections, the proportion of intrinsic connections, or the amount of local inhibitory contacts affected the multisensory properties of neurons in the convergent area by influencing (1) the proportion of multisensory neurons generated, (2) the proportion of neurons that generate integrated multisensory responses, and (3) the magnitude of multisensory integration. These simulations provide insight into the connectional parameters of convergence that contribute to the generation of populations of multisensory neurons in different neural regions and indicate that the simple effect of multisensory convergence is sufficient to generate multisensory properties like those of biological multisensory neurons.
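The purely anatomical part of this manipulation, how the density of extrinsic projections alone determines the proportion of bisensory target units, can be illustrated with a few lines of Python; the connection probabilities and population sizes are arbitrary, and the sketch ignores the spiking dynamics, intrinsic connections, and inhibition included in the full model.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_convergence(n_target=200, n_source=200, p_proj_a=0.15, p_proj_b=0.15):
    # Randomly wire two source (unisensory) areas onto a target area and count how
    # many target units become bisensory, i.e. receive at least one afferent from
    # each source.  The connection probabilities stand in for the proportion of
    # extrinsic projections manipulated in the network model.
    conn_a = rng.random((n_source, n_target)) < p_proj_a
    conn_b = rng.random((n_source, n_target)) < p_proj_b
    gets_a = conn_a.any(axis=0)
    gets_b = conn_b.any(axis=0)
    return np.mean(gets_a & gets_b)          # fraction of multisensory target units

for p in (0.002, 0.01, 0.05):
    print(f"projection density {p:.3f} -> multisensory fraction {simulate_convergence(p_proj_a=p, p_proj_b=p):.2f}")
```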
Affiliation(s)
- H K Lim
- Department of Computer Science, School of Engineering, Virginia Commonwealth University School of Medicine, Richmond, VA, USA