51. Henschke JU, Oelschlegel AM, Angenstein F, Ohl FW, Goldschmidt J, Kanold PO, Budinger E. Early sensory experience influences the development of multisensory thalamocortical and intracortical connections of primary sensory cortices. Brain Struct Funct 2018; 223:1165-1190. [PMID: 29094306] [PMCID: PMC5871574] [DOI: 10.1007/s00429-017-1549-1]
Abstract
The nervous system integrates information from multiple senses. This multisensory integration already occurs in primary sensory cortices via direct thalamocortical and corticocortical connections across modalities. In humans, sensory loss from birth results in functional recruitment of the deprived cortical territory by the spared senses but the underlying circuit changes are not well known. Using tracer injections into primary auditory, somatosensory, and visual cortex within the first postnatal month of life in a rodent model (Mongolian gerbil) we show that multisensory thalamocortical connections emerge before corticocortical connections but mostly disappear during development. Early auditory, somatosensory, or visual deprivation increases multisensory connections via axonal reorganization processes mediated by non-lemniscal thalamic nuclei and the primary areas themselves. Functional single-photon emission computed tomography of regional cerebral blood flow reveals altered stimulus-induced activity and higher functional connectivity specifically between primary areas in deprived animals. Together, we show that intracortical multisensory connections are formed as a consequence of sensory-driven multisensory thalamocortical activity and that spared senses functionally recruit deprived cortical areas by an altered development of sensory thalamocortical and corticocortical connections. The functional-anatomical changes after early sensory deprivation have translational implications for the therapy of developmental hearing loss, blindness, and sensory paralysis and might also underlie developmental synesthesia.
Affiliation(s)
- Julia U Henschke
- Department Systems Physiology of Learning, Leibniz Institute for Neurobiology, Brenneckestr. 6, 39118, Magdeburg, Germany
- German Center for Neurodegenerative Diseases Within the Helmholtz Association, Leipziger Str. 44, 39120, Magdeburg, Germany
- Institute of Cognitive Neurology and Dementia Research (IKND), Otto-von-Guericke-University Magdeburg, Leipziger Str. 44, 39120, Magdeburg, Germany
- Center for Behavioral Brain Sciences, Universitätsplatz 2, 39120, Magdeburg, Germany
- Anja M Oelschlegel
- Research Group Neuropharmacology, Leibniz Institute for Neurobiology, Brenneckestr. 6, 39118, Magdeburg, Germany
- Institute of Anatomy, Otto-von-Guericke-University Magdeburg, Leipziger Str. 44, 39120, Magdeburg, Germany
- Frank Angenstein
- Functional Neuroimaging Group, German Center for Neurodegenerative Diseases Within the Helmholtz Association, Leipziger Str. 44, 39120, Magdeburg, Germany
- Center for Behavioral Brain Sciences, Universitätsplatz 2, 39120, Magdeburg, Germany
- Frank W Ohl
- Department Systems Physiology of Learning, Leibniz Institute for Neurobiology, Brenneckestr. 6, 39118, Magdeburg, Germany
- Institute of Biology, Otto-von-Guericke-University Magdeburg, Leipziger Str. 44, 39120, Magdeburg, Germany
- Center for Behavioral Brain Sciences, Universitätsplatz 2, 39120, Magdeburg, Germany
- Jürgen Goldschmidt
- Department Systems Physiology of Learning, Leibniz Institute for Neurobiology, Brenneckestr. 6, 39118, Magdeburg, Germany
- Center for Behavioral Brain Sciences, Universitätsplatz 2, 39120, Magdeburg, Germany
- Patrick O Kanold
- Department of Biology, University of Maryland, College Park, MD, 20742, USA
- Eike Budinger
- Department Systems Physiology of Learning, Leibniz Institute for Neurobiology, Brenneckestr. 6, 39118, Magdeburg, Germany
- Center for Behavioral Brain Sciences, Universitätsplatz 2, 39120, Magdeburg, Germany
52. Atilgan H, Town SM, Wood KC, Jones GP, Maddox RK, Lee AKC, Bizley JK. Integration of Visual Information in Auditory Cortex Promotes Auditory Scene Analysis through Multisensory Binding. Neuron 2018; 97:640-655.e4. [PMID: 29395914] [PMCID: PMC5814679] [DOI: 10.1016/j.neuron.2017.12.034]
Abstract
How and where in the brain audio-visual signals are bound to create multimodal objects remains unknown. One hypothesis is that temporal coherence between dynamic multisensory signals provides a mechanism for binding stimulus features across sensory modalities. Here, we report that when the luminance of a visual stimulus is temporally coherent with the amplitude fluctuations of one sound in a mixture, the representation of that sound is enhanced in auditory cortex. Critically, this enhancement extends to include both binding and non-binding features of the sound. We demonstrate that visual information conveyed from visual cortex via the phase of the local field potential is combined with auditory information within auditory cortex. These data provide evidence that early cross-sensory binding provides a bottom-up mechanism for the formation of cross-sensory objects and that one role for multisensory binding in auditory cortex is to support auditory scene analysis.
Highlights:
- Visual stimuli can shape how auditory cortical neurons respond to sound mixtures
- Temporal coherence between senses enhances sound features of a bound multisensory object
- Visual stimuli elicit changes in the phase of the local field potential in auditory cortex
- Vision-induced phase effects are lost when visual cortex is reversibly silenced
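The temporal-coherence manipulation described above, a visual luminance signal that tracks the slow amplitude envelope of one sound in a mixture, can be sketched as a toy simulation. This is an illustrative sketch only: the sampling rate, low-pass cutoff, and signal construction are assumptions, not the study's stimulus code.

```python
import numpy as np

rng = np.random.default_rng(2)
fs, dur = 100, 10.0                 # 100 Hz envelope sampling, 10 s trial
t = np.arange(0, dur, 1 / fs)

def slow_envelope(n, cutoff=7.0):
    """Low-pass Gaussian noise, normalized to [0, 1]: a stand-in for the
    slow (<~7 Hz) amplitude/luminance modulations used in such experiments."""
    spec = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n, 1 / fs)
    spec[freqs > cutoff] = 0        # keep only slow modulations
    env = np.fft.irfft(spec, n)
    return (env - env.min()) / (env.max() - env.min())

target = slow_envelope(t.size)              # amplitude envelope of the target sound
luminance_coherent = target                 # visual luminance tracks the target
luminance_incoherent = slow_envelope(t.size)  # independently drawn luminance

# Temporal coherence can be quantified as the envelope correlation.
print(np.corrcoef(luminance_coherent, target)[0, 1])    # ~1: coherent by construction
print(np.corrcoef(luminance_incoherent, target)[0, 1])  # near zero: incoherent
```

In the actual experiment, it is this coherence (or its absence) between the luminance trace and one sound's envelope that determines which auditory stream the visual signal binds to.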
Affiliation(s)
- Huriye Atilgan
- The Ear Institute, University College London, London, UK
- Stephen M Town
- The Ear Institute, University College London, London, UK
- Gareth P Jones
- The Ear Institute, University College London, London, UK
- Ross K Maddox
- Department of Biomedical Engineering and Department of Neuroscience, Del Monte Institute for Neuroscience, University of Rochester, Rochester, NY, USA; Institute for Learning and Brain Sciences and Department of Speech and Hearing Sciences, University of Washington, Seattle, WA, USA
- Adrian K C Lee
- Institute for Learning and Brain Sciences and Department of Speech and Hearing Sciences, University of Washington, Seattle, WA, USA
53. Boyle SC, Kayser SJ, Kayser C. Neural correlates of multisensory reliability and perceptual weights emerge at early latencies during audio-visual integration. Eur J Neurosci 2017; 46:2565-2577. [PMID: 28940728] [PMCID: PMC5725738] [DOI: 10.1111/ejn.13724]
Abstract
To make accurate perceptual estimates, observers must take the reliability of sensory information into account. Despite many behavioural studies showing that subjects weight individual sensory cues in proportion to their reliabilities, it is still unclear when during a trial neuronal responses are modulated by the reliability of sensory information or when they reflect the perceptual weights attributed to each sensory input. We investigated these questions using a combination of psychophysics, EEG‐based neuroimaging and single‐trial decoding. Our results show that the weighted integration of sensory information in the brain is a dynamic process; effects of sensory reliability on task‐relevant EEG components were evident 84 ms after stimulus onset, while neural correlates of perceptual weights emerged 120 ms after stimulus onset. These neural processes had different underlying sources, arising from sensory and parietal regions, respectively. Together these results reveal the temporal dynamics of perceptual and neural audio‐visual integration and support the notion of temporally early and functionally specific multisensory processes in the brain.
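The reliability weighting this study probes follows the standard maximum-likelihood cue-combination account, in which each cue is weighted by its inverse variance and the fused estimate is less variable than either cue alone. A minimal numerical sketch (the stimulus value, noise levels, and variable names are illustrative, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Single-trial unimodal estimates of a common stimulus value.
true_val = 10.0
sigma_a, sigma_v = 1.0, 2.0          # auditory cue is more reliable here

# Reliability = inverse variance; optimal weights are proportional to it.
r_a, r_v = 1 / sigma_a**2, 1 / sigma_v**2
w_a, w_v = r_a / (r_a + r_v), r_v / (r_a + r_v)

n = 100_000
est_a = true_val + sigma_a * rng.standard_normal(n)   # auditory-only estimates
est_v = true_val + sigma_v * rng.standard_normal(n)   # visual-only estimates
est_av = w_a * est_a + w_v * est_v                    # reliability-weighted fusion

# The fused variance matches the prediction 1 / (r_a + r_v) and beats both cues.
print(w_a, w_v)                          # prints: 0.8 0.2
print(est_av.var(), 1 / (r_a + r_v))
```

Behavioral studies estimate the empirical weights (here w_a, w_v) from subjects' bimodal judgements; the EEG question in this paper is when neural signals first reflect the reliabilities versus the weights themselves.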
Affiliation(s)
- Stephanie C Boyle
- Institute of Neuroscience and Psychology, University of Glasgow, Hillhead Street 58, Glasgow, G12 8QB, UK
- Stephanie J Kayser
- Institute of Neuroscience and Psychology, University of Glasgow, Hillhead Street 58, Glasgow, G12 8QB, UK
- Christoph Kayser
- Institute of Neuroscience and Psychology, University of Glasgow, Hillhead Street 58, Glasgow, G12 8QB, UK
54. Gohil K, Bluschke A, Roessner V, Stock A, Beste C. Sensory processes modulate differences in multi-component behavior and cognitive control between childhood and adulthood. Hum Brain Mapp 2017; 38:4933-4945. [PMID: 28660637] [PMCID: PMC6867046] [DOI: 10.1002/hbm.23705]
Abstract
Many everyday tasks require executive functions to achieve a certain goal. Quite often, this requires the integration of information derived from different sensory modalities. Children are less likely than adults to integrate information from different modalities and, at the same time, do not yet command fully developed executive functions. Yet the role of developmental age-related effects on multisensory integration processes has not been examined within the context of multi-component behavior (i.e., the concatenation of different executive subprocesses) until now. This is problematic because differences in multisensory integration might actually explain a significant amount of the developmental effects that have traditionally been attributed to changes in executive functioning. We therefore examined this question using a systems neurophysiological approach combining electroencephalogram (EEG) recordings and source localization analyses. The results show that differences in how children and adults accomplish multi-component behavior do not solely depend on developmental differences in executive functioning. Instead, the observed developmental differences in response selection processes (reflected by the P3 ERP) were largely dependent on the complexity of integrating temporally separated stimuli from different modalities. This effect was related to activation differences in medial frontal and inferior parietal cortices. Primary perceptual gating and attentional selection processes (P1 and N1 ERPs) were not affected. The results show that differences in multisensory integration explain part of the transformation in cognitive processes between childhood and adulthood that has traditionally been attributed to changes in executive functioning, especially when tasks require the integration of multiple modalities during response selection.
Affiliation(s)
- Krutika Gohil
- Cognitive Neurophysiology, Department of Child and Adolescent Psychiatry, Faculty of Medicine, TU Dresden, Germany
- Annet Bluschke
- Cognitive Neurophysiology, Department of Child and Adolescent Psychiatry, Faculty of Medicine, TU Dresden, Germany
- Veit Roessner
- Cognitive Neurophysiology, Department of Child and Adolescent Psychiatry, Faculty of Medicine, TU Dresden, Germany
- Ann‐Kathrin Stock
- Cognitive Neurophysiology, Department of Child and Adolescent Psychiatry, Faculty of Medicine, TU Dresden, Germany
- Christian Beste
- Cognitive Neurophysiology, Department of Child and Adolescent Psychiatry, Faculty of Medicine, TU Dresden, Germany
- Experimental Neurobiology, National Institute of Mental Health, Klecany, Czech Republic
55.
Abstract
Swift action is often required in the face of indeterminate sensory evidence. In this issue of Neuron, Song et al. (2017) describe an inhibitory circuit in the posterior parietal cortex that evaluates conflicting auditory and visual cues and supports resolute perceptual decision making.
Affiliation(s)
- Daniel B Polley
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear Infirmary, Department of Otolaryngology, Harvard Medical School, Boston, MA 02114, USA
56. Serruya MD. Connecting the Brain to Itself through an Emulation. Front Neurosci 2017; 11:373. [PMID: 28713235] [PMCID: PMC5492113] [DOI: 10.3389/fnins.2017.00373]
Abstract
Pilot clinical trials of human patients implanted with devices that can chronically record and stimulate ensembles of hundreds to thousands of individual neurons offer the possibility of expanding the substrate of cognition. Parallel trains of firing rate activity can be delivered in real-time to an array of intermediate external modules that in turn can trigger parallel trains of stimulation back into the brain. These modules may be built in software, VLSI firmware, or biological tissue as in vitro culture preparations or in vivo ectopic construct organoids. Arrays of modules can be constructed as early stage whole brain emulators, following canonical intra- and inter-regional circuits. By using machine learning algorithms and classic tasks known to activate quasi-orthogonal functional connectivity patterns, bedside testing can rapidly identify ensemble tuning properties and in turn cycle through a sequence of external module architectures to explore which can causatively alter perception and behavior. Whole brain emulation both (1) serves to augment human neural function, compensating for disease and injury as an auxiliary parallel system, and (2) has its independent operation bootstrapped by a human-in-the-loop to identify optimal micro- and macro-architectures, update synaptic weights, and entrain behaviors. In this manner, closed-loop brain-computer interface pilot clinical trials can advance strong artificial intelligence development and forge new therapies to restore independence in children and adults with neurological conditions.
Affiliation(s)
- Mijail D Serruya
- Neurology, Thomas Jefferson University, Philadelphia, PA, United States
57. Kayser SJ, Philiastides MG, Kayser C. Sounds facilitate visual motion discrimination via the enhancement of late occipital visual representations. Neuroimage 2017; 148:31-41. [PMID: 28082107] [PMCID: PMC5349847] [DOI: 10.1016/j.neuroimage.2017.01.010]
Abstract
Sensory discriminations, such as judgements about visual motion, often benefit from multisensory evidence. Despite many reports of enhanced brain activity during multisensory conditions, it remains unclear which dynamic processes implement the multisensory benefit for an upcoming decision in the human brain. Specifically, it remains difficult to attribute perceptual benefits to specific processes, such as early sensory encoding or the transformation of sensory representations into a motor response, or to more unspecific processes such as attention. We combined an audio-visual motion discrimination task with the single-trial mapping of dynamic sensory representations in EEG activity to localize when and where multisensory congruency facilitates perceptual accuracy. Our results show that a congruent sound facilitates the encoding of motion direction in occipital sensory (as opposed to parieto-frontal) cortices, and facilitates later (rather than early, i.e. below 100 ms) sensory activations. This multisensory enhancement was visible as an earlier rise of motion-sensitive activity in middle-occipital regions about 350 ms from stimulus onset, which reflected the better discriminability of motion direction from brain activity and correlated with the perceptual benefit provided by congruent multisensory information. This supports a hierarchical model of multisensory integration in which the enhancement of relevant sensory cortical representations is transformed into a more accurate choice.
Highlights:
- Feature-specific multisensory integration occurs in sensory, not amodal, cortex
- Feature-specific integration occurs late, i.e. around 350 ms post stimulus onset
- Acoustic and visual representations interact in occipital motion regions
Affiliation(s)
- Stephanie J Kayser
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK
- Christoph Kayser
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK
58. Chandrasekaran C. Computational principles and models of multisensory integration. Curr Opin Neurobiol 2016; 43:25-34. [PMID: 27918886] [DOI: 10.1016/j.conb.2016.11.002]
Abstract
Combining information from multiple senses creates robust percepts, speeds up responses, enhances learning, and improves detection, discrimination, and recognition. In this review, I discuss computational models and principles that provide insight into how this process of multisensory integration occurs at the behavioral and neural level. My initial focus is on drift-diffusion and Bayesian models that can predict behavior in multisensory contexts. I then highlight how recent neurophysiological and perturbation experiments provide evidence for a distributed redundant network for multisensory integration. I also emphasize studies which show that task-relevant variables in multisensory contexts are distributed in heterogeneous neural populations. Finally, I describe dimensionality reduction methods and recurrent neural network models that may help decipher heterogeneous neural populations involved in multisensory integration.
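Of the behavioral models this review surveys, the drift-diffusion account of multisensory response speedups is straightforward to simulate: under a coactivation assumption, evidence from the two modalities sums, so the combined drift rate is higher and the decision bound is crossed sooner. The sketch below is a generic single-bound diffusion simulation with illustrative parameters, not code from any of the reviewed studies.

```python
import numpy as np

rng = np.random.default_rng(1)

def ddm_rts(drift, bound=1.0, noise=1.0, dt=2e-3, n_trials=2000, max_t=4.0):
    """First-passage times of a drift-diffusion process to a single upper bound."""
    steps = int(max_t / dt)
    # Simulate all trials at once: cumulative sum of Gaussian increments.
    inc = drift * dt + noise * np.sqrt(dt) * rng.standard_normal((n_trials, steps))
    paths = inc.cumsum(axis=1)
    crossed = paths >= bound
    hit = crossed.any(axis=1)         # trials that reach the bound within max_t
    first = crossed.argmax(axis=1)    # index of the first crossing
    return (first[hit] + 1) * dt      # first-passage times in seconds

# Coactivation: multisensory evidence streams add, so drift rates sum.
rt_a = ddm_rts(drift=1.0)     # auditory alone
rt_v = ddm_rts(drift=1.0)     # visual alone
rt_av = ddm_rts(drift=2.0)    # audio-visual: summed drift, earlier crossings

print(rt_a.mean(), rt_av.mean())  # multisensory mean RT is shorter
```

Fitting such models to unimodal and bimodal reaction-time distributions is one way the reviewed work tests whether multisensory behavior reflects genuine evidence combination rather than statistical facilitation.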