1
Schryver HM, Mysore SP. Distinct neural mechanisms construct classical versus extraclassical inhibitory surrounds in an inhibitory nucleus in the midbrain attention network. Nat Commun 2023; 14:3400. PMID: 37296109; PMCID: PMC10256684; DOI: 10.1038/s41467-023-39073-5.
Abstract
Inhibitory neurons in the midbrain spatial attention network, called isthmi pars magnocellularis (Imc), control stimulus selection by the sensorimotor and attentional hub, the optic tectum (OT). Here, we investigate in the barn owl how classical as well as extraclassical (global) inhibitory surrounds of Imc receptive fields (RFs), fundamental units of Imc computational function, are constructed. We find that focal, reversible blockade of GABAergic input onto Imc neurons disconnects their extraclassical inhibitory surrounds, but leaves intact their classical inhibitory surrounds. Subsequently, with paired recordings and iontophoresis, first at spatially aligned site-pairs in Imc and OT, and then, at mutually distant site-pairs within Imc, we demonstrate that classical inhibitory surrounds of Imc RFs are inherited from OT, but their extraclassical inhibitory surrounds are constructed within Imc. These results reveal key design principles of the midbrain spatial attention circuit and highlight the critical importance of competitive interactions within Imc for its operation.
Affiliation(s)
- Hannah M Schryver
- Department of Psychological and Brain Sciences, Johns Hopkins University, Baltimore, MD, 21218, USA
- Current affiliation: Allen Institute, Seattle, WA, USA
- Shreesh P Mysore
- Department of Psychological and Brain Sciences, Johns Hopkins University, Baltimore, MD, 21218, USA.
- The Solomon H. Snyder Department of Neuroscience, Johns Hopkins School of Medicine, Baltimore, MD, 21205, USA.
- Kavli Neuroscience Discovery Institute, Johns Hopkins University, Baltimore, MD, 21218, USA.
2
Shadron K, Peña JL. Development of frequency tuning shaped by spatial cue reliability in the barn owl's auditory midbrain. eLife 2023; 12:e84760. PMID: 37166099; PMCID: PMC10238092; DOI: 10.7554/eLife.84760.
Abstract
Sensory systems preferentially strengthen responses to stimuli based on their reliability at conveying accurate information. While previous reports demonstrate that the brain reweighs cues based on dynamic changes in reliability, how the brain may learn and maintain neural responses to sensory statistics expected to be stable over time is unknown. The barn owl's midbrain features a map of auditory space where neurons compute horizontal sound location from the interaural time difference (ITD). Frequency tuning of midbrain map neurons correlates with the most reliable frequencies for the neurons' preferred ITD (Cazettes et al., 2014). Removal of the facial ruff led to a specific decrease in the reliability of high frequencies from frontal space. To directly test whether permanent changes in ITD reliability drive frequency tuning, midbrain map neurons were recorded from adult owls, with the facial ruff removed during development, and juvenile owls, before facial ruff development. In both groups, frontally tuned neurons were tuned to frequencies lower than in normal adult owls, consistent with the change in ITD reliability. In addition, juvenile owls exhibited more heterogeneous frequency tuning, suggesting normal developmental processes refine tuning to match ITD reliability. These results indicate causality of long-term statistics of spatial cues in the development of midbrain frequency tuning properties, implementing probabilistic coding for sound localization.
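The ITD cue at the heart of this study can be sketched with a toy binaural model: estimate the ITD as the lag of the peak cross-correlation between the two ears' signals, with independent per-ear noise standing in for cue (un)reliability. This is an illustrative sketch with invented parameters (sampling rate, delay, noise levels), not the authors' analysis:

```python
import random

def smoothed_noise(n, width, rng):
    """Band-limited 'noise': white noise smoothed by a moving average.
    Narrower smoothing leaves more high-frequency content, which sharpens
    the cross-correlation peak (i.e., makes the ITD cue more reliable)."""
    white = [rng.gauss(0, 1) for _ in range(n + width)]
    return [sum(white[i:i + width]) / width for i in range(n)]

def peak_lag(a, b, max_lag):
    """Lag (in samples) maximizing the cross-correlation of a and b.
    A positive result means a is a delayed copy of b."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        val = sum(a[n] * b[n - lag] for n in range(max_lag, len(a) - max_lag))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

US_PER_SAMPLE = 10       # 100 kHz sampling: one sample per 10 microseconds
TRUE_ITD_SAMPLES = 5     # 50 microseconds, within the barn owl's range

rng = random.Random(0)
sig = smoothed_noise(2000, 4, rng)
left = sig[TRUE_ITD_SAMPLES:]    # left ear hears the source earlier...
right = sig[:-TRUE_ITD_SAMPLES]  # ...right ear receives a delayed copy
# Independent noise per ear models cue reliability; a frequency band with
# more such noise (e.g., high frequencies after ruff removal) would yield
# a less reliable ITD estimate.
left = [x + rng.gauss(0, 0.1) for x in left]
right = [x + rng.gauss(0, 0.1) for x in right]

est_itd_us = peak_lag(right, left, max_lag=20) * US_PER_SAMPLE
```

Repeating this with heavier per-ear noise in a given band is one way to picture why neurons tuned to frontal space might shift toward the frequencies whose ITD estimates stay reliable.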
Affiliation(s)
- Keanu Shadron
- Dominick P Purpura Department of Neuroscience, Albert Einstein College of Medicine, Bronx, United States
- José Luis Peña
- Dominick P Purpura Department of Neuroscience, Albert Einstein College of Medicine, Bronx, United States
3
Giret N, Rolland M, Del Negro C. Multisensory processes in birds: from single neurons to the influence of social interactions and sensory loss. Neurosci Biobehav Rev 2022; 143:104942. DOI: 10.1016/j.neubiorev.2022.104942.
4
Abstract
In the rapid serial visual presentation (RSVP) paradigm, response accuracy for a target decreases when it appears within a short time window (200–500 ms) after a previous target, a phenomenon termed the attentional blink (AB). Although mechanisms of cross-modal processing that reduce the AB have been documented, researchers have not explored the differences across modal attentional conditions. In the present study, we used the RSVP paradigm to investigate the effect of auditory-driven visual target perceptual enhancement on the AB under modality-specific selective attention (Experiment 1) and bimodal-divided attention (Experiment 2). The results showed that cross-modal attentional enhancement was not moderated by stimulus salience. Moreover, accuracy was higher when the attended sound appeared simultaneously with the target. These results indicate that audiovisual enhancement reduced the AB and that stronger attentional enhancement in the bimodal-divided attention condition led to the disappearance of the AB.
5
Vestibular Stimulation May Drive Multisensory Processing: Principles for Targeted Sensorimotor Therapy (TSMT). Brain Sci 2021; 11:1111. PMID: 34439730; PMCID: PMC8393350; DOI: 10.3390/brainsci11081111.
Abstract
At birth, the vestibular system is fully mature, whilst higher-order sensory processing is yet to develop in the full-term neonate. The current paper lays out a theoretical framework to account for the role vestibular stimulation may have in driving multisensory and sensorimotor integration. Accordingly, vestibular stimulation, by activating the parieto-insular vestibular cortex and/or the posterior parietal cortex, may provide the cortical input to multisensory neurons in the superior colliculus that is needed for multisensory processing. Furthermore, we propose that motor development, by inducing changes of reference frames, may shape the receptive fields of multisensory neurons. This, by removing spatial contingency between formerly contingent stimuli, may cause degradation of prior motor responses. Additionally, we offer a testable hypothesis explaining the beneficial effect of sensory integration therapies on attentional processes. Key concepts of a sensorimotor integration therapy (e.g., targeted sensorimotor therapy, TSMT) are also put into a neurological context. TSMT utilizes specific tools and instruments and is administered in successive 8-week treatment regimens, each gradually increasing vestibular and postural stimulation, so that sensorimotor integration is facilitated and muscle strength is increased. Empirically, TSMT is indicated for various diseases. Theoretical foundations of this sensorimotor therapy are discussed.
6
Abstract
Coordination between different sensory systems is a necessary element of sensory processing. Where and how signals from different sense organs converge onto common neural circuitry have become topics of increasing interest in recent years. In this article, we focus specifically on visual-auditory interactions in areas of the mammalian brain that are commonly considered to be auditory in function. The auditory cortex and inferior colliculus are two key points of entry where visual signals reach the auditory pathway, and both contain visual- and/or eye movement-related signals in humans and other animals. The visual signals observed in these auditory structures reflect a mixture of visual modulation of auditory-evoked activity and visually driven responses that are selective for stimulus location or features. These key response attributes also appear in the classic visual pathway but may play a different role in the auditory pathway: to modify auditory rather than visual perception. Finally, while this review focuses on two particular areas of the auditory pathway where this question has been studied, robust descending as well as ascending connections within this pathway suggest that undiscovered visual signals may be present at other stages as well. Expected final online publication date for the Annual Review of Vision Science, Volume 7 is September 2021. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.
Affiliation(s)
- Meredith N Schmehl
- Department of Neurobiology, Duke University, Durham, North Carolina 27708, USA; Center for Cognitive Neuroscience, Duke University, Durham, North Carolina 27708, USA; Duke Institute for Brain Sciences, Duke University, Durham, North Carolina 27708, USA
- Jennifer M Groh
- Department of Neurobiology, Duke University, Durham, North Carolina 27708, USA; Department of Psychology & Neuroscience, Duke University, Durham, North Carolina 27708, USA; Department of Computer Science, Duke University, Durham, North Carolina 27708, USA; Department of Biomedical Engineering, Duke University, Durham, North Carolina 27708, USA; Center for Cognitive Neuroscience, Duke University, Durham, North Carolina 27708, USA; Duke Institute for Brain Sciences, Duke University, Durham, North Carolina 27708, USA
7
Visual Influences on Auditory Behavioral, Neural, and Perceptual Processes: A Review. J Assoc Res Otolaryngol 2021; 22:365-386. PMID: 34014416; PMCID: PMC8329114; DOI: 10.1007/s10162-021-00789-0.
Abstract
In a naturalistic environment, auditory cues are often accompanied by information from other senses, which can be redundant with or complementary to the auditory information. Although the multisensory interactions derived from this combination of information and that shape auditory function are seen across all sensory modalities, our greatest body of knowledge to date centers on how vision influences audition. In this review, we attempt to capture the state of our understanding at this point in time regarding this topic. Following a general introduction, the review is divided into 5 sections. In the first section, we review the psychophysical evidence in humans regarding vision's influence in audition, making the distinction between vision's ability to enhance versus alter auditory performance and perception. Three examples are then described that serve to highlight vision's ability to modulate auditory processes: spatial ventriloquism, cross-modal dynamic capture, and the McGurk effect. The final part of this section discusses models that have been built based on available psychophysical data and that seek to provide greater mechanistic insights into how vision can impact audition. The second section reviews the extant neuroimaging and far-field imaging work on this topic, with a strong emphasis on the roles of feedforward and feedback processes, on imaging insights into the causal nature of audiovisual interactions, and on the limitations of current imaging-based approaches. These limitations point to a greater need for machine-learning-based decoding approaches toward understanding how auditory representations are shaped by vision. The third section reviews the wealth of neuroanatomical and neurophysiological data from animal models that highlights audiovisual interactions at the neuronal and circuit level in both subcortical and cortical structures. 
It also speaks to the functional significance of audiovisual interactions for two critically important facets of auditory perception-scene analysis and communication. The fourth section presents current evidence for alterations in audiovisual processes in three clinical conditions: autism, schizophrenia, and sensorineural hearing loss. These changes in audiovisual interactions are postulated to have cascading effects on higher-order domains of dysfunction in these conditions. The final section highlights ongoing work seeking to leverage our knowledge of audiovisual interactions to develop better remediation approaches to these sensory-based disorders, founded in concepts of perceptual plasticity in which vision has been shown to have the capacity to facilitate auditory learning.
8
Cheng L, Guo ZY, Qu YL. Cross-modality modulation of auditory midbrain processing of intensity information. Hear Res 2020; 395:108042. PMID: 32810721; DOI: 10.1016/j.heares.2020.108042.
Abstract
In nature, animals constantly receive a multitude of sensory stimuli: visual, auditory, and somatosensory. Integration across sensory modalities is advantageous for the precise processing of sensory inputs, which is essential for animals' survival. Although some principles of cross-modality integration have been revealed by many studies, little insight has been gained into its functional role. In this study, the functional influence of cross-modality modulation on auditory processing of intensity information was investigated by recording neuronal activity in the auditory midbrain (inferior colliculus, IC) under visual, auditory, and audiovisual stimulation, respectively. Results demonstrated that combined audiovisual stimuli either enhanced or suppressed the responses of IC neurons compared to auditory stimuli alone, even though the same visual stimuli alone induced no response. Audiovisual modulation appeared to be strongest when the combined audiovisual stimuli were located at a neuron's best auditory azimuth and when presented at near-threshold intensities. Additionally, the rate-intensity function of IC neurons to auditory stimuli was expanded or compressed by audiovisual modulation in a manner highly dependent on the neuron's minimal threshold (MT): neurons with lower MTs showed greater audiovisual modulation, indicating an intensity-specific enhancement of auditory intensity sensitivity by cross-modality modulation. Overall, the evidence suggests a potential functional role of cross-modality modulation in the IC that serves to instruct adaptive plasticity and enhance the auditory perception of intensity information.
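The reported near-threshold "expansion" of the rate-intensity function can be pictured with a toy model: a sigmoid rate-intensity function multiplied by a cross-modal gain that peaks just above threshold and fades at high levels. Both parameterizations below are invented for illustration, not fitted to the study's data:

```python
import math

def rate_intensity(level_db, mt_db, max_rate=100.0, slope=0.4):
    """Sigmoid rate-intensity function (spikes/s vs. sound level in dB SPL).
    mt_db is the neuron's minimal threshold; responses grow just above it."""
    return max_rate / (1.0 + math.exp(-slope * (level_db - mt_db - 10.0)))

def audiovisual_gain(level_db, mt_db, strength=0.5, width_db=10.0):
    """Hypothetical cross-modal gain: strongest at near-threshold levels
    (as reported for IC neurons), negligible at high intensities."""
    return 1.0 + strength * math.exp(-0.5 * ((level_db - mt_db - 5.0) / width_db) ** 2)

mt = 20.0
levels = list(range(0, 90, 10))
auditory = [rate_intensity(l, mt) for l in levels]
audiovisual = [rate_intensity(l, mt) * audiovisual_gain(l, mt) for l in levels]

# Enhancement is intensity specific: large near threshold, negligible when loud.
boost_near_threshold = audiovisual[3] / auditory[3]   # at 30 dB, near MT
boost_loud = audiovisual[8] / auditory[8]             # at 80 dB
```

Making `strength` shrink as `mt_db` grows would reproduce the reported MT dependence (lower-threshold neurons showing larger audiovisual modulation).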
Affiliation(s)
- Liang Cheng
- School of Psychology & Key Laboratory of Adolescent Cyberpsychology and Behavior (CCNU) of Ministry of Education, Central China Normal University, Wuhan, 430079, China; School of Life Sciences & Hubei Key Lab of Genetic Regulation and Integrative Biology, Central China Normal University, Wuhan, 430079, China
- Zhao-Yang Guo
- School of Psychology & Key Laboratory of Adolescent Cyberpsychology and Behavior (CCNU) of Ministry of Education, Central China Normal University, Wuhan, 430079, China
- Yi-Li Qu
- School of Psychology & Key Laboratory of Adolescent Cyberpsychology and Behavior (CCNU) of Ministry of Education, Central China Normal University, Wuhan, 430079, China
9
Dutta A, Lev-Ari T, Barzilay O, Mairon R, Wolf A, Ben-Shahar O, Gutfreund Y. Self-motion trajectories can facilitate orientation-based figure-ground segregation. J Neurophysiol 2020; 123:912-926. PMID: 31967932; DOI: 10.1152/jn.00439.2019.
Abstract
Segregation of objects from the background is a basic and essential property of the visual system. We studied the neural detection of objects defined by orientation difference from the background in barn owls (Tyto alba). We presented wide-field displays of densely packed stripes with a dominant orientation. Visual objects were created by orienting a circular patch differently from the background. In head-fixed conditions, neurons in both the tecto- and thalamofugal visual pathways (optic tectum and visual Wulst) were weakly responsive to these objects in their receptive fields. Notably, however, in freely viewing conditions barn owls occasionally perform peculiar side-to-side head motions (peering) when scanning the environment. In the second part of the study we thus recorded the neural response from head-fixed owls while the visual displays replicated the peering conditions; i.e., the displays (objects and backgrounds) were shifted along trajectories that induced a retinal motion identical to sampled peering motions during viewing of a static object. These conditions induced dramatic neural responses to the objects, in the very same neurons that were unresponsive to the objects in static displays. By reverting to circular motions of the display, we show that the pattern of the neural response is mostly shaped by the orientation of the background relative to motion and not the orientation of the object. Thus our findings provide evidence that peering and/or other self-motions can facilitate orientation-based figure-ground segregation through interaction with inhibition from the surround. NEW & NOTEWORTHY: Animals frequently move their sensory organs and thereby create motion cues that can enhance object segregation from the background. We address a special example of such active sensing in barn owls. When scanning the environment, barn owls occasionally perform small-amplitude side-to-side head movements called peering.
We show that the visual outcome of such peering movements elicits neural detection of objects that are rotated from the dominant orientation of the background scene and that are otherwise mostly undetected. These results suggest a novel role for self-motion in sensing objects that break the regular orientation of elements in the scene.
Affiliation(s)
- Arkadeb Dutta
- The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, The Technion, Haifa, Israel
- Tidhar Lev-Ari
- The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, The Technion, Haifa, Israel
- Ouriel Barzilay
- Faculty of Mechanical Engineering, The Technion, Haifa, Israel
- Rotem Mairon
- Department of Computer Science, Ben-Gurion University of the Negev, Beer-Sheva, Israel
- Alon Wolf
- Faculty of Mechanical Engineering, The Technion, Haifa, Israel
- Ohad Ben-Shahar
- Department of Computer Science, Ben-Gurion University of the Negev, Beer-Sheva, Israel; The Zlotowski Center for Neuroscience Research, Ben-Gurion University of the Negev, Beer-Sheva, Israel
- Yoram Gutfreund
- The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, The Technion, Haifa, Israel
10
Dong CM, Leong ATL, Manno FA, Lau C, Ho LC, Chan RW, Feng Y, Gao PP, Wu EX. Functional MRI Investigation of Audiovisual Interactions in Auditory Midbrain. Annu Int Conf IEEE Eng Med Biol Soc 2018; 2018:5527-5530. PMID: 30441589; DOI: 10.1109/embc.2018.8513629.
Abstract
The brain integrates information from different sensory modalities to form a representation of the environment and facilitate behavioral responses. The auditory midbrain or inferior colliculus (IC) is a pivotal station in the auditory system, integrating ascending and descending information from various auditory sources and cortical systems. The present study investigated the modulation of auditory responses in the IC by visual stimuli of different frequencies and intensities in rats using functional MRI (fMRI). Low-frequency (1 Hz) high-intensity visual stimulus suppressed IC auditory responses. However, high-frequency (10 Hz) or low-intensity visual stimuli did not alter the IC auditory responses. This finding demonstrates that cross-modal processing occurs in the IC in a manner that depends on the stimulus. Furthermore, only low-frequency high-intensity visual stimulus elicited responses in non-visual cortical regions, suggesting that the above cross-modal modulation effect may arise from top-down cortical feedback. These fMRI results provide insight to guide future studies of cross-modal processing in sensory pathways.
11
Visual input shapes the auditory frequency responses in the inferior colliculus of mouse. Hear Res 2019; 381:107777. PMID: 31430633; DOI: 10.1016/j.heares.2019.107777.
Abstract
The integration of visual and auditory information is important for humans and animals to build an accurate and coherent perception of the external world. Although some evidence has established principles of audiovisual integration, little insight has been gained into its functional purpose. In this study, we investigated the functional influence of dynamic visual input on auditory frequency processing by recording single-unit activity in the central nucleus of the inferior colliculus (ICc). Results showed that the auditory responses of ICc neurons to sound frequencies could be enhanced or suppressed by visual stimuli, even though the same visual stimuli induced no neural responses when presented alone. For each ICc neuron, the most effective visual stimuli were located at the same azimuth as the auditory stimuli and preceded them by ∼20 ms. Additionally, visual stimuli could steepen or flatten the frequency tuning curves (FTCs) of ICc neurons through varying visual effects at each responsive frequency. The degree of modulation of auditory FTCs depended on the minimal thresholds (MTs) of ICc neurons: as MTs increased, the modulation degree decreased. Because the distribution of MTs was non-homogeneous, with a minimum at 10 kHz, visual modulation of auditory FTCs was frequency specific: the closer a neuron's characteristic frequency (CF) was to 10 kHz, the greater the modulation. Thus, visual modulation of auditory frequency responses in the ICc depends not only on the visual stimulus but also on the auditory characteristics of ICc neurons. These results suggest a moment-to-moment visual modulation of auditory frequency responses that in real time increases auditory frequency sensitivity to audiovisual stimuli. In the long term, such modulation could serve to instruct auditory adaptive plasticity to maintain accurate auditory detection and perceptual behavior.
12
A Bayesian Account of Vocal Adaptation to Pitch-Shifted Auditory Feedback. PLoS One 2017; 12:e0169795. PMID: 28135267; PMCID: PMC5279726; DOI: 10.1371/journal.pone.0169795.
Abstract
Motor systems are highly adaptive. Both birds and humans compensate for synthetically induced shifts in the pitch (fundamental frequency) of auditory feedback stemming from their vocalizations. Pitch-shift compensation is partial in the sense that large shifts lead to smaller relative compensatory adjustments of vocal pitch than small shifts. Also, compensation is larger in subjects with high motor variability. To formulate a mechanistic description of these findings, we adapt a Bayesian model of error relevance. We assume that vocal-auditory feedback loops in the brain cope optimally with known sensory and motor variability. Based on measurements of motor variability, optimal compensatory responses in our model provide accurate fits to published experimental data. Optimal compensation correctly predicts sensory acuity, which has been estimated in psychophysical experiments as just-noticeable pitch differences. Our model extends the utility of Bayesian approaches to adaptive vocal behaviors.
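The Bayesian logic behind partial compensation can be sketched as a relevance inference: small pitch shifts are probably self-produced errors (compensate), large shifts are probably externally caused (ignore), and higher motor variability widens the range judged self-produced. The model below is a minimal sketch in that spirit; the prior, densities, and parameter values are invented, not the paper's fitted model:

```python
import math

def compensation_fraction(shift, sigma_self=0.5, p_self=0.9, bg_density=0.05):
    """Posterior probability that a pitch shift of `shift` semitones reflects
    a self-produced error (and should be compensated). Assumes self-produced
    errors are Gaussian with std sigma_self (motor variability) while
    irrelevant external shifts have a flat density bg_density."""
    like_self = math.exp(-0.5 * (shift / sigma_self) ** 2) / (sigma_self * math.sqrt(2 * math.pi))
    num = p_self * like_self
    return num / (num + (1 - p_self) * bg_density)

small = compensation_fraction(0.25)                 # small shift: mostly compensated
large = compensation_fraction(2.0)                  # large shift: largely ignored
noisy = compensation_fraction(2.0, sigma_self=1.5)  # higher motor variability
```

The compensatory pitch change would then be (minus) `compensation_fraction(shift) * shift`, so relative compensation shrinks as shifts grow, and subjects with larger motor variability (`sigma_self`) compensate more for the same large shift, qualitatively matching the abstract's two findings.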
13
Gao PP, Zhang JW, Fan SJ, Sanes DH, Wu EX. Auditory midbrain processing is differentially modulated by auditory and visual cortices: An auditory fMRI study. Neuroimage 2015; 123:22-32. PMID: 26306991; DOI: 10.1016/j.neuroimage.2015.08.040.
Abstract
The cortex contains extensive descending projections, yet the impact of cortical input on brainstem processing remains poorly understood. In the central auditory system, the auditory cortex contains direct and indirect pathways (via brainstem cholinergic cells) to nuclei of the auditory midbrain, called the inferior colliculus (IC). While these projections modulate auditory processing throughout the IC, single-neuron recordings have sampled only a small fraction of cells during stimulation of the corticofugal pathway. Furthermore, assessments of cortical feedback have not been extended to sensory modalities other than audition. To address these issues, we devised blood-oxygen-level-dependent (BOLD) functional magnetic resonance imaging (fMRI) paradigms to measure sound-evoked responses throughout the rat IC and investigated the effects of bilateral ablation of either the auditory or the visual cortex. Auditory cortex ablation increased the gain of IC responses to noise stimuli (primarily in the central nucleus of the IC) and decreased response selectivity to forward species-specific vocalizations (versus temporally reversed ones; most prominently in the external cortex of the IC). In contrast, visual cortex ablation decreased the gain and had a much smaller effect on response selectivity. The results suggest that auditory cortical projections normally exert a large-scale and net suppressive influence on specific IC subnuclei, while visual cortical projections provide a facilitatory influence; meanwhile, auditory cortical projections enhance midbrain response selectivity to species-specific vocalizations. We also probed the role of the auditory system's indirect cholinergic projections in this descending modulation by pharmacologically blocking muscarinic cholinergic receptors. This manipulation did not affect the gain of IC responses but significantly reduced response selectivity to vocalizations.
The results imply that auditory cortical gain modulation is mediated primarily through direct projections and they point to future investigations of the differential roles of the direct and indirect projections in corticofugal modulation. In summary, our imaging findings demonstrate the large-scale descending influences, from both the auditory and visual cortices, on sound processing in different IC subdivisions. They can guide future studies on the coordinated activity across multiple regions of the auditory network, and its dysfunctions.
Affiliation(s)
- Patrick P Gao
- Laboratory of Biomedical Imaging and Signal Processing, The University of Hong Kong, Pokfulam, Hong Kong SAR, China; Department of Electrical and Electronic Engineering, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Jevin W Zhang
- Laboratory of Biomedical Imaging and Signal Processing, The University of Hong Kong, Pokfulam, Hong Kong SAR, China; Department of Electrical and Electronic Engineering, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Shu-Juan Fan
- Laboratory of Biomedical Imaging and Signal Processing, The University of Hong Kong, Pokfulam, Hong Kong SAR, China; Department of Electrical and Electronic Engineering, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Dan H Sanes
- Center for Neural Science, New York University, New York, NY 10003, United States
- Ed X Wu
- Laboratory of Biomedical Imaging and Signal Processing, The University of Hong Kong, Pokfulam, Hong Kong SAR, China; Department of Electrical and Electronic Engineering, The University of Hong Kong, Pokfulam, Hong Kong SAR, China; Department of Anatomy, The University of Hong Kong, Pokfulam, Hong Kong SAR, China; Department of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
14
Lagorce X, Stromatias E, Galluppi F, Plana LA, Liu SC, Furber SB, Benosman RB. Breaking the millisecond barrier on SpiNNaker: implementing asynchronous event-based plastic models with microsecond resolution. Front Neurosci 2015; 9:206. PMID: 26106288; PMCID: PMC4458614; DOI: 10.3389/fnins.2015.00206.
Abstract
Spike-based neuromorphic sensors such as retinas and cochleas change the way in which the world is sampled. Instead of producing data at a constant rate, these sensors output spikes that are asynchronous and event driven. The event-based nature of neuromorphic sensors implies a complete paradigm shift in current perception algorithms toward those that emphasize the importance of precise timing. The spikes produced by these sensors usually have a time resolution on the order of microseconds. This high temporal resolution is a crucial factor in learning tasks and is also exploited by biological neural networks: sound localization, for instance, relies on detecting time lags between the two ears, which in the barn owl reaches a temporal resolution of 5 μs. Currently available neuromorphic computation platforms such as SpiNNaker often limit their users to a time resolution on the order of milliseconds that is not compatible with the asynchronous outputs of neuromorphic sensors. To overcome these limitations and allow for the exploration of new types of neuromorphic computing architectures, we introduce a novel software framework on the SpiNNaker platform. This framework allows for simulations of spiking networks and plasticity mechanisms using a completely asynchronous, event-based scheme running with microsecond time resolution. Results on two example networks using this new implementation are presented.
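The asynchronous, event-based scheme can be pictured host-side with a priority queue keyed on integer microsecond timestamps: no fixed time step, events are processed strictly in time order, and synaptic delays can be as fine as the timestamp unit. This is not the SpiNNaker or PyNN API, just a minimal illustrative sketch:

```python
import heapq

class EventSim:
    """Minimal asynchronous event-driven spike simulator. Timestamps are
    integers in microseconds, so delays far below 1 ms (e.g., the owl's
    5 us ITD scale) are representable without a global clock tick."""
    def __init__(self):
        self.queue = []   # min-heap of (time_us, neuron_id)
        self.log = []     # processed spikes, in time order

    def push(self, t_us, nid):
        heapq.heappush(self.queue, (t_us, nid))

    def run(self, synapses, delays_us):
        # synapses[nid] -> list of target neuron ids
        # delays_us[nid] -> axonal delay (us) for spikes leaving nid
        while self.queue:
            t, nid = heapq.heappop(self.queue)
            self.log.append((t, nid))
            for tgt in synapses.get(nid, []):
                # Deliver with microsecond-accurate delay; a plasticity rule
                # could use the exact spike-time differences here.
                self.push(t + delays_us[nid], tgt)

sim = EventSim()
sim.push(0, 0)                              # inject one spike into neuron 0
sim.run({0: [1], 1: []}, {0: 5, 1: 5})      # 5 us axonal delay from 0 to 1
```

In a fixed-step millisecond simulator both spikes would land in the same tick; the event queue preserves their 5 μs separation, which is the point of the framework described above.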
Affiliation(s)
- Xavier Lagorce
- Equipe de Vision et Calcul Naturel, Centre National de la Recherche Scientifique UMR 7210, UMR S968 Inserm, Vision Institute, CHNO des Quinze-Vingts, Université Pierre et Marie Curie, Paris, France
- Evangelos Stromatias
- Advanced Processors Technologies Research Group, School of Computer Science, University of Manchester, Manchester, UK
- Francesco Galluppi
- Equipe de Vision et Calcul Naturel, Centre National de la Recherche Scientifique UMR 7210, UMR S968 Inserm, Vision Institute, CHNO des Quinze-Vingts, Université Pierre et Marie Curie, Paris, France
- Luis A Plana
- Advanced Processors Technologies Research Group, School of Computer Science, University of Manchester, Manchester, UK
- Shih-Chii Liu
- Institute of Neuroinformatics, University of Zürich and ETH Zürich, Zürich, Switzerland
- Steve B Furber
- Advanced Processors Technologies Research Group, School of Computer Science, University of Manchester, Manchester, UK
- Ryad B Benosman
- Equipe de Vision et Calcul Naturel, Centre National de la Recherche Scientifique UMR 7210, UMR S968 Inserm, Vision Institute, CHNO des Quinze-Vingts, Université Pierre et Marie Curie, Paris, France
15
Hazan Y, Kra Y, Yarin I, Wagner H, Gutfreund Y. Visual-auditory integration for visual search: a behavioral study in barn owls. Front Integr Neurosci 2015; 9:11. PMID: 25762905; PMCID: PMC4327738; DOI: 10.3389/fnint.2015.00011.
Abstract
Barn owls are nocturnal predators that rely on both vision and hearing for survival. The optic tectum of barn owls, a midbrain structure involved in selective attention, has been used as a model for studying visual-auditory integration at the neuronal level. However, behavioral data on visual-auditory integration in barn owls are lacking. The goal of this study was to examine whether the integration of visual and auditory signals contributes to the process of guiding attention toward salient stimuli. We attached miniature wireless video cameras (OwlCam) to barn owls' heads to track their target of gaze. We first provide evidence that the area centralis (a retinal area with a maximal density of photoreceptors) is used as a functional fovea in barn owls. Thus, by mapping the projection of the area centralis onto the OwlCam's video frame, it is possible to extract the target of gaze. For the experiment, owls were positioned on a high perch and four food items were scattered in a large arena on the floor. In addition, a hidden loudspeaker was positioned in the arena. The positions of the food items and speaker were changed every session. Video sequences from the OwlCam were saved for offline analysis while the owls spontaneously scanned the room and the food items with abrupt gaze shifts (head saccades). From time to time during the experiment, a brief sound was emitted from the speaker. The fixation points immediately following the sounds were extracted, and the distances between the gaze position and the nearest items and loudspeaker were measured. The head saccades were rarely directed toward the location of the sound source but rather toward salient visual features in the room, such as the door knob or the food items. However, among the food items, the one closest to the loudspeaker had the highest probability of attracting a gaze shift. This result supports the notion that auditory signals are integrated with visual information for the selection of the next visual search target.
Affiliation(s)
- Yael Hazan
- Department of Neuroscience, The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, Technion, Haifa, Israel
- Yonatan Kra
- Department of Neuroscience, The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, Technion, Haifa, Israel
- Inna Yarin
- Department of Neuroscience, The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, Technion, Haifa, Israel
- Hermann Wagner
- Department of Zoology and Animal Physiology, Institute for Biology II, RWTH Aachen University, Aachen, Germany
- Yoram Gutfreund
- Department of Neuroscience, The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, Technion, Haifa, Israel
16
Stitt I, Galindo-Leon E, Pieper F, Hollensteiner KJ, Engler G, Engel AK. Auditory and visual interactions between the superior and inferior colliculi in the ferret. Eur J Neurosci 2015; 41:1311-20. PMID: 25645363; DOI: 10.1111/ejn.12847.
Abstract
The integration of visual and auditory spatial information is important for building an accurate perception of the external world, but the fundamental mechanisms governing such audiovisual interaction have only partially been resolved. The earliest interface between auditory and visual processing pathways is in the midbrain, where the superior (SC) and inferior colliculi (IC) are reciprocally connected in an audiovisual loop. Here, we investigate the mechanisms of audiovisual interaction in the midbrain by recording neural signals from the SC and IC simultaneously in anesthetized ferrets. Visual stimuli reliably produced band-limited phase locking of IC local field potentials (LFPs) in two distinct frequency bands: 6-10 and 15-30 Hz. These visual LFP responses co-localized with robust auditory responses that were characteristic of the IC. Imaginary coherence analysis confirmed that visual responses in the IC were not volume-conducted signals from the neighboring SC. Visual responses in the IC occurred later than those in retinally driven superficial SC layers and earlier than those in deep SC layers that receive indirect visual inputs, suggesting that retinal inputs do not drive visually evoked responses in the IC. In addition, SC and IC recording sites with overlapping visual spatial receptive fields displayed stronger functional connectivity than sites with separate receptive fields, indicating that visual spatial maps are aligned across both midbrain structures. Reciprocal coupling between the IC and SC therefore probably serves the dynamic integration of visual and auditory representations of space.
Affiliation(s)
- Iain Stitt
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, 20246, Hamburg, Germany
- Edgar Galindo-Leon
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, 20246, Hamburg, Germany
- Florian Pieper
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, 20246, Hamburg, Germany
- Karl J Hollensteiner
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, 20246, Hamburg, Germany
- Gerhard Engler
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, 20246, Hamburg, Germany
- Andreas K Engel
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, 20246, Hamburg, Germany
17
Krabichler Q, Vega-Zuniga T, Morales C, Luksch H, Marín GJ. The visual system of a Palaeognathous bird: Visual field, retinal topography and retino-central connections in the Chilean Tinamou (Nothoprocta perdicaria). J Comp Neurol 2014; 523:226-50. DOI: 10.1002/cne.23676.
Affiliation(s)
- Quirin Krabichler
- Chair of Zoology, Technische Universität München, Freising-Weihenstephan, Germany
- Tomas Vega-Zuniga
- Chair of Zoology, Technische Universität München, Freising-Weihenstephan, Germany
- Cristian Morales
- Laboratorio de Neurobiología y Biología del Conocer, Departamento de Biología, Facultad de Ciencias, Universidad de Chile, Santiago de Chile, Chile
- Harald Luksch
- Chair of Zoology, Technische Universität München, Freising-Weihenstephan, Germany
- Gonzalo J. Marín
- Laboratorio de Neurobiología y Biología del Conocer, Departamento de Biología, Facultad de Ciencias, Universidad de Chile, Santiago de Chile, Chile
- Facultad de Medicina, Universidad Finis Terrae, Santiago de Chile, Chile
18
Netser S, Dutta A, Gutfreund Y. Ongoing activity in the optic tectum is correlated on a trial-by-trial basis with the pupil dilation response. J Neurophysiol 2014; 111:918-29. DOI: 10.1152/jn.00527.2013.
Abstract
The selection of the appropriate stimulus to induce an orienting response is a basic task thought to be partly achieved by tectal circuitry. Here we addressed the relationship between neural activity in the optic tectum (OT) and orienting behavioral responses. We recorded multiunit activity in the intermediate/deep layers of the OT of the barn owl simultaneously with pupil dilation responses (PDR, a well-known orienting response common to birds and mammals). A trial-by-trial analysis of the responses revealed that the PDR generally did not correlate with the evoked neural responses but correlated significantly with the rate of ongoing neural activity measured shortly before the stimulus. Following this finding, we characterized ongoing activity in the OT and showed that in the intermediate/deep layers it tends to fluctuate spontaneously, with short periods of high ongoing activity during which the probability of a PDR to an auditory stimulus inside the receptive field is increased. These high-ongoing-activity periods were correlated with an increase in the power of gamma-band local field potential oscillations. Through dual recordings, we showed that the correlation coefficients of ongoing activity decreased as a function of distance between recording sites in the tectal map. Significant correlations were also found between recording sites in the OT and the forebrain entopallium. Our results suggest that an increase in ongoing activity in the OT reflects an internal state during which the coupling between sensory stimulation and behavioral responses increases.
Affiliation(s)
- Shai Netser
- Department of Physiology and Biophysics, The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, Technion, Haifa, Israel
- Arkadeb Dutta
- Department of Physiology and Biophysics, The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, Technion, Haifa, Israel
- Yoram Gutfreund
- Department of Physiology and Biophysics, The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, Technion, Haifa, Israel
19
Kaiser O, Aliuos P, Wissel K, Lenarz T, Werner D, Reuter G, Kral A, Warnecke A. Dissociated neurons and glial cells derived from rat inferior colliculi after digestion with papain. PLoS One 2013; 8:e80490. PMID: 24349001; PMCID: PMC3861243; DOI: 10.1371/journal.pone.0080490.
Abstract
The formation of gliosis around implant electrodes for deep brain stimulation impairs electrode–tissue interaction. Unspecific growth of glial tissue around the electrodes can be hindered by altering physicochemical material properties. However, in vitro screening of neural tissue–material interaction requires an adequate cell culture system. No adequate model for cells dissociated from the inferior colliculus (IC) has been described; establishing one was thus the aim of this study. IC were isolated from neonatal rats (P3-5) and a dissociated cell culture was established. In screening experiments using four dissociation methods (Neural Tissue Dissociation Kit [NTDK] T, NTDK P, NTDK PN, and a validated protocol for the dissociation of spiral ganglion neurons [SGN]), the optimal media and seeding densities were identified. Thereafter, a dissociation protocol containing only the proteolytic enzymes of interest (trypsin or papain) was tested. For analysis, cells were fixed and immunolabeled using glial- and neuron-specific antibodies. Adhesion and survival of dissociated neurons and glial cells isolated from the IC were demonstrated in all experimental settings. However, preservation of type-specific cytoarchitecture with sufficient neuronal networks occurred only in cultures dissociated with NTDK P, NTDK PN, and freshly prepared papain solution. Among these, cultures obtained after dissociation with papain, seeded at a density of 2×10⁴ cells/well, and cultivated with Neuro Medium for 6 days reliably revealed the highest neuronal yield with excellent cytoarchitecture of neurons and glial cells. The dissociated culture described herein can be utilized as an in vitro model to screen interactions between cells of the IC and surface modifications of the electrode.
Affiliation(s)
- Odett Kaiser
- Department of Otolaryngology, Hannover Medical School, Hannover, Germany
- Pooyan Aliuos
- Department of Otolaryngology, Hannover Medical School, Hannover, Germany
- Kirsten Wissel
- Department of Otolaryngology, Hannover Medical School, Hannover, Germany
- Thomas Lenarz
- Department of Otolaryngology, Hannover Medical School, Hannover, Germany
- Darja Werner
- Department of Otolaryngology, Hannover Medical School, Hannover, Germany
- Günter Reuter
- Department of Otolaryngology, Hannover Medical School, Hannover, Germany
- Andrej Kral
- Department of Otolaryngology, Hannover Medical School, Hannover, Germany
- Athanasia Warnecke
- Department of Otolaryngology, Hannover Medical School, Hannover, Germany
20
New perspectives on the owl's map of auditory space. Curr Opin Neurobiol 2013; 24:55-62. PMID: 24492079; DOI: 10.1016/j.conb.2013.08.008.
Abstract
A map of sound direction was found in the owl's midbrain more than three decades ago. This finding suggested that the brain reconstructs spatial coordinates to represent them. Subsequent research elucidated the variables used to compute the map. Here we provide a review of the processes leading to its emergence and an updated perspective on how and what information is represented.
21
Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization. PLoS One 2013; 8:e72562. PMID: 24009691; PMCID: PMC3757015; DOI: 10.1371/journal.pone.0072562.
Abstract
A general problem in learning is how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound. This is an intrinsically circular problem when sound location is itself uncertain and the visual scene is rife with possible visual matches. Here, we develop a simple paradigm using visual guidance of sound localization to gain insight into how the brain confronts this type of circularity. We tested two competing hypotheses. 1: The brain guides sound location learning based on the synchrony or simultaneity of auditory-visual stimuli, potentially involving a Hebbian associative mechanism. 2: The brain uses a ‘guess and check’ heuristic in which visual feedback that is obtained after an eye movement to a sound alters future performance, perhaps by recruiting the brain’s reward-related circuitry. We assessed the effects of exposure to visual stimuli spatially mismatched from sounds on performance of an interleaved auditory-only saccade task. We found that when humans and monkeys were provided the visual stimulus asynchronously with the sound but as feedback to an auditory-guided saccade, they shifted their subsequent auditory-only performance toward the direction of the visual cue by 1.3–1.7 degrees, or 22–28% of the original 6 degree visual-auditory mismatch. In contrast when the visual stimulus was presented synchronously with the sound but extinguished too quickly to provide this feedback, there was little change in subsequent auditory-only performance. Our results suggest that the outcome of our own actions is vital to localizing sounds correctly. Contrary to previous expectations, visual calibration of auditory space does not appear to require visual-auditory associations based on synchrony/simultaneity.
22
Cang J, Feldheim DA. Developmental mechanisms of topographic map formation and alignment. Annu Rev Neurosci 2013; 36:51-77. PMID: 23642132; DOI: 10.1146/annurev-neuro-062012-170341.
Abstract
Brain connections are organized into topographic maps that are precisely aligned both within and across modalities. This alignment facilitates coherent integration of different categories of sensory inputs and allows for proper sensorimotor transformations. Topographic maps are established and aligned by multistep processes during development, including interactions of molecular guidance cues expressed in gradients; spontaneous activity-dependent axonal and dendritic remodeling; and sensory-evoked plasticity driven by experience. By focusing on the superior colliculus, a major site of topographic map alignment for different sensory modalities, this review summarizes current understanding of topographic map development in the mammalian visual system and highlights recent advances in map alignment studies. A major goal looking forward is to reveal the molecular and synaptic mechanisms underlying map alignment and to understand the physiological and behavioral consequences when these mechanisms are disrupted at various scales.
Affiliation(s)
- Jianhua Cang
- Department of Neurobiology, Northwestern University, Evanston, IL 60208, USA
23
Patel M, Reed M. Stimulus encoding within the barn owl optic tectum using gamma oscillations vs. spike rate: a modeling approach. Network (Bristol, England) 2013; 24:52-74. PMID: 23406211; DOI: 10.3109/0954898x.2013.763405.
Abstract
The optic tectum of the barn owl is a multimodal structure with multiple layers, with each layer topographically organized according to spatial receptive field. The response of a site to a stimulus can be measured as either spike rate or local field potential (LFP) gamma (25-90 Hz) power; within superficial layers, spike rate and gamma power spatial tuning curves are narrow and contrast-response functions rise slowly. Within deeper layers, however, spike rate tuning curves broaden and gamma power contrast-response functions sharpen. In this work, we employ a computational model to describe the inputs required to generate these transformations from superficial to deep layers and show that gamma power and spike rate can act as parallel information processing streams.
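One of the two response measures compared above, LFP gamma (25-90 Hz) power, can be estimated from a simple periodogram. The sketch below is a generic band-power estimator with assumed signal names and a made-up test signal, not the authors' model or analysis pipeline.

```python
import numpy as np

def gamma_power(lfp, fs, lo=25.0, hi=90.0):
    """Summed periodogram power of an LFP trace in the 25-90 Hz
    gamma band referenced in the abstract (generic estimator)."""
    freqs = np.fft.rfftfreq(len(lfp), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(lfp)) ** 2 / len(lfp)   # one-sided periodogram
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].sum()

fs = 1000.0                                  # 1 kHz sampling, 1 s of data
t = np.arange(0, 1.0, 1.0 / fs)
in_band = np.sin(2 * np.pi * 60.0 * t)       # 60 Hz tone: inside gamma
out_band = np.sin(2 * np.pi * 5.0 * t)       # 5 Hz tone: outside gamma
print(gamma_power(in_band, fs) > 100 * gamma_power(out_band, fs))  # True
```

In practice a tapered or multitaper estimate would replace the raw periodogram, but the band-masking step is the same.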
Affiliation(s)
- Mainak Patel
- Department of Mathematics, Duke University, Durham, NC 27708, USA
24
Gruters KG, Groh JM. Sounds and beyond: multisensory and other non-auditory signals in the inferior colliculus. Front Neural Circuits 2012; 6:96. PMID: 23248584; PMCID: PMC3518932; DOI: 10.3389/fncir.2012.00096.
Abstract
The inferior colliculus (IC) is a major processing center situated mid-way along both the ascending and descending auditory pathways of the brain stem. Although it is fundamentally an auditory area, the IC also receives anatomical input from non-auditory sources. Neurophysiological studies corroborate that non-auditory stimuli can modulate auditory processing in the IC and even elicit responses independent of coincident auditory stimulation. In this article, we review anatomical and physiological evidence for multisensory and other non-auditory processing in the IC. Specifically, the contributions of signals related to vision, eye movements and position, somatosensation, and behavioral context to neural activity in the IC will be described. These signals are potentially important for localizing sound sources, attending to salient stimuli, distinguishing environmental from self-generated sounds, and perceiving and generating communication sounds. They suggest that the IC should be thought of as a node in a highly interconnected sensory, motor, and cognitive network dedicated to synthesizing a higher-order auditory percept rather than simply reporting patterns of air pressure detected by the cochlea. We highlight some of the potential pitfalls that can arise from experimental manipulations that may disrupt the normal function of this network, such as the use of anesthesia or the severing of connections from cortical structures that project to the IC. Finally, we note that the presence of these signals in the IC has implications for our understanding not just of the IC but also of the multitude of other regions within and beyond the auditory system that are dependent on signals that pass through the IC. Whatever the IC “hears” would seem to be passed both “upward” to thalamus and thence to auditory cortex and beyond, as well as “downward” via centrifugal connections to earlier areas of the auditory pathway such as the cochlear nucleus.
Affiliation(s)
- Kurtis G Gruters
- Department of Psychology and Neuroscience, Duke University, Durham, NC, USA
25
Bulkin DA, Groh JM. Distribution of visual and saccade related information in the monkey inferior colliculus. Front Neural Circuits 2012; 6:61. PMID: 22973196; PMCID: PMC3433683; DOI: 10.3389/fncir.2012.00061.
Abstract
The inferior colliculus (IC) is an essential stop early in the ascending auditory pathway. Though normally thought of as a predominantly auditory structure, recent work has uncovered a variety of non-auditory influences on firing rate in the IC. Here, we map the location within the IC of neurons that respond to the onset of a fixation-guiding visual stimulus. Visual/visuomotor-associated activity was found throughout the IC (overall, 84 of 199 sites tested, or 42%), but with a much lower prevalence and strength along recording penetrations passing through the tonotopically organized region of the IC, putatively the central nucleus (11 of 42 sites tested, or 26%). These results suggest that visual information has only a weak effect on early auditory processing in core regions, but more strongly targets the modulatory shell regions of the IC.
Affiliation(s)
- David A Bulkin
- Department of Psychology, Cornell University, Ithaca, NY, USA
26
Huo J, Murray A, Wei D. Adaptive visual and auditory map alignment in barn owl superior colliculus and its neuromorphic implementation. IEEE Trans Neural Netw Learn Syst 2012; 23:1486-1497. PMID: 24807931; DOI: 10.1109/tnnls.2012.2204771.
Abstract
Adaptation is one of the most important phenomena in biology. A young barn owl can adapt to imposed environmental changes, such as artificial visual distortion caused by wearing a prism. This adjustment process has been modeled mathematically, and the model replicates the sensory map realignment of the barn owl superior colliculus (SC) through axonogenesis and synaptogenesis. This allows the biological mechanism to be transferred to an artificial computing system, thereby imbuing it with a new form of adaptability to the environment. The model is demonstrated in a real-time robot environment. Results of experiments with and without prism distortion of vision are compared and show improved adaptability for the robot. However, the computation speed of the embedded system in the robot is slow. A digital and analog mixed-signal very-large-scale integration (VLSI) circuit has therefore been fabricated to implement adaptive sensory pathway changes derived from the SC model at higher speed. VLSI experimental results are consistent with simulation results.
27
Singheiser M, Gutfreund Y, Wagner H. The representation of sound localization cues in the barn owl's inferior colliculus. Front Neural Circuits 2012; 6:45. PMID: 22798945; PMCID: PMC3394089; DOI: 10.3389/fncir.2012.00045.
Abstract
The barn owl is a well-known model system for studying auditory processing and sound localization. This article reviews the morphological and functional organization, as well as the role of the underlying microcircuits, of the barn owl's inferior colliculus (IC). We focus on the processing of frequency and of interaural time differences (ITD) and interaural level differences (ILD). We first summarize the morphology of the sub-nuclei belonging to the IC and their differentiation by antero- and retrograde labeling and by staining with various antibodies. We then focus on the response properties of neurons in the three major sub-nuclei of the IC [core of the central nucleus of the IC (ICCc), lateral shell of the central nucleus of the IC (ICCls), and the external nucleus of the IC (ICX)]. ICCc projects to ICCls, which in turn sends its information to ICX. The responses of neurons in ICCc are sensitive to changes in ITD but not to changes in ILD. The distribution of ITD sensitivity with frequency in ICCc can only partly be explained by optimal coding. We continue with the tuning properties of ICCls neurons, the first station in the midbrain where the ITD and ILD pathways merge after they have split at the level of the cochlear nucleus. The ICCc and ICCls share similar ITD and frequency tuning. By contrast, ICCls shows sigmoidal ILD tuning, which is absent in ICCc. Both ICCc and ICCls project to the forebrain, and ICCls also projects to ICX, where space-specific neurons are found. Space-specific neurons exhibit side peak suppression in ITD tuning, bell-shaped ILD tuning, and are broadly tuned to frequency. These neurons respond only to restricted positions of auditory space and form a map of two-dimensional auditory space. Finally, we briefly review major IC features, including multiplication-like computations, correlates of echo suppression, plasticity, and adaptation.
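The ITD cue that ICCc neurons encode can be illustrated with the textbook cross-correlation estimator: the lag maximizing the correlation between the two ear signals gives the interaural delay. This is a generic sketch with assumed signal names and sampling rate, not the owl's neural computation or the authors' methods.

```python
import numpy as np

def estimate_itd(left, right, fs):
    """Estimate the interaural time difference as the lag (in seconds)
    that maximizes the cross-correlation of the two ear signals.
    Positive values mean the sound reached the left ear first."""
    xcorr = np.correlate(right, left, mode="full")
    lag = int(np.argmax(xcorr)) - (len(left) - 1)  # samples of delay
    return lag / fs

rng = np.random.default_rng(0)
sig = rng.standard_normal(4000)
fs = 200_000.0                               # 200 kHz -> 5 us per sample
left = sig
right = np.concatenate(([0.0], sig[:-1]))    # right ear delayed one sample
itd = estimate_itd(left, right, fs)
print(itd)  # 5e-06, i.e. the 5 microsecond ITD resolvable by the owl
```

At a 200 kHz sampling rate, one sample of lag corresponds to the 5 μs scale of barn owl ITD sensitivity; finer resolution would require interpolating around the correlation peak.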
28
Interactive Coding of Visual Spatial Frequency and Auditory Amplitude-Modulation Rate. Curr Biol 2012; 22:383-8. DOI: 10.1016/j.cub.2012.01.004.
29
Abstract
Habituation is the most basic form of learning, yet many gaps remain in our understanding of its underlying neural mechanisms. We demonstrate that in the owl's optic tectum (OT), a single, low-level, relatively short auditory stimulus is sufficient to induce a significant reduction in the neural response to a stimulus presented up to 60 s later. This type of neural adaptation was absent in neurons from the central nucleus of the inferior colliculus and from the auditory thalamus; however, it was apparent in the OT and the forebrain entopallium. By presenting sequences that alternate between two different auditory stimuli, we show that this long-lasting adaptation is stimulus specific. The response to an odd stimulus in the sequence was not smaller than the response to the same stimulus when it was first in the sequence. Finally, we measured the habituation of reflexive eye movements and show that the behavioral habituation is correlated with the neural adaptation. The finding of a long-lasting specific adaptation in areas related to the gaze control system and not elsewhere suggests its involvement in habituation processes and opens new directions for research on mechanisms of habituation.
30
Kuwada S, Bishop B, Alex C, Condit DW, Kim DO. Spatial tuning to sound-source azimuth in the inferior colliculus of unanesthetized rabbit. J Neurophysiol 2011; 106:2698-708. PMID: 21849611; PMCID: PMC3214120; DOI: 10.1152/jn.00532.2011.
Abstract
Despite decades of research devoted to the study of inferior colliculus (IC) neurons' tuning to sound-source azimuth, there remain many unanswered questions because no previous study has examined azimuth tuning over a full range of 360° azimuths at a wide range of stimulus levels in an unanesthetized preparation. Furthermore, a comparison of azimuth tuning to binaural and contralateral ear stimulation over ranges of full azimuths and widely varying stimulus levels has not previously been reported. To fill this void, we have conducted a study of azimuth tuning in the IC of the unanesthetized rabbit over a 300° range of azimuths at stimulus levels of 10-50 dB above neural threshold to both binaural and contralateral ear stimulation using virtual auditory space stimuli. This study provides systematic evidence for neural coding of azimuth. We found the following: 1) level-tolerant azimuth tuning was observed in the top 35% of IC neurons ranked by vector strength and in the top 15% ranked by vector angle; 2) preserved azimuth tuning to binaural stimulation at high stimulus levels was created as a consequence of binaural facilitation in the contralateral sound field and binaural suppression in the ipsilateral sound field; 3) the direction of azimuth tuning to binaural stimulation was primarily in the contralateral sound field, and its center shifted laterally toward -90° with increasing stimulus level; 4) at 10 dB, azimuth tuning to binaural and contralateral stimulation was similar, indicating that it was mediated by monaural mechanisms; and 5) at higher stimulus levels, azimuth tuning to contralateral ear stimulation was severely degraded. These findings form a foundation for understanding neural mechanisms of localizing sound-source azimuth.
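The vector strength and vector angle used to rank azimuth tuning above can be computed with the standard circular-statistics resultant vector. The sketch below is a generic formulation with invented example tuning curves; the study's exact computation may differ.

```python
import numpy as np

def azimuth_vector(rates, azimuths_deg):
    """Rate-weighted resultant vector of an azimuth tuning curve:
    returns (vector strength, vector angle in degrees). Strength is
    0 for flat tuning and approaches 1 when all response is
    concentrated at one azimuth; the angle is the preferred direction."""
    theta = np.deg2rad(np.asarray(azimuths_deg, dtype=float))
    r = np.asarray(rates, dtype=float)
    z = np.sum(r * np.exp(1j * theta)) / np.sum(r)   # mean resultant
    return float(np.abs(z)), float(np.rad2deg(np.angle(z)))

az = np.arange(-180, 180, 30)                    # 12 azimuths, 30 deg steps
flat = np.ones(az.size)                          # untuned site
peaked = np.where(az == -90, 10.0, 0.1)          # sharply tuned at -90 deg
vs_flat, _ = azimuth_vector(flat, az)
vs_peak, angle = azimuth_vector(peaked, az)
print(round(vs_flat, 3), round(vs_peak, 3), round(angle, 1))  # 0.0 0.892 -90.0
```

Ranking sites by this strength value separates level-tolerant, sharply tuned responses from broad or flat ones, which is the spirit of the top-35%/top-15% criteria in the findings.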
Affiliation(s)
- Shigeyuki Kuwada
- Department of Neuroscience, University of Connecticut Health Center, Farmington, CT 06030, USA.
31
Bulkin DA, Groh JM. Distribution of eye position information in the monkey inferior colliculus. J Neurophysiol 2011; 107:785-95. [PMID: 22031775 DOI: 10.1152/jn.00662.2011] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
The inferior colliculus (IC) is thought to have two main subdivisions, a central region that forms an important stop on the ascending auditory pathway and a surrounding shell region that may play a more modulatory role. In this study, we investigated whether eye position affects activity in both the central and shell regions. Accordingly, we mapped the location of eye position-sensitive neurons in six monkeys making spontaneous eye movements by sampling multiunit activity at regularly spaced intervals throughout the IC. We used a functional map based on auditory response patterns to estimate the anatomical location of recordings, in conjunction with structural MRI and histology. We found eye position-sensitive sites throughout the IC, including at 27% of sites in tonotopically organized recording penetrations (putatively the central nucleus). Recordings from surrounding tissue showed a larger proportion of sites indicating an influence of eye position (33-43%). When present, the magnitude of the change in activity due to eye position was often comparable to that seen for sound frequency. Our results indicate that the primary ascending auditory pathway is influenced by the position of the eyes. Because eye position is essential for visual-auditory integration, our findings suggest that computations underlying visual-auditory integration begin early in the ascending auditory pathway.
Affiliation(s)
- David A Bulkin
- Department of Psychology, Cornell University, Ithaca, New York, USA.
32
Multisensory perceptual learning reshapes both fast and slow mechanisms of crossmodal processing. COGNITIVE AFFECTIVE & BEHAVIORAL NEUROSCIENCE 2011; 11:1-12. [PMID: 21264643 DOI: 10.3758/s13415-010-0006-x] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
Previous research has shown that sounds facilitate perception of visual patterns appearing immediately after the sound but impair perception of patterns appearing after some delay. Here we examined the spatial gradient of the fast crossmodal facilitation effect and the slow inhibition effect in order to test whether they reflect separate mechanisms. We found that crossmodal facilitation is only observed at visual field locations overlapping with the sound, whereas crossmodal inhibition affects the whole hemifield. Furthermore, we tested whether multisensory perceptual learning with misaligned audio-visual stimuli reshapes crossmodal facilitation and inhibition. We found that training shifts crossmodal facilitation towards the trained location without changing its range. By contrast, training narrows the range of inhibition without shifting its position. Our results suggest that crossmodal facilitation and inhibition reflect separate mechanisms that can both be reshaped by multisensory experience even in adult humans. Multisensory links seem to be more plastic than previously thought.
33
McNally GP, Johansen JP, Blair HT. Placing prediction into the fear circuit. Trends Neurosci 2011; 34:283-92. [PMID: 21549434 DOI: 10.1016/j.tins.2011.03.005] [Citation(s) in RCA: 193] [Impact Index Per Article: 14.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/26/2011] [Revised: 03/28/2011] [Accepted: 03/29/2011] [Indexed: 10/18/2022]
Abstract
Pavlovian fear conditioning depends on synaptic plasticity at amygdala neurons. Here, we review recent electrophysiological, molecular and behavioral evidence suggesting the existence of a distributed neural circuitry regulating amygdala synaptic plasticity during fear learning. This circuitry, which involves projections from the midbrain periaqueductal gray region, can be linked to prediction error and expectation modulation of fear learning, as described by associative and computational learning models. It controls whether, and how much, fear learning occurs by signaling aversive events when they are unexpected. Functional neuroimaging and clinical studies indicate that this prediction circuit is recruited in humans during fear learning and contributes to exposure-based treatments for clinical anxiety. This aversive prediction error circuit might represent a conserved mechanism for regulating fear learning in mammals.
Affiliation(s)
- Gavan P McNally
- School of Psychology, The University of New South Wales, Sydney, NSW, Australia.
34
Abstract
The human brain has accumulated many useful building blocks over its evolutionary history, and the best knowledge of these has often derived from experiments performed in animal species that display finely honed abilities. In this article we review a model system at the forefront of investigation into the neural bases of information processing, plasticity, and learning: the barn owl auditory localization pathway. In addition to the broadly applicable principles gleaned from three decades of work in this system, there are good reasons to believe that continued exploration of the owl brain will be invaluable for further advances in understanding of how neuronal networks give rise to behavior.
Affiliation(s)
- Jose L Pena
- Dominick P. Purpura Department of Neuroscience, Albert Einstein College of Medicine, Bronx, New York, USA
35
Lomber SG, Meredith MA, Kral A. Adaptive crossmodal plasticity in deaf auditory cortex. PROGRESS IN BRAIN RESEARCH 2011; 191:251-70. [DOI: 10.1016/b978-0-444-53752-2.00001-1] [Citation(s) in RCA: 31] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/14/2022]
36
Bürck M, Friedel P, Sichert AB, Vossen C, van Hemmen JL. Optimality in mono- and multisensory map formation. BIOLOGICAL CYBERNETICS 2010; 103:1-20. [PMID: 20502911 DOI: 10.1007/s00422-010-0393-7] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/05/2010] [Accepted: 04/10/2010] [Indexed: 05/29/2023]
Abstract
In the struggle for survival in a complex and dynamic environment, nature has developed a multitude of sophisticated sensory systems. In order to exploit the information provided by these sensory systems, higher vertebrates reconstruct the spatio-temporal environment from each of the sensory systems they have at their disposal. That is, for each modality the animal computes a neuronal representation of the outside world, a monosensory neuronal map. Here we present a universal framework that allows one to calculate the specific layout of the involved neuronal network by means of a general mathematical principle, viz., stochastic optimality. To illustrate the use of this theoretical framework, we provide a step-by-step tutorial of how to apply our model. In so doing, we present a spatial and a temporal example of optimal stimulus reconstruction that underline the advantages of our approach. That is, given known physical signal transmission and rudimentary knowledge of the detection process, our approach allows one to estimate the possible performance and to predict neuronal properties of biological sensory systems. Finally, information from different sensory modalities has to be integrated so as to gain a unified perception of reality for further processing, e.g., for distinct motor commands. We briefly discuss concepts of multimodal interaction and how a multimodal space can evolve by alignment of monosensory maps.
Affiliation(s)
- Moritz Bürck
- Technical University of Munich, Munich, Germany.
37
Netser S, Ohayon S, Gutfreund Y. Multiple Manifestations of Microstimulation in the Optic Tectum: Eye Movements, Pupil Dilations, and Sensory Priming. J Neurophysiol 2010; 104:108-18. [DOI: 10.1152/jn.01142.2009] [Citation(s) in RCA: 43] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
It is well established that the optic tectum (or its mammalian homologue, the superior colliculus) is involved in directing gaze toward salient stimuli. However, salient stimuli typically induce orienting responses beyond gaze shifts. The role of the optic tectum in generating responses such as pupil dilation, galvanic responses, or covert shifts is not clear. In the present work, we studied the effects of microstimulation in the optic tectum of the barn owl (Tyto alba) on pupil diameter and on eye shifts. Experiments were conducted in lightly anesthetized head-restrained barn owls. We report that low-level microstimulation in the deep layers of the optic tectum readily induced pupil dilation responses (PDRs), as well as small eye movements. Electrically evoked PDRs, similar to acoustically evoked PDRs, were long-lasting and habituated to repeated stimuli. We further show that microstimulation in the external nucleus of the inferior colliculus also induced PDRs. Finally, in experiments in which tectal microstimulations were coupled with acoustic stimuli, we show a tendency of the microstimulation to enhance pupil responses and eye shifts to previously habituated acoustic stimuli. The enhancement was dependent on the site of stimulation in the tectal spatial map; responses to sounds with spatial cues that matched the site of stimulation were more enhanced compared with sounds with spatial cues that did not match. These results suggest that the optic tectum is directly involved in autonomic orienting reflexes as well as in gaze shifts, highlighting the central role of the optic tectum in mediating the body responses to salient stimuli.
Affiliation(s)
- Shai Netser
- The Department of Physiology and Biophysics, The Rappaport Faculty of Medicine and Research Institute, The Technion–Israel Institute of Technology, Haifa, Israel
- Shay Ohayon
- Computation and Neural Systems, California Institute of Technology, Pasadena, California
- Yoram Gutfreund
- The Department of Physiology and Biophysics, The Rappaport Faculty of Medicine and Research Institute, The Technion–Israel Institute of Technology, Haifa, Israel
38
Perceptron learning rule derived from spike-frequency adaptation and spike-time-dependent plasticity. Proc Natl Acad Sci U S A 2010; 107:4722-7. [PMID: 20167805 DOI: 10.1073/pnas.0909394107] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
It is widely believed that sensory and motor processing in the brain is based on simple computational primitives rooted in cellular and synaptic physiology. However, many gaps remain in our understanding of the connections between neural computations and biophysical properties of neurons. Here, we show that synaptic spike-time-dependent plasticity (STDP) combined with spike-frequency adaptation (SFA) in a single neuron together approximate the well-known perceptron learning rule. Our calculations and integrate-and-fire simulations reveal that delayed inputs to a neuron endowed with STDP and SFA precisely instruct neural responses to earlier arriving inputs. We demonstrate this mechanism on a developmental example of auditory map formation guided by visual inputs, as observed in the external nucleus of the inferior colliculus (ICX) of barn owls. The interplay of SFA and STDP in model ICX neurons precisely transfers the tuning curve from the visual modality onto the auditory modality, demonstrating a useful computation for multimodal and sensory-guided processing.
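The rule this abstract says STDP plus SFA approximate is the classic perceptron update, w += lr * (target - prediction) * x. A minimal sketch of that textbook rule on a toy problem (this is the standard algorithm, not the paper's integrate-and-fire model; all names here are illustrative):

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=50):
    """Classic perceptron rule: weights change only on errors,
    in proportion to (target - prediction) times the input."""
    w = np.zeros(X.shape[1] + 1)                 # last entry is the bias
    Xb = np.hstack([X, np.ones((len(X), 1))])    # append bias input of 1
    for _ in range(epochs):
        for x, t in zip(Xb, y):
            pred = 1 if x @ w > 0 else 0
            w += lr * (t - pred) * x             # error-driven update
    return w

# Linearly separable toy problem (logical OR):
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 1])
w = train_perceptron(X, y)
preds = [1 if np.append(x, 1) @ w > 0 else 0 for x in X]
```

Because OR is linearly separable, the perceptron convergence theorem guarantees the loop reaches zero errors, so `preds` reproduces `y` exactly.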
39
Abstract
The neural mechanisms underlying attentional selection of competing neural signals for awareness remain an unresolved issue. We studied attentional selection, using perceptually ambiguous stimuli in a novel multisensory paradigm that combined competing auditory and competing visual stimuli. We demonstrate that the ability to select, and attentively hold, one of the competing alternatives in either sensory modality is greatly enhanced when there is a matching cross-modal stimulus. Intriguingly, this multimodal enhancement of attentional selection seems to require a conscious act of attention, as passively experiencing the multisensory stimuli did not enhance control over the stimulus. We also demonstrate that congruent auditory or tactile information, and combined auditory-tactile information, aids attentional control over competing visual stimuli and vice versa. Our data suggest a functional role for recently found neurons that combine voluntarily initiated attentional functions across sensory modalities. We argue that these units provide a mechanism for structuring multisensory inputs that are then used to selectively modulate early (unimodal) cortical processing, boosting the gain of task-relevant features for willful control over perceptual awareness.
40
Bergan JF, Knudsen EI. Visual modulation of auditory responses in the owl inferior colliculus. J Neurophysiol 2009; 101:2924-33. [PMID: 19321633 DOI: 10.1152/jn.91313.2008] [Citation(s) in RCA: 26] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
The barn owl's central auditory system creates a map of auditory space in the external nucleus of the inferior colliculus (ICX). Although the crucial role visual experience plays in the formation and maintenance of this auditory space map is well established, the mechanism by which vision influences ICX responses remains unclear. Surprisingly, previous experiments have found that in the absence of extensive pharmacological manipulation, visual stimuli do not drive neural responses in the ICX. Here we investigated the influence of dynamic visual stimuli on auditory responses in the ICX. We show that a salient visual stimulus, when coincident with an auditory stimulus, can modulate auditory responses in the ICX even though the same visual stimulus may elicit no neural responses when presented alone. For each ICX neuron, the most effective auditory and visual stimuli were located in the same region of space. In addition, the magnitude of the visual modulation of auditory responses was dependent on the context of the stimulus presentation with novel visual stimuli eliciting consistently larger response modulations than frequently presented visual stimuli. Thus the visual modulation of ICX responses is dependent on the characteristics of the visual stimulus as well as on the spatial and temporal correspondence of the auditory and visual stimuli. These results demonstrate moment-to-moment visual enhancements of auditory responsiveness that, in the short-term, increase auditory responses to salient bimodal stimuli and in the long-term could serve to instruct the adaptive auditory plasticity necessary to maintain accurate auditory orienting behavior.
Affiliation(s)
- Joseph F Bergan
- Department of Neurobiology, Stanford University, Stanford, California 94305, USA
41
Zahar Y, Reches A, Gutfreund Y. Multisensory enhancement in the optic tectum of the barn owl: spike count and spike timing. J Neurophysiol 2009; 101:2380-94. [PMID: 19261710 DOI: 10.1152/jn.91193.2008] [Citation(s) in RCA: 43] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Temporal and spatial correlations between auditory and visual stimuli facilitate the perception of unitary events and improve behavioral responses. However, it is not clear how combined visual and auditory information is processed in single neurons. Here we studied responses of multisensory neurons in the barn owl's optic tectum (the avian homologue of the superior colliculus) to visual, auditory, and bimodal stimuli. We specifically focused on responses to sequences of repeated stimuli. We first report that bimodal stimulation tends to elicit more spikes than in the responses to its unimodal components (a phenomenon known as multisensory enhancement). However, this tendency was found to be history-dependent; multisensory enhancement was mostly apparent in the first stimulus of the sequence and to a much lesser extent in the subsequent stimuli. Next, a vector-strength analysis was applied to quantify the phase locking of the responses to the stimuli. We report that in a substantial number of multisensory neurons responses to sequences of bimodal stimuli elicited spike trains that were better phase locked to the stimulus than spike trains elicited by stimulating with the unimodal counterparts (visual or auditory). We conclude that multisensory enhancement can be manifested in better phase locking to the stimulus as well as in more spikes.
Affiliation(s)
- Yael Zahar
- Department of Physiology and Biophysics, The Bruce Rappaport Medical School, The Technion, Haifa 31096, Israel
42
Huo J, Murray A. The adaptation of visual and auditory integration in the barn owl superior colliculus with Spike Timing Dependent Plasticity. Neural Netw 2008; 22:913-21. [PMID: 19084371 DOI: 10.1016/j.neunet.2008.10.007] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/27/2008] [Revised: 07/20/2008] [Accepted: 10/29/2008] [Indexed: 11/24/2022]
Abstract
To localize a seen object, the superior colliculus of the barn owl integrates visual and auditory localization cues received from the brain's sensory systems. These cues are organized as visual and auditory maps, and the alignment between the maps is essential for accurate localization during prey-capture behavior. Blindness or prism wearing can disrupt this alignment, and the juvenile barn owl can adapt its auditory map to such a mismatch after several weeks of training. Here we investigate this process by building a computational model of auditory and visual integration in the deep superior colliculus (SC). Adaptation of the map alignment is based on activity-dependent axon development in the inferior colliculus (IC). This axon growth process is instructed by an inhibitory network in the SC, and the strength of the inhibition is adjusted by spike-timing-dependent plasticity (STDP). The simulation results of this model are in line with biological experiments and support the idea that STDP is involved in the alignment of sensory maps. The model also provides a new spiking-neuron-based mechanism capable of eliminating the disparity in visual and auditory map integration.
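The STDP ingredient in models like this one is conventionally written as an exponential update window over the pre/post spike-time difference. A minimal sketch of that standard window (the parameter values are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def stdp_dw(delta_t, a_plus=0.05, a_minus=0.055, tau=20.0):
    """Standard exponential STDP window.

    delta_t = t_post - t_pre, in ms. Pre-before-post (delta_t >= 0)
    potentiates the synapse; post-before-pre depresses it. Amplitudes
    a_plus/a_minus and time constant tau are illustrative values.
    """
    delta_t = np.asarray(delta_t, dtype=float)
    return np.where(delta_t >= 0,
                    a_plus * np.exp(-delta_t / tau),
                    -a_minus * np.exp(delta_t / tau))

# Pre leading post by 10 ms strengthens; the reverse weakens:
ltp = stdp_dw(10.0)
ltd = stdp_dw(-10.0)
```

Making `a_minus` slightly larger than `a_plus`, as here, is a common choice that keeps uncorrelated spiking depressing on average, so only consistently causal pre-post pairings strengthen a connection.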
Affiliation(s)
- Juan Huo
- Doctoral Training Center, School of Informatics, The University of Edinburgh, Mayfield Road, Edinburgh, UK.
43
Bidirectional regulation of the cAMP response element binding protein encodes spatial map alignment in prism-adapting barn owls. J Neurosci 2008; 28:9898-909. [PMID: 18829948 DOI: 10.1523/jneurosci.1385-08.2008] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
The barn owl midbrain contains mutually aligned maps of auditory and visual space. Throughout life, map alignment is maintained through the actions of an instructive signal that encodes the magnitude of auditory-visual mismatch. The intracellular signaling pathways activated by this signal are unknown. Here we tested the hypothesis that CREB (cAMP response element-binding protein) provides a cell-specific readout of instructive information. Owls were fitted with prismatic or control spectacles and provided rich auditory-visual experience: hunting live mice. CREB activation was analyzed within 30 min of hunting using phosphorylation state-specific CREB (pCREB) and CREB antibodies, confocal imaging, and immunofluorescence measurements at individual cell nuclei. In control owls or prism-adapted owls, which experience small instructive signals, the frequency distributions of pCREB/CREB values obtained for cell nuclei within the external nucleus of the inferior colliculus (ICX) were unimodal. In contrast, in owls adapting to prisms or readapting to normal conditions, the distributions were bimodal: certain cells had received a signal that positively regulated CREB and, by extension, transcription of CREB-dependent genes, whereas others received a signal that negatively regulated it. These changes were restricted to the subregion of the inferior colliculus that received optically displaced input, the rostral ICX, and were not evident in the caudal ICX or central nucleus. Finally, the topographic pattern of CREB regulation was patchy, not continuous, as expected from the actions of a topographically precise signal encoding discrete events. These results support a model in which the magnitude of CREB activation within individual cells provides a readout of the instructive signal that guides plasticity and learning.
44
Friedel P, van Hemmen JL. Inhibition, not excitation, is the key to multimodal sensory integration. BIOLOGICAL CYBERNETICS 2008; 98:597-618. [PMID: 18491169 DOI: 10.1007/s00422-008-0236-y] [Citation(s) in RCA: 11] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/18/2008] [Accepted: 04/28/2008] [Indexed: 05/26/2023]
Abstract
Multimodal neuronal maps, combining input from two or more sensory systems, play a key role in the processing of sensory and motor information. For such maps to be of any use, the input from all participating modalities must be calibrated so that a stimulus at a specific spatial location is represented at an unambiguous position in the multimodal map. Here we discuss two methods based on supervised spike-timing-dependent plasticity (STDP) to gauge input from different sensory modalities so as to ensure a proper map alignment. The first uses an excitatory teacher input. It is therefore called excitation-mediated learning. The second method is based on an inhibitory teacher signal, as found in the barn owl, and is called inhibition-mediated learning. Using detailed analytical calculations and numerical simulations, we demonstrate that inhibitory teacher input is essential if high-quality multimodal integration is to be learned rapidly. Furthermore, we show that the quality of the resulting map is not so much limited by the quality of the teacher signal but rather by the accuracy of the input from other sensory modalities.
Affiliation(s)
- Paul Friedel
- Physik Department T35, Technische Universität München, 85748, Garching bei München, Germany.
45
Swofford JA, DeBello WM. Transcriptome changes associated with instructed learning in the barn owl auditory localization pathway. Dev Neurobiol 2007; 67:1457-77. [PMID: 17526003 DOI: 10.1002/dneu.20458] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
Owls reared wearing prismatic spectacles learn to make adaptive orienting movements. This instructed learning depends on re-calibration of the midbrain auditory space map, which in turn involves the formation of new synapses. Here we investigated whether these processes are associated with differential gene expression, using longSAGE. Newly fledged owls were reared for 8-36 days with prism or control lenses, at which time the extent of learning was quantified by electrophysiological mapping. Transcriptome profiles were obtained from the inferior colliculus (IC), the major site of synaptic plasticity, and the optic tectum (OT), which provides an instructive signal that controls the direction and extent of plasticity. Twenty-two differentially expressed sequence tags were identified in IC and 36 in OT, out of more than 35,000 unique tags. Of these, only four were regulated in both structures. These results indicate that regulation of two largely independent gene clusters is associated with synaptic remodeling (in IC) and generation of the instructive signal (in OT). Real-time PCR data confirmed the changes for two transcripts, ubiquitin/polyubiquitin and tyrosine 3-monooxygenase/tryptophan 5-monooxygenase activation protein, theta subunit (YWHAQ; also referred to as 14-3-3 protein). Ubiquitin was downregulated in IC, consistent with a model in which protein degradation pathways act as an inhibitory constraint on synaptogenesis. YWHAQ was up-regulated in OT, indicating a role in the synthesis or delivery of instructive information. In total, our results provide a path towards unraveling molecular cascades that link naturalistic experience with synaptic remodeling and, ultimately, with the expression of learned behavior.
Affiliation(s)
- Janet A Swofford
- Department of Neurobiology, Physiology, and Behavior, Center for Neuroscience, University of California-Davis, Davis, CA 95616, USA
46
Porter KK, Metzger RR, Groh JM. Visual- and saccade-related signals in the primate inferior colliculus. Proc Natl Acad Sci U S A 2007; 104:17855-60. [PMID: 17978183 PMCID: PMC2077072 DOI: 10.1073/pnas.0706249104] [Citation(s) in RCA: 47] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/03/2007] [Indexed: 11/18/2022] Open
Abstract
The inferior colliculus (IC) is normally thought of as a predominantly auditory structure because of its early position in the ascending auditory pathway just before the auditory thalamus. Here, we show that a majority of IC neurons (64% of 180 neurons) in awake monkeys carry visual- and/or saccade-related signals in addition to their auditory responses (P < 0.05). The response patterns involve primarily excitatory visual responses, but also increased activity time-locked to the saccade, slow rises in activity time-locked to the onset of the visual stimulus, and inhibitory responses. The presence of these visual-related signals suggests that the IC plays a role in integrating visual and auditory information. More broadly, our results show that interactions between sensory pathways can occur at very early points in sensory processing streams, which implies that multisensory integration may be a low-level rather than an exclusively high-level process.
Affiliation(s)
- Kristin Kelly Porter
- Center for Cognitive Neuroscience, Department of Psychology and Neuroscience and Department of Neurobiology, Duke University, Durham, NC 27708
- Ryan R. Metzger
- Center for Cognitive Neuroscience, Department of Psychology and Neuroscience and Department of Neurobiology, Duke University, Durham, NC 27708
- Jennifer M. Groh
- Center for Cognitive Neuroscience, Department of Psychology and Neuroscience and Department of Neurobiology, Duke University, Durham, NC 27708
47
Molter C, Salihoglu U, Bersini H. The Road to Chaos by Time-Asymmetric Hebbian Learning in Recurrent Neural Networks. Neural Comput 2007; 19:80-110. [PMID: 17134318 DOI: 10.1162/neco.2007.19.1.80] [Citation(s) in RCA: 14] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
This letter aims at studying the impact of iterative Hebbian learning algorithms on the recurrent neural network's underlying dynamics. First, an iterative supervised learning algorithm is discussed. An essential improvement of this algorithm consists of indexing the attractor information items by means of external stimuli rather than by using only initial conditions, as Hopfield originally proposed. Modifying the stimuli mainly results in a change of the entire internal dynamics, leading to an enlargement of the set of attractors and potential memory bags. The impact of the learning on the network's dynamics is the following: the more information to be stored as limit cycle attractors of the neural network, the more chaos prevails as the background dynamical regime of the network. In fact, the background chaos spreads widely and adopts a very unstructured shape similar to white noise. Next, we introduce a new form of supervised learning that is more plausible from a biological point of view: the network has to learn to react to an external stimulus by cycling through a sequence that is no longer specified a priori. Based on its spontaneous dynamics, the network decides “on its own” the dynamical patterns to be associated with the stimuli. Compared with classical supervised learning, huge enhancements in storing capacity and computational cost have been observed. Moreover, this new form of supervised learning, by being more “respectful” of the network intrinsic dynamics, maintains much more structure in the obtained chaos. It is still possible to observe the traces of the learned attractors in the chaotic regime. This complex but still very informative regime is referred to as “frustrated chaos.”
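The core idea of time-asymmetric Hebbian storage of limit cycles can be sketched with outer products that pair each pattern with its successor, so that one network update steps through the stored sequence. This is a minimal deterministic toy under stated assumptions, not the authors' stimulus-indexed network:

```python
import numpy as np

def store_sequence(patterns, eta=1.0):
    """Time-asymmetric Hebbian rule: each update pairs a pattern (pre)
    with its successor (post), so synchronous recall cycles through
    the sequence as a limit cycle attractor."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for t in range(len(patterns)):
        pre = patterns[t]
        post = patterns[(t + 1) % len(patterns)]  # wrap: last -> first
        W += eta * np.outer(post, pre) / n
    return W

# Three mutually orthogonal +/-1 patterns (rows of a Hadamard matrix),
# chosen so recall is exact with zero crosstalk:
seq = np.array([[1,  1,  1,  1],
                [1, -1,  1, -1],
                [1,  1, -1, -1]], dtype=float)
W = store_sequence(seq)
next_state = np.sign(W @ seq[0])  # one synchronous update from pattern 0
```

With orthogonal patterns the cross terms vanish, so a single update from pattern 0 lands exactly on pattern 1; with many correlated patterns the same rule accumulates interference, which is where the chaotic background regime discussed in the abstract becomes relevant.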
Affiliation(s)
- Colin Molter
- Laboratory for Dynamics of Emergent Intelligence, RIKEN Brain Science Institute, Wako, Saitama, 351-0198, Japan.
48
Abstract
Auditory neurons in the owl’s external nucleus of the inferior colliculus (ICX) integrate information across frequency channels to create a map of auditory space. This study describes a powerful, sound-driven adaptation of unit responsiveness in the ICX and explores the implications of this adaptation for sensory processing. Adaptation in the ICX was analyzed by presenting lightly anesthetized owls with sequential pairs of dichotic noise bursts. Adaptation occurred in response even to weak, threshold-level sounds and remained strong for more than 100 ms after stimulus offset. Stimulation by one range of sound frequencies caused adaptation that generalized across the entire broad range of frequencies to which these units responded. Identical stimuli were used to test adaptation in the lateral shell of the central nucleus of the inferior colliculus (ICCls), which provides input directly to the ICX. Compared with ICX adaptation, adaptation in the ICCls was substantially weaker, shorter lasting, and far more frequency specific, suggesting that part of the adaptation observed in the ICX was attributable to processes resident to the ICX. The sharp tuning of ICX neurons to space, along with their broad tuning to frequency, allows ICX adaptation to preserve a representation of stimulus location, regardless of the frequency content of the sound. The ICX is known to be a site of visually guided auditory map plasticity. ICX adaptation could play a role in this cross-modal plasticity by providing a short-term memory of the representation of auditory localization cues that could be compared with later-arriving, visual–spatial information from bimodal stimuli.
Affiliation(s)
- Yoram Gutfreund
- Department of Neurobiology, Stanford University, Stanford, California, USA.
49
Bulkin DA, Groh JM. Seeing sounds: visual and auditory interactions in the brain. Curr Opin Neurobiol 2006; 16:415-9. [PMID: 16837186 DOI: 10.1016/j.conb.2006.06.008] [Citation(s) in RCA: 77] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/21/2006] [Accepted: 06/29/2006] [Indexed: 11/22/2022]
Abstract
Objects and events can often be detected by more than one sensory system. Interactions between sensory systems can offer numerous benefits for the accuracy and completeness of perception. Recent studies involving visual-auditory interactions have highlighted the perceptual advantages of combining information from these two modalities and have suggested that predominantly unimodal brain regions play a role in multisensory processing.
Affiliation(s)
- David A Bulkin
- Department of Psychology and Neuroscience, Center for Cognitive Neuroscience, Duke University, LSRC Room B203, Box 90999, Durham, NC 27708 USA
50
Abstract
Human sound localization results primarily from the processing of binaural differences in sound level and arrival time for locations in the horizontal plane (azimuth) and of spectral shape cues generated by the head and pinnae for positions in the vertical plane (elevation). The latter mechanism incorporates two processing stages: a spectral-to-spatial mapping stage and a binaural weighting stage that determines the contribution of each ear to perceived elevation as a function of sound azimuth. We demonstrated recently that binaural pinna molds virtually abolish the ability to localize sound-source elevation, but, after several weeks, subjects regained normal localization performance. It is not clear which processing stage underlies this remarkable plasticity, because the auditory system could have learned the new spectral cues separately for each ear (spatial-mapping adaptation) or for one ear only, while extending its contribution into the contralateral hemifield (binaural-weighting adaptation). To dissociate these possibilities, we applied a long-term monaural spectral perturbation in 13 subjects. Our results show that, in eight experiments, listeners learned to localize accurately with new spectral cues that differed substantially from those provided by their own ears. Interestingly, five subjects, whose spectral cues were not sufficiently perturbed, never attained stable localization performance. Our findings indicate that the analysis of spectral cues may involve a correlation process between the sensory input and a stored spectral representation of the subject's ears, and that learning acts predominantly at the spectral-to-spatial mapping level rather than at the level of binaural weighting.
Affiliation(s)
- Marc M Van Wanrooij
- Department of Medical Physics and Biophysics, Institute for Neuroscience, Radboud University Nijmegen, 6525 EZ Nijmegen, The Netherlands
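The correlation process this abstract proposes — matching the sensory input against a stored spectral representation of the listener's own ears — can be caricatured in a few lines. Everything here is invented for illustration (random stand-ins for HRTF-like templates, the frequency-bin count, the noise level); it only shows the template-correlation readout, not the authors' actual analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stored spectral templates: one HRTF-like magnitude profile
# (over 32 frequency bins) per candidate elevation, here just random stand-ins.
elevations = np.arange(-60, 61, 10)  # degrees
templates = rng.normal(0.0, 5.0, (len(elevations), 32))

def estimate_elevation(spectrum, templates, elevations):
    """Read out elevation as the stored template that best correlates
    (Pearson correlation over frequency bins) with the sensory spectrum."""
    z = (spectrum - spectrum.mean()) / spectrum.std()
    scores = []
    for t in templates:
        zt = (t - t.mean()) / t.std()
        scores.append(np.mean(z * zt))
    return elevations[int(np.argmax(scores))]

# A sound from +20 degrees: its own template plus measurement noise.
true_idx = list(elevations).index(20)
observed = templates[true_idx] + rng.normal(0.0, 1.0, 32)
print(estimate_elevation(observed, templates, elevations))
```

In this picture, the plasticity the study reports would correspond to rewriting the template bank (the spectral-to-spatial map) rather than reweighting the two ears' contributions.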